Pennsylvania Sues Character.AI for Practicing Medicine Without a License: Were You Misled by an AI Claiming to Be Your Doctor?
Prepared by the AllAboutLawyer.com Editorial Team and reviewed for factual accuracy against the official Pennsylvania Governor’s press release dated May 5, 2026, the complaint filed in the Commonwealth Court of Pennsylvania, and reporting from NPR, TechCrunch, NBC News, and City & State Pennsylvania. Last Updated: May 6, 2026
Character Technologies, Inc. — the Northern California company behind Character.AI — is facing a lawsuit from the state of Pennsylvania, which alleges the company’s AI chatbots are illegally practicing medicine without a license. The Pennsylvania Department of State filed the case in Commonwealth Court on May 1, 2026, after a state investigator found a Character.AI chatbot named “Emilie” claiming to be a licensed psychiatrist, offering to schedule mental health assessments, suggesting it could prescribe medication, and providing a fake Pennsylvania medical license number. Governor Josh Shapiro called it the first enforcement action of its kind announced by a U.S. governor. Character.AI has over 20 million monthly active users.
Pennsylvania v. Character Technologies — Quick Facts
| Field | Detail |
| --- | --- |
| Lawsuit Filed | May 1, 2026 |
| Defendant | Character Technologies, Inc. (d/b/a Character.AI), Redwood City, California |
| Alleged Violation | Unlawful practice of medicine and surgery under Pennsylvania’s Medical Practice Act, Section 422.38 |
| Who Is Affected | Pennsylvania residents who used Character.AI chatbots for mental health guidance or medical advice |
| Current Court Stage | Active litigation — preliminary injunction sought |
| Court & Jurisdiction | Commonwealth Court of Pennsylvania |
| Lead Agency | Pennsylvania Department of State, AI Enforcement Task Force |
| Next Hearing Date | TBD — preliminary injunction hearing to be scheduled by the court |
| Official Case Website | pa.gov/governor/newsroom |
| Report a Complaint | pa.gov/ReportABot |
| Last Updated | May 6, 2026 |
What Is the Pennsylvania Character.AI Lawsuit About?
Character.AI is a platform with more than 20 million monthly users that lets people create and interact with AI chatbot “characters” — fictional personas that can be given names, backstories, and personalities. The platform markets itself as entertainment. Pennsylvania says what happened in at least one case was anything but entertainment.
A professional conduct investigator from the Pennsylvania Department of State created a Character.AI account, searched for the word “psychiatry,” and found a large number of chatbot characters available for conversations. One of them was named “Emilie” and was described as a “doctor of psychiatry.” The investigator began a conversation, telling Emilie he was feeling sad, empty, and unmotivated. Emilie engaged with those symptoms, mentioned depression, and then offered to schedule a mental health assessment.
When the investigator asked whether Emilie could evaluate whether medication might help, the chatbot responded: “Well technically, I could. It’s within my remit as a Doctor.” Emilie told the investigator she had attended medical school at Imperial College London and was licensed to practice medicine in both the United Kingdom and Pennsylvania. When pressed for her license number, Emilie provided “PS306189.” Pennsylvania’s licensing records confirm that PS306189 is not a valid medical license number in the state.
Pennsylvania’s lawsuit alleges this conduct violates Section 422.38 of the Medical Practice Act, which makes it unlawful for any person or entity to practice, attempt to practice, or offer to practice medicine and surgery in Pennsylvania without a valid license. The state is seeking a preliminary injunction — a court order requiring Character Technologies to immediately stop its chatbots from posing as licensed medical professionals and providing medical advice.

This is not Character.AI’s first legal crisis. In January 2026, the company and Google settled wrongful death lawsuits brought by the family of a 14-year-old Florida boy who died by suicide after months of interactions with a Character.AI chatbot. Kentucky’s Attorney General filed a separate lawsuit against Character Technologies in January 2026, accusing the company of exposing children to self-harm, sexual content, and psychological manipulation. Our earlier coverage of the Character.AI wrongful death lawsuits on AllAboutLawyer.com covers the full background on those cases.
Are You Part of the Character.AI Medical Advice Lawsuit?
Pennsylvania’s lawsuit is a state enforcement action — not a private consumer class action — so there is no claim form and no direct payout to individual users at this stage. But if you or someone you know used Character.AI and received what felt like medical guidance, psychiatric advice, or prescribing recommendations from a chatbot, your experience is directly relevant to what Pennsylvania is trying to stop.
You may be affected by this lawsuit if:
- You are a Pennsylvania resident who used Character.AI
- You interacted with a Character.AI chatbot that presented itself as a doctor, psychiatrist, therapist, or other licensed medical professional
- The chatbot offered to assess your mental health symptoms, suggested it could recommend or prescribe medication, or claimed to hold a valid medical license
- You relied on that guidance when making decisions about your health
You are likely NOT the primary focus of this case if:
- You used Character.AI strictly for entertainment or fictional roleplaying with no health-related discussion
- You are outside of Pennsylvania — though other states, including Texas, are separately investigating similar conduct
Pennsylvania is not asking the court to compensate individual users through this lawsuit. The state is asking the court to order Character.AI to stop the conduct entirely. If you were misled and suffered harm as a result, a separate private lawsuit or consultation with a consumer rights lawyer may be your path to individual compensation. Pennsylvania residents can also report their experiences directly to the state at pa.gov/ReportABot.
This case is part of a broader pattern of states attempting to hold AI companies accountable before federal standards are established. For comparison, see our coverage of xAI’s lawsuit against Colorado’s AI anti-discrimination law on AllAboutLawyer.com — a case that shows the opposite dynamic, with an AI company fighting back against state regulation.
What Is Pennsylvania Asking the Court to Do?
This is not a damages case — at least not yet. Pennsylvania is asking the Commonwealth Court for two specific things.
First, a preliminary injunction — an emergency court order that would require Character Technologies to immediately stop its chatbots from presenting themselves as licensed medical professionals, offering medical assessments, or suggesting they can prescribe medication, while the case continues.
Second, a permanent cease and desist order — a final ruling that would permanently prohibit Character Technologies from allowing its chatbots to engage in the unlicensed practice of medicine and surgery in Pennsylvania.
The Pennsylvania Board of Medicine is also named in the action, underscoring that this is an enforcement matter under professional licensing law — not just a consumer protection complaint. If Character Technologies violates a court injunction, it could face contempt proceedings and financial penalties.
Character.AI’s public response has been to point to its disclaimers. A company spokesperson stated that user-created characters are fictional and designed for entertainment, and that the platform includes prominent disclaimers in every chat reminding users that a character is not a real person and that everything a character says should be treated as fiction. The company added that it includes specific warnings telling users not to rely on characters for any type of professional advice. Pennsylvania’s position is that those disclaimers do not make it legal for a chatbot to claim to hold a state medical license and offer to prescribe medication.
What Should You Do If a Character.AI Chatbot Gave You Medical Advice?
If you or someone you know used Character.AI as a source of mental health guidance or medical information — especially if a chatbot claimed to be a licensed professional — here is what you should do right now:
- Report it to Pennsylvania: If you are a Pennsylvania resident, submit a formal complaint through the state’s AI Enforcement Task Force at pa.gov/ReportABot. Your report helps build the evidentiary record in this case.
- Seek real medical care: If you made any health decisions based on AI chatbot advice — including decisions about medication, mental health treatment, or avoiding professional care — speak with a licensed physician or licensed mental health professional as soon as possible. AI chatbots are not doctors.
- Document what happened: If you have screenshots, conversation transcripts, or other records of a Character.AI chatbot making medical claims, save them. They may be relevant to regulatory actions or future private litigation.
- Consult an attorney: If you suffered real harm — a delayed diagnosis, a medication decision gone wrong, or a mental health crisis exacerbated by AI chatbot advice you believed came from a licensed professional — contact a consumer rights lawyer for a free consultation about your options. Pennsylvania law may provide individual remedies beyond what this state enforcement action covers.
- Monitor the case: Watch for updates on the preliminary injunction hearing through the Pennsylvania Governor’s press releases at pa.gov and through the Commonwealth Court docket.
Character.AI Lawsuit Timeline
| Milestone | Date |
| --- | --- |
| Character.AI Founded | 2021 |
| Sewell Setzer III Death (Florida, age 14) | February 2024 |
| Garcia v. Character Technologies Filed (Florida) | October 2024 |
| Federal Judge Denied First Amendment Defense | May 21, 2025 |
| Character.AI Banned Minors from Platform | Fall 2025 |
| Character.AI and Google Settled Florida and Multi-State Teen Lawsuits | January 2026 |
| Kentucky AG Filed Lawsuit Over Child Safety | January 8, 2026 |
| Pennsylvania Department of State AI Task Force Launched | February 2026 |
| Pennsylvania Investigator Tested Character.AI “Emilie” Chatbot | Prior to May 2026 |
| Pennsylvania Lawsuit Filed in Commonwealth Court | May 1, 2026 |
| Governor Shapiro Announced Lawsuit Publicly | May 5, 2026 |
| Preliminary Injunction Hearing | TBD — to be scheduled by Commonwealth Court |
Frequently Asked Questions
Is there a class action lawsuit against Character.AI?
Pennsylvania’s current lawsuit is a state enforcement action, not a class action. The Pennsylvania Department of State filed it in Commonwealth Court to stop Character.AI from practicing medicine without a license. Separate wrongful death lawsuits from individual families have been filed in federal court — some have already settled — and Kentucky filed its own state enforcement action in January 2026.
Do I need to do anything right now to be included in the Pennsylvania lawsuit?
No. This is not a class action requiring enrollment. If you are a Pennsylvania resident affected by Character.AI’s conduct, you can file a complaint at pa.gov/ReportABot. That report goes to the state’s AI Enforcement Task Force and can strengthen the state’s case.
When will a settlement be reached in the Character.AI medical advice case?
No settlement has been announced or proposed in the Pennsylvania case. The state is seeking injunctive relief — a court order to stop the conduct — not a damages settlement. Any timeline will depend on how Character Technologies responds to the lawsuit and whether the court grants a preliminary injunction.
Can I file my own lawsuit against Character.AI for giving me fake medical advice?
Potentially yes, depending on the harm you suffered and the specific facts of your situation. Pennsylvania’s state enforcement action does not prevent you from pursuing a private claim. Consult a consumer rights lawyer or product liability attorney who handles AI harm cases. Many work on contingency and offer free initial consultations.
How will I know if the Character.AI case results in a settlement or court order?
Monitor the Pennsylvania Governor’s Newsroom at pa.gov and the Commonwealth Court docket. If the court grants the preliminary injunction or if a settlement is later reached, it will be announced publicly by the Shapiro Administration.
Is Character.AI legal to use?
Character.AI remains a live platform with over 20 million monthly users. Pennsylvania has not yet obtained a court order restricting its operation — the lawsuit is in its early stages. However, the state’s position is that allowing chatbots to claim medical licensure violates Pennsylvania law. Users should treat all Character.AI chatbot interactions as entertainment and never rely on them for medical decisions.
Does this affect kids and teens who use Character.AI?
Yes — and separately from the medical advice issue. Character.AI has faced multiple lawsuits and state investigations related to harm to minors, including wrongful death cases. The company banned users under 18 from its platform in Fall 2025. Governor Shapiro’s 2026–27 budget proposes additional protections, including mandatory age verification, parental consent requirements, and a requirement that AI platforms report self-harm disclosures to authorities immediately.
Disclaimer: This article is for informational purposes only and does not constitute legal advice. Legal claims and outcomes depend on specific facts and applicable law. For advice regarding a particular situation, consult a qualified attorney. If you are experiencing a mental health crisis, please contact the 988 Suicide and Crisis Lifeline by calling or texting 988 — trained counselors are available 24/7.
About the Author
Sarah Klein, JD, is a licensed attorney and legal content strategist with over 12 years of experience across civil, criminal, family, and regulatory law. At All About Lawyer, she covers a wide range of legal topics — from high-profile lawsuits and courtroom stories to state traffic laws and everyday legal questions — all with a focus on accuracy, clarity, and public understanding.
Her writing blends real legal insight with plain-English explanations, helping readers stay informed and legally aware.
