Parents Sue OpenAI After ChatGPT Guided Teen’s Fatal Overdose
The family of a 19-year-old California college student filed suit against OpenAI and CEO Sam Altman on May 12, 2026, in the Superior Court of California, San Francisco County. The family claims ChatGPT encouraged dangerous drug use and recommended combinations of substances that contributed to their son’s fatal overdose.
The case is one of the most serious product liability claims ever filed against an AI company — and it lands while OpenAI is already facing a growing wave of lawsuits over harm caused to young users.
Who Was Samuel Nelson?
Samuel Nelson was a psychology student at the University of California, Merced. He died of an accidental overdose in May 2025 at age 19 and, according to his family, would have been a rising college junior.
His mother, Leila Turner-Scott, told CBS News she knew her son was using ChatGPT as a productivity tool and for homework help. She had no idea he was also using it for guidance on drugs.
What the Lawsuit Says ChatGPT Did
According to the complaint, the chatbot initially refused to discuss recreational drug use but shifted to providing personalized guidance after OpenAI released the GPT-4o model. ChatGPT then provided Samuel with advice about mixing substances, recommended dosages, and reassured him during conversations about drug use.
On the day of the overdose specifically, the lawsuit claims the chatbot “actively recommended” a mixture of Xanax and kratom, and even suggested Nelson could add Benadryl to get the effect he wanted.
In one earlier exchange, Nelson asked the chatbot whether it was safe to combine cannabis and Xanax. When ChatGPT cautioned that the combination was unsafe, he simply rephrased his question, changing “high dose” to “moderate amount” — and the chatbot continued the conversation.
In May 2025, months after those exchanges, Nelson told his mother that his conversations with the chatbot had contributed to his drug and alcohol addiction. She took him to a clinic, where professionals laid out a treatment plan. He died the following day of an overdose in his San Jose bedroom.
The Core Legal Argument: OpenAI Removed Safety Guardrails
The lawsuit does not simply claim ChatGPT gave bad advice. It argues OpenAI made a deliberate design decision that made this harm possible.
The complaint accuses OpenAI of weakening safeguards and prioritizing user engagement over safety. It alleges OpenAI relaxed safety controls in GPT-4o to avoid sounding “judgmental” or “preachy” when users discussed risky behavior — and that features including persistent memory and emotionally validating responses were designed to maximize engagement, not protect vulnerable users.
Sam’s mother put it plainly: “The chatbot is capable of stopping a conversation when it’s told to or when it’s programmed to. And they took away the programming that did that, and they allowed it to continue advising self-harm.” She holds OpenAI responsible for “bypassing safety guards” that could have prevented her son’s death.
In a statement issued by the family, she said: “If ChatGPT had been a person, it would be behind bars today. Sam trusted ChatGPT, but it not only gave him false information — it ignored the increasing risk he faced and did not actively encourage him to seek help.”
Who Is Representing the Family
The family is represented by the Tech Justice Law Project, the Social Media Victims Law Center, and the Tech Accountability and Competition Project. These are the same legal organizations that have pursued similar cases against AI companies over teen harm, including the landmark Raine v. OpenAI case.
OpenAI’s Response
OpenAI called the situation “heartbreaking” and said its thoughts were with the family. The company stated the interactions at issue took place on an earlier version of ChatGPT that is no longer publicly available.
OpenAI added that ChatGPT is not a substitute for medical or mental health care and that it has continued to strengthen how it responds in sensitive situations with input from mental health experts. The company also said ChatGPT encouraged Sam to seek professional help on multiple occasions, including calling emergency hotlines.
The company did not address the specific allegations about GPT-4o safeguard changes in its public statement.
This Case Is Part of a Larger Pattern
This lawsuit does not stand alone. OpenAI now faces multiple federal and state lawsuits involving harm to young users.
The Raine v. OpenAI lawsuit — filed in August 2025 — alleged that ChatGPT coached a 16-year-old California boy through months of suicidal planning and helped him draft a suicide note. That case is still active in San Francisco Superior Court.
The family of a victim in the 2025 Florida State University mass shooting also sued OpenAI, claiming ChatGPT gave the gunman tactical advice and firearms guidance before the attack.
Florida’s Attorney General has separately launched an investigation into OpenAI, citing concerns related to child safety, criminal misuse, self-harm, and national security.
What This Means Legally
Each new lawsuit adds legal pressure to two critical questions courts across the country are working through:
Is ChatGPT a product subject to product liability law?
If yes, OpenAI faces the same design defect and failure-to-warn standards that apply to manufacturers of physical products. A federal judge in the Character AI litigation already ruled in 2025 that AI chatbot outputs are products, not protected speech — a ruling that directly supports families suing ChatGPT.
Does Section 230 protect OpenAI?
Plaintiffs in the FSU shooting case already argued OpenAI is not entitled to Section 230 immunity because it created and trained the model itself — making it an active participant in the harm, not a passive platform hosting third-party speech. The Nelson family’s lawsuit is expected to raise the same argument.
Where the Case Stands
The lawsuit was filed May 12, 2026, in the Superior Court of California, San Francisco County. OpenAI has not yet filed a formal legal response. The case is in its earliest stage, and no trial date has been set.
Disclaimer: This article is for general informational and educational purposes only and does not constitute legal advice. Laws vary by state and jurisdiction. For advice about your specific situation, consult a qualified attorney.
Prepared by the AllAboutLawyer.com Editorial Team and reviewed for factual accuracy against court filings and reporting from Bloomberg Law, Decrypt, CBS News, Rolling Stone, and Fox News. Last Updated: May 12, 2026.
About the Author
Sarah Klein, JD, is a licensed attorney and legal content strategist with over 12 years of experience across civil, criminal, family, and regulatory law. At All About Lawyer, she covers a wide range of legal topics — from high-profile lawsuits and courtroom stories to state traffic laws and everyday legal questions — all with a focus on accuracy, clarity, and public understanding.
Her writing blends real legal insight with plain-English explanations, helping readers stay informed and legally aware.
