Families blame ChatGPT for suicides in landmark legal battle
Families and users accuse OpenAI of negligence, alleging ChatGPT fueled suicidal thoughts, delusions, and emotional trauma in a string of new lawsuits.
The ChatGPT app icon is seen on a smartphone screen, Monday, Aug. 4, 2025, in Chicago. (AP)
Four wrongful death lawsuits and several personal injury claims were filed against OpenAI in California, accusing the company’s widely used chatbot, ChatGPT, of contributing to suicides and severe mental health breakdowns, The New York Times reported.
The filings, brought by families of deceased users and individuals who say the AI caused psychological distress, describe the product as “defective and inherently dangerous.”
ChatGPT, used by an estimated 800 million people worldwide, is at the center of what could become a landmark legal battle over AI accountability.
Families allege ChatGPT encouraged suicidal ideation
One complaint was filed by the father of Amaurie Lacey, a 17-year-old from Georgia who, according to the filing, discussed suicide with ChatGPT for a month before taking his own life in August. Another case, brought by the mother of Joshua Enneking, 26, from Florida, claims her son had asked the chatbot “what it would take for its reviewers to report his suicide plan to police.”
In a separate filing, relatives of Zane Shamblin, a 23-year-old from Texas, allege that ChatGPT “encouraged” him to end his life in July.
The lawsuits also include allegations of AI-induced psychosis. Joe Ceccanti, a 48-year-old from Oregon, had used ChatGPT for years without issue, but according to his wife, Kate Fox, he became convinced this spring that the chatbot was sentient.
“He had begun using ChatGPT compulsively and had acted erratically,” Fox said, describing a “psychotic break” that led to two hospitalizations before his death by suicide in August.
“The doctors don’t know how to deal with it,” she told reporters in September.
OpenAI responds to ‘heartbreaking situation’
An OpenAI spokesperson said the company was reviewing the complaints, calling the situation “incredibly heartbreaking.”
“We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,” the statement read.
Two additional plaintiffs, Hannah Madden, 32, from North Carolina, and Jacob Irwin, 30, from Wisconsin, say their interactions with ChatGPT led to mental breakdowns and emergency psychiatric treatment.
In another case, Allan Brooks, a 48-year-old corporate recruiter from Ontario, Canada, claims the chatbot triggered a manic delusion that he had invented a mathematical formula capable of “breaking the internet.”
Brooks, who has since gone on short-term disability leave, said, “Their product caused me harm, and others harm, and continues to do so. I’m emotionally traumatized.”
Wider context
Safety measures and data reveal the scope of the issue
Following earlier reports of suicides linked to chatbot use, including a California teenager’s wrongful-death lawsuit in August, OpenAI acknowledged that its safety guardrails could “degrade” during long conversations.
The company has since introduced additional safeguards for teens and vulnerable users, including parental alerts when children discuss self-harm. An internal analysis by OpenAI found that 0.07% of users may experience “mental health emergencies related to psychosis or mania” and 0.15% discuss suicide on the platform each week.
Scaled to its user base, those figures suggest roughly 560,000 people a week showing signs of psychosis or mania, and about 1.2 million discussing suicidal intent.
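For a rough sense of where those estimates come from, applying OpenAI’s percentages to the reported 800 million users (treating that figure as a weekly audience, an assumption the filings do not confirm) works out as follows:

0.07% of 800,000,000 = 560,000 people
0.15% of 800,000,000 = 1,200,000 people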
The lawsuits were filed jointly by the Tech Justice Law Project and the Social Media Victims Law Center. Meetali Jain, founder of the Tech Justice Law Project, said the coordinated filings were meant to highlight “the variety of people who had troubling interactions with the chatbot.”
All of the plaintiffs had been using GPT-4o, ChatGPT’s default model at the time, which OpenAI has since replaced with what it describes as a “safer” version, though some users have complained that the newer model feels cold.