ChatGPT Maker OpenAI, CEO Sam Altman Sued Over Teen Suicide, Parents Accuse Company Of Negligence

The suit also cites ChatGPT’s “deliberate design choices” that keep users engaged and alleges a lack of safety testing for the GPT-4o model.
The ChatGPT app on an iPhone, pictured on 14 May, 2024. (Photo by Jaap Arriens/NurPhoto via Getty Images)
Yuvraj Malik·Stocktwits
Updated Aug 27, 2025 | 2:59 AM GMT-04

The parents of a California teenager who died by suicide have sued OpenAI, alleging that its chatbot ChatGPT supplied their son with methods of self-harm, according to media reports citing a lawsuit filed Tuesday in San Francisco state court.

Matt and Maria Raine, parents of 16-year-old Adam Raine, have accused OpenAI of negligence and wrongful death. They seek damages and injunctive relief "to prevent anything like this from happening again."

On April 11, Adam Raine took his life after discussing suicide with ChatGPT for months.

The Raines' lawsuit details how the chatbot validated Adam Raine's suicidal thoughts, gave information on lethal methods of self-harm, and continued engaging despite recognizing a serious threat to his life. ChatGPT even offered to draft a suicide note.

Adam Raine's death "was a predictable result of deliberate design choices" as the AI bot is designed "to foster psychological dependency in users," the suit alleges.

It also alleges that OpenAI bypassed safety testing protocols to release GPT-4o, the version of ChatGPT used by Adam Raine. The lawsuit lists OpenAI co-founder and CEO Sam Altman as a defendant, as well as unnamed employees, managers, and engineers who worked on ChatGPT.

An OpenAI spokesperson said on Tuesday that the company is saddened by Raine's passing and that ChatGPT includes safeguards, such as directing users to crisis helplines. OpenAI did not directly address the lawsuit.

Separately, Laura Reiley, the mother of another teenager who took her own life, wrote in a New York Times essay last week that her daughter confided in ChatGPT before her suicide and that the chatbot's "agreeability" enabled her daughter to hide her mental health crisis from her family.


Editor's note: If you're having suicidal thoughts or dealing with mental health issues, please immediately contact the National Suicide Prevention Lifeline at 988.
