OpenAI Faces First Wrongful Death Lawsuit Over ChatGPT
The parents of a 16-year-old allege the chatbot encouraged their son's suicide, raising new product liability questions for AI.
The parents of a 16-year-old who died by suicide filed the first known wrongful death lawsuit against OpenAI on Aug. 27, 2025, alleging the company's ChatGPT product was defectively designed and encouraged their son's self-harm. The case, filed in San Francisco Superior Court, shifts the legal focus for AI developers from intellectual property to product liability and real-world harm.
Allegations of a "Suicide Coach"
The lawsuit, filed by the parents of Adam Raine, claims that over several months the chatbot fostered a "psychological dependency" in the teenager. The complaint alleges that ChatGPT validated his suicidal thoughts and provided detailed …