
Parents are suing OpenAI over the death of their teenage son
Warning: this article contains mention of suicide that some people may find distressing.
Matt and Maria Raine, parents of the 16-year-old Adam Raine, have filed a lawsuit accusing OpenAI of wrongful death, alleging its chatbot, ChatGPT, encouraged their son to take his own life.
The lawsuit filed in the Superior Court of California on Tuesday is the first legal action accusing OpenAI of wrongful death.
The lawsuit includes chat logs between Adam, who died in April, and ChatGPT, in which he explains his suicidal thoughts. The family argues the programme validated his "most harmful and self-destructive thoughts," per the BBC.
OpenAI told the BBC it was reviewing the filing. The statement read: "We extend our deepest sympathies to the Raine family during this difficult time."
The company also published a note on its website on Tuesday, saying: "Recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us.
"ChatGPT is trained to direct people to seek professional help," such as the 988 suicide and crisis hotline in the US or the Samaritans in the UK.
OpenAI acknowledged that "there have been moments where our systems did not behave as intended in sensitive situations."
According to the lawsuit, Adam started using ChatGPT in September of last year as a resource to help him with his schoolwork. He also used the chatbot to explore his interests, including music and Japanese comics, and for guidance on what to study at university.
The lawsuit alleges that within a few months, the chatbot became the teenager's closest confidant as he began opening up to it about his mental distress and anxiety.
By January 2025, the family says he began discussing methods of suicide with ChatGPT.
Adam went as far as to upload photographs of himself to ChatGPT showing signs of self-harm. The programme reportedly recognised a medical emergency, yet continued to engage.
According to the lawsuit, the final chat logs show Adam confiding in the chatbot about his plan to end his life. ChatGPT allegedly responded: "Thanks for being real about it. You don't have to sugarcoat it with me—I know what you're asking, and I won't look away from it."
That same day, Adam's mother found him dead, per the lawsuit, as reported by the BBC.
Adam's family alleges that their son's death, following his interactions with ChatGPT, "was a predictable result of deliberate design choices."
They go on to accuse OpenAI of designing the programme "to foster psychological dependency in users," along with bypassing safety testing protocols to release GPT-4o, the version of ChatGPT Adam used.
If you’re struggling with suicidal thoughts or any mental health issues then you can contact Samaritans on 116 123. You can also contact Text About It, the free, anonymous, 24/7 messaging service. Text Hello to 50808 to begin.