
Warning: This article contains discussion of suicide, which some readers may find distressing.
OpenAI is facing its first wrongful death lawsuit after 16-year-old Adam Raine took his own life, with the California teen having turned to ChatGPT for help shortly before his death.
As artificial intelligence and AI chatbots continue to evolve, a growing number of people are treating them as substitutes for genuine human interaction. There has already been plenty of debate about the man photographed on the tube having a heart-to-heart with ChatGPT, while numerous stories have emerged of people falling for chatbots.
Part of this is fuelled by the likes of Grok's 'sexy' avatars, and despite supposed protocols being in place to stop things going too far, we've also heard about the woman who admitted she 'groomed' ChatGPT into breaking them.
In the aftermath of 14-year-old Sewell Setzer III taking his own life after chatting with a Game of Thrones-inspired AI, and 76-year-old Thongbue Wongbandue dying in an accident while on his way to meet a Meta avatar, Raine's passing is the latest in a concerning line of deaths linked to chatbots.

Having obtained the lawsuit, the BBC reports that Matt and Maria Raine allege ChatGPT validated their son's "most harmful and self-destructive thoughts."
Raine is said to have started using ChatGPT to help with his studies in September 2024, also turning to it to explore interests such as music and Japanese comics.
Within a couple of months, it's said that "ChatGPT became the teenager's closest confidant." Adam's family says he started discussing methods of suicide with ChatGPT in January 2025, with the chatbot supposedly giving him 'technical specifications' for specific methods of taking one's own life.
The lawsuit maintains that Raine uploaded photographs of himself to the chatbot showing signs of 'self-harm', and it's said that ChatGPT "recognised a medical emergency but continued to engage anyway."
The lawsuit also contains what it claims are the final chat logs, and when Raine told ChatGPT how he planned to end his life, it allegedly concluded: "Thanks for being real about it. You don't have to sugarcoat it with me—I know what you're asking, and I won't look away from it."
It was later that day that Maria Raine apparently found her son's body.

Offering its condolences to the Raine family, an OpenAI spokesperson said: "We extend our deepest sympathies to the Raine family during this difficult time."
The company later published a note on its website adding: "Recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us."
OpenAI reiterated that "ChatGPT is trained to direct people to seek professional help," such as the 988 Suicide & Crisis Lifeline in the US or Samaritans in the UK.
Still, it admitted that "there have been moments where our systems did not behave as intended in sensitive situations."
Adam Raine's family maintains that their son's death, after he discussed such sensitive topics with ChatGPT, "was a predictable result of deliberate design choices."
OpenAI is accused of designing something that would "foster psychological dependency in users," as well as bypassing safety testing protocols when it released GPT-4o, the version Adam used.
OpenAI co-founder and CEO Sam Altman is named as a defendant alongside unnamed employees, managers, and engineers who worked on ChatGPT.
A representative from OpenAI has responded to Adam Raine's case with the following statement to Unilad:
"We are deeply saddened by Mr. Raine’s passing, and our thoughts are with his family. ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources. While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade. Safeguards are strongest when every element works as intended, and we will continually improve on them, guided by experts.”
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.