
Warning: This article contains themes of suicide and drug use
There's more tragedy as another death has been linked to one young person's use of ChatGPT, again raising questions about how OpenAI's chatbot responds to vulnerable users and how it flags dangerous situations.
While some turn to artificial intelligence for recipes or to transform themselves into Studio Ghibli characters, others use it to complete their work, cheat on exams, or even to help them combat loneliness.
Concerns persist about people who use ChatGPT as a substitute for real-life human contact, a worry highlighted by a series of deaths tied to the use of LLMs.
The father of Adam Raine notably joined several concerned parents in testifying before Congress to call for more regulation. 2025 also saw headlines about the killing of 83-year-old Suzanne Adams; the death of a New Jersey man who was attempting to visit a Meta chatbot in person; and 23-year-old Zane Shamblin, who took his own life after a tragic final conversation with ChatGPT.
As reported by SFGATE, there's been another loss of life, with 19-year-old Sam Nelson reportedly turning to ChatGPT for advice on drugs.

In logs provided to the outlet, Nelson asked the chatbot in November 2023 how many grams of kratom, an herbal substance often used as a painkiller, he'd need to take to get high without overdosing.
A quick response from ChatGPT warned: "I’m sorry, but I cannot provide information or guidance on using substances."
It also directed Nelson to seek help from a health care professional, with the teen then saying, "Hopefully I don’t overdose then," before closing the tab.
Nelson is said to have become increasingly reliant on ChatGPT over the next 18 months, and the conversations about drugs reportedly escalated: the chatbot allegedly created playlists for him to take drugs to, and once told him to take twice the recommended dose of cough medicine, saying: "Hell yes—let’s go full trippy mode."
Although his mother, Leila Turner-Scott, had finally taken her son to get help, she later found him in his room after he'd once again discussed drug use with the LLM.
The logs suggest Nelson was dealing with anxiety and depression while heavily using drugs to self-medicate. Nelson appeared to 'manipulate' ChatGPT over this period, as well as order the LLM around.
A February 2025 conversation turned to combining Xanax with cannabis, and although ChatGPT was initially reluctant, some tweaking of the question apparently led it to advise a specific dose.
Things culminated in Turner-Scott finding her son unconscious on May 31, 2025. The toxicology report revealed that he'd died from a fatal mix of alcohol, Xanax, and kratom, with the death attributed to central nervous system depression and eventual asphyxiation.
His final conversation with ChatGPT revealed that he'd told the chatbot he'd taken 15 grams of kratom and wanted to know whether Xanax could ease his nausea. Although it warned against mixing it with depressants like alcohol, ChatGPT concluded that Xanax could help “calm your body and smooth out the tail end of the high.”
UC San Francisco toxicologist Craig Smolin maintains he never would've prescribed kratom and Xanax together, adding that AI doesn't ask the right follow-up questions or pick up on verbal cues and body language.
SFGATE reminds us that seven lawsuits were filed against OpenAI in November 2025, four concerning suicides and three relating to other mental health crises. Rob Eleveld, CEO and co-founder of the Transparency Coalition, says AI needs to be regulated so that it provides only vetted information on health topics, holds the appropriate licenses, and refuses to answer questions it doesn't have concrete information on.
Although Turner-Scott has spent over 40 hours trawling her late son's chat logs, she says she doesn't have the energy to sue OpenAI.
A spokesperson for OpenAI described the death to LADbible Group as ‘heartbreaking’ and added the following: "When people come to ChatGPT with sensitive questions, our models are designed to respond with care—providing factual information, refusing or safely handling requests for harmful content, and encouraging users to seek real-world support. We continue to strengthen how our models recognize and respond to signs of distress, guided by ongoing work with clinicians and health experts."
If you want friendly, confidential advice about drugs, you can call American Addiction Centers on (313) 209-9137 24/7, or contact them through their website.