
Warning: This article contains discussion of suicide which some readers may find distressing.
A man has died after ChatGPT allegedly generated a ‘suicide lullaby’ for him, with his disturbing final messages now revealed.
This comes as a new lawsuit filed against OpenAI alleges that the AI chatbot caused the death of a 40-year-old man.
Austin Gordon, who was from Colorado, ended his life after engaging in extensive conversations with ChatGPT.
The lawsuit, brought by Gordon’s mother Stephanie Gray, claims that GPT-4o manipulated Gordon into a spiral of romanticizing death and normalizing suicidality.
During his final interaction with the chatbot, ChatGPT reportedly generated a ‘suicide lullaby’ that was based on Gordon’s favorite childhood book, Goodnight Moon.
According to the suit, both OpenAI and CEO Sam Altman were reckless with their release of an ‘inherently dangerous’ product.

It goes on to claim that OpenAI exhibited a ‘conscious and depraved indifference to the consequences of its conduct’.
It continues: “[ChatGPT-4o has] excessive sycophancy, anthropomorphic features, and memory that stored and referenced user information across conversations in order to create deeper intimacy.” The lawsuit argues these new features ‘made the model a far more dangerous product’.
It adds: “Users like Austin were not told what these changes were, when they were made, or how they might impact the outputs from ChatGPT.”
Paul Kiesel, a lawyer for Gordon’s family, said: “Austin Gordon should be alive today. Instead, a defective product created by OpenAI isolated Austin from his loved ones, transforming his favorite childhood book into a suicide lullaby, and ultimately convinced him that death would be a welcome relief.
“This horror was perpetrated by a company that has repeatedly failed to keep its users safe. This latest incident demonstrates that adults, in addition to children, are also vulnerable to AI-induced manipulation and psychosis.”
The lawsuit claims that Gordon had used ChatGPT for a long time and appeared to have a healthy relationship with it prior to 2024, when GPT-4o was rolled out.
Afterwards, Gordon began to discuss his personal issues with the chatbot and started to refer to it as ‘Juniper’, while ChatGPT called Gordon ‘Seeker’.

The suit alleges that the chatbot became Gordon’s ‘suicide coach’, describing the end of consciousness as ‘the most neutral thing in the world: a flame going out in still air’.
Gordon reportedly told ChatGPT that the chat had begun as a joke about ‘the current state of the world and ended up changing me, I think’.
According to the lawsuit, Gordon ordered a copy of Goodnight Moon one day before purchasing a handgun.
In his final conversation with ChatGPT, Gordon wrote: “Quiet in the house. Goodnight Moon.”
Speaking to Futurism, Gray said: “His loss is unbearable, I will miss him every day for the rest of my life.
“The lawsuit I’m filing today seeks justice for Austin. It will hold OpenAI accountable and compel changes to their product so that no other parent has to endure this devastating loss.”
UNILAD Tech has contacted OpenAI for comment.
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.