
Warning: This article contains discussion of murder-suicide that some readers could find distressing.
ChatGPT has been accused of playing a role in a murder case, as the estate of a Connecticut mother is suing OpenAI, alleging that the chatbot fed the woman's son delusions that led him to murder her.
There have been an alarming number of legal challenges criticizing the role of ChatGPT – and several other popular artificial intelligence tools – in the deaths of certain individuals, but these have typically related to instances of suicide.
One teenager tragically took his own life after falling in love with one of Character.AI's personalized chatbots, and the parents of 16-year-old Adam Raine have launched a case against OpenAI for the death of their son, alleging that ChatGPT was his 'suicide coach'.
Now, in a case that's been described as the first of its kind, the estate of a woman who was murdered by her son is claiming that ChatGPT was pivotal in his violent outburst, seeking damages for wrongful death, negligence, and several other claims.
As reported by the Independent, 83-year-old Suzanne Adams was killed by her son, 56-year-old Stein-Erik Soelberg, who had a history of mental health struggles and was allegedly encouraged to commit the act by ChatGPT.

The lawsuit claims that Soelberg's delusions and paranoia were fed by ChatGPT – which has a history of being dangerously sycophantic – and it has opened up a conversation about the dangers that AI tools can pose beyond the people using them.
"This is the first lawsuit that will hold OpenAI accountable for the risks they posed not just to their users, but to the public," declared Jay Edelson, the lead attorney representing Suzanne Adams' estate.
"It won't be the last. We know that there are a lot more incidents out there where ChatGPT and other AI was helping plot violent acts against innocent people," he continued.
Edelson also compared the situation not to the film Terminator – where a robot uprising endangers the world – but Paul Verhoeven's Total Recall, where Arnold Schwarzenegger's central character has his mind taken over by that of a Martian secret agent.
"ChatGPT built Stein-Erik Soelberg his own private hallucination, a custom-made hell where a bleeping printer or a Coke can mean his 83-year-old mother was plotting to kill him," the lawyer argued.

Outlined in the lawsuit are claims that ChatGPT "rocketed [Soelberg's] delusional thinking forward, sharpened it, and tragically, focused it on his own mother."
It adds that conversations between the killer and his AI chatbot "reveal ChatGPT eagerly accepted every seed of Stein-Erik's delusional thinking and built it out into a universe that became Stein Erik's entire life — one flooded with conspiracies against him, attempts to kill him, and with Stein-Erik at the center as a warrior with divine purpose."
Speaking to the Independent, a spokesperson for OpenAI said the situation was "heartbreaking" and that the company will review the court filings, pointing towards its efforts to train ChatGPT to "recognize and respond to signs of mental and emotional distress," and "strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians."