


Warning: This article contains discussion of suicide which some readers may find distressing.
Police have revealed the disturbing messages an accused serial killer allegedly sent to ChatGPT while planning her murders.
While AI was designed to help people work more efficiently and productively, increasingly troubling reports describe the technology being misused for harmful purposes. ChatGPT's tendency to provide overly accommodating responses is also believed to be fuelling dangerous mental health episodes that some experts are labelling 'AI psychosis.'
Several cases highlight the lack of safety measures in these systems, leading to tragic outcomes for young users like 16-year-old Adam Raine and 14-year-old Sewell Setzer III.
We've also seen reports of ChatGPT singing a 'suicide lullaby' to one user, while another case involved the OpenAI chatbot allegedly convincing a 56-year-old man to murder his own mother.

Now, an accused serial killer in South Korea allegedly used ChatGPT to help plan a series of murders.
Identified by her surname Kim, the 21-year-old woman faces allegations of killing two men by spiking their drinks with benzodiazepines. She had reportedly been prescribed the medication for a mental illness, the Korea Herald and the BBC noted.
Before the men’s deaths, Kim had asked ChatGPT about the risks of administering the drugs, such as: “What happens if you take sleeping pills with alcohol?”, “How much would be considered dangerous?” and “Could it be fatal?”
Kim was initially charged on 11 February for causing death through inflicting bodily injury. However, authorities have since filed murder charges against her after investigators uncovered online evidence suggesting she deliberately planned the killings, the Korea Herald reported.
“Kim repeatedly asked questions related to drugs on ChatGPT,” an investigator told the newspaper. “She was fully aware that consuming alcohol together with drugs could result in death.”
The first attack took place on 28 January at 9:24 pm when Kim entered a motel in Suyu-dong, Gangbuk-gu, with a man in his twenties and left alone two hours later. The man was found dead on the bed the next evening.

Her second attack on 9 February followed a similar pattern in a different motel with another man in his twenties.
Back in December, Kim had also reportedly attempted to kill a man she was dating at the time by giving him a sedative-laced drink in a café parking lot in Namyangju, Gyeonggi province. The man lost consciousness but survived, and his condition was not life-threatening.
While Kim admits she secretly drugged the victims' beverages with her prescribed pills, she insists there was no intent to murder them.
The case adds to growing concerns about AI safety and the need for stronger guardrails to prevent the technology from being used to plan or facilitate harmful acts.
OpenAI recently retired its GPT-4o model, which had been criticised for being overly agreeable, and released GPT-5, which was designed to be less personal and accommodating.
UNILAD Tech has reached out to OpenAI for comment.
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.