Heartbreaking final messages 14-year-old boy sent to AI chatbot moments before taking his own life


Published 16:58 24 Oct 2024 GMT+1


The teen messaged the bot seconds before his death

Rikki Loftus

Featured Image Credit: US District Court Middle District of Florida Orlando Division/Social Media Victims Law Center

Warning: This article contains discussion of suicide which some readers may find distressing.

The heartbreaking final messages that a 14-year-old boy sent to an AI chatbot just moments before taking his own life have been revealed.

The teenager formed an attachment to the bot, and his family have said he would spend hours talking to it.

Sewell Setzer III died by suicide earlier this year and his family have shared the heartbreaking messages he sent just before his death.

The teen died back in February (US District Court Middle District of Florida Orlando Division)

After ‘falling in love’ with an AI chatbot named Dany - named after the Game of Thrones character Daenerys Targaryen - Sewell expressed thoughts of suicide.

Speaking to the bot on the platform Character.AI, the teen wrote: “I think about killing myself sometimes.”

The AI bot replied: “And why the hell would you do something like that?”

In a later message, the chatbot wrote: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Sewell reportedly replied: “Then maybe we can die together and be free together.”

Speaking to CBS Mornings, Sewell’s mother, Megan L. Garcia, shared her son’s final messages with the bot.

She said: “He expressed being scared, wanting her affection and missing her. She replies, ‘I miss you too,’ and she says, ‘please come home to me.’

“He says, ‘what if I told you I could come home right now?’ and her response was, ‘please do my sweet king’.”

In the minutes that followed, Sewell took his own life in the bathroom of his home.

The teenager has two younger siblings, and everyone in the family was at home at the time of his death.

Garcia revealed that her five-year-old son saw the aftermath of Sewell’s death.

Sewell's mother has filed a lawsuit (Social Media Victims Law Center)

Now, she is suing Character.AI, arguing that the tech has an addictive design.

She said: “I feel like it’s a big experiment, and my kid was just collateral damage.”

Chatbot responses are the outputs of an artificially intelligent language model, and Character.AI displays a reminder on its pages that ‘everything Characters say is made up!’

The family say that upon arriving home from school each day, Sewell - who took part in five therapy sessions prior to his death - would immediately retreat to his bedroom, where he’d chat to the bot for hours on end.

An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.

LADbible Group has also reached out for comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.
