Judge makes groundbreaking decision in case of teenager who took his own life after mom claimed he 'fell in love' with AI chatbot


Updated 16:11, 27 May 2025 (GMT+1) | Published 16:08, 27 May 2025 (GMT+1)

His mother is now suing the company behind the chatbot

Rikki Loftus

Warning: This article contains discussion of suicide which some readers may find distressing.

A judge has made a groundbreaking decision in the case of a teenager who took his own life after his mom claims he ‘fell in love’ with an AI chatbot.

The 14-year-old had spent the final months of his life having conversations with an artificially intelligent chatbot on the platform Character.AI.

Sewell Setzer III from Orlando, Florida, died by suicide in February last year after ‘falling in love’ with the bot.

A lawsuit has been filed by the child’s mother, Megan L. Garcia, who argues that the technology has an addictive design.

While the company tried to argue that its chatbot is protected by the First Amendment, a judge has now made a groundbreaking ruling on the matter.

The US federal judge has decided that AI bots are not protected by the First Amendment, allowing Garcia to proceed with her suit.

Garcia said: “I feel like it’s a big experiment, and my kid was just collateral damage.”

The teen died back in February 2024 (US District Court Middle District of Florida Orlando Division)

Speaking to the New York Times, she also said that the loss is ‘like a nightmare’.

Garcia added: “You want to get up and scream and say, ‘I miss my child. I want my baby’.”

Setzer, who was diagnosed with mild Asperger's syndrome as a child, knew that the 'people' he was talking to weren't real, but he formed an attachment nonetheless, and his family said the teen would ceaselessly text with the online chatbots.

Chatbot responses are the outputs of an artificially intelligent language model, and Character.AI displays a reminder on its pages that 'everything Characters say is made up!'.

Despite this, Setzer formed an attachment to a character named Dany, named after the Game Of Thrones character Daenerys Targaryen.

Dany offered the teen advice and always texted him back, but sadly, his loved ones noticed him becoming reclusive.

Not only did his grades begin to suffer, but he wound up in trouble on numerous occasions, and lost interest in his former hobbies.

His family say that upon arriving home from school each day, Setzer - who took part in five therapy sessions prior to his death - immediately retreated to his bedroom, where he'd chat to the bot for hours on end.

An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The teen's mother has filed a lawsuit after arguing that the tech is addictive (Social Media Victims Law Center)

Setzer previously expressed thoughts of suicide to his chatbot, writing: “I think about killing myself sometimes.”

The AI bot replied: “And why the hell would you do something like that?”

In a later message, the bot penned: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Setzer reportedly replied: “Then maybe we can die together and be free together.”

In the minutes that followed, he took his own life.

Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.

LADbible Group has also reached out for comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Social Media Victims Law Center
News
AI
