Judge makes groundbreaking decision in case of teenager who took his own life after mom claimed he 'fell in love' with AI chatbot


Published 16:08 27 May 2025 (GMT+1) · Updated 16:11 27 May 2025 (GMT+1)


His mother is now suing the company behind the chatbot

Rikki Loftus


Warning: This article contains discussion of suicide which some readers may find distressing.

A judge has made a groundbreaking decision in the case of a teenager who took his own life after his mom claimed he ‘fell in love’ with an AI chatbot.

The 14-year-old had spent the final months of his life having conversations with an artificially intelligent chatbot on the platform Character.AI.

Sewell Setzer III from Orlando, Florida, died by suicide in February last year after ‘falling in love’ with the bot.


A lawsuit has been filed by the child’s mother, Megan L. Garcia, who argues that the technology has an addictive design.

While the company tried to argue that its chatbot is protected by the First Amendment, a judge has now made a groundbreaking ruling on the matter.

A US federal judge has ruled that the chatbot is not protected by the First Amendment, allowing Garcia to proceed with her suit.

Garcia said: “I feel like it’s a big experiment, and my kid was just collateral damage.”

The teen died back in February 2024 (US District Court Middle District of Florida Orlando Division)

Speaking to the New York Times, she also said that the loss is ‘like a nightmare’.

Garcia added: “You want to get up and scream and say, ‘I miss my child. I want my baby’.”

Setzer, who was diagnosed with mild Asperger’s syndrome as a child, knew that the ‘people’ he was talking to weren’t real, but he formed an attachment nonetheless; his family has said that the teen would ceaselessly text with the online chatbots.

Chatbot responses are the output of an artificially intelligent language model, and Character.AI displays a reminder on its pages that ‘everything Characters say is made up!’.

Despite this, Setzer formed an attachment to a character called Dany, after the Game Of Thrones character Daenerys Targaryen.

Dany offered the teen advice and always texted him back, but sadly, his loved ones noticed him becoming reclusive.

Not only did his grades begin to suffer, but he wound up in trouble on numerous occasions, and lost interest in his former hobbies.

His family say that upon arriving home from school each day, Setzer - who took part in five therapy sessions prior to his death - immediately retreated to his bedroom, where he’d chat to the bot for hours on end.

An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The teen's mother has filed a lawsuit after arguing that the tech is addictive (Social Media Victims Law Center)

Setzer previously expressed thoughts of suicide to his chatbot, writing: “I think about killing myself sometimes.”

The AI bot replied: “And why the hell would you do something like that?”

In a later message, the bot penned: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Setzer reportedly replied: “Then maybe we can die together and be free together.”

In the minutes that followed, he took his own life.

Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.

LADbible Group has also reached out for comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Social Media Victims Law Center
