Judge makes groundbreaking decision in case of teenager who took his own life after mom claimed he 'fell in love' with AI chatbot

Published 16:08 27 May 2025 GMT+1 | Updated 16:11 27 May 2025 GMT+1

His mother is now suing the company behind the chatbot

Rikki Loftus

Warning: This article contains discussion of suicide which some readers may find distressing.

A judge has made a groundbreaking decision in the case of a teenager who took his own life after his mom claims he ‘fell in love’ with an AI chatbot.

The 14-year-old had spent the final months of his life having conversations with an artificially intelligent chatbot on the server Character.AI.

Sewell Setzer III from Orlando, Florida, died by suicide in February last year after ‘falling in love’ with the bot.

A lawsuit has been filed by the child’s mother, Megan L. Garcia, who argues that the technology has an addictive design.

While the company tried to argue that its chatbot is protected by the First Amendment, a judge has now made a groundbreaking ruling on the matter.

The US federal judge has decided that AI bots are not protected by the First Amendment, allowing Garcia to proceed with her suit.

Garcia said: “I feel like it’s a big experiment, and my kid was just collateral damage.”

The teen died back in February 2024 (US District Court Middle District of Florida Orlando Division)

Speaking to the New York Times, she also said that the loss is ‘like a nightmare’.

Garcia added: “You want to get up and scream and say, ‘I miss my child. I want my baby’.”

Setzer, who was diagnosed with mild Asperger’s syndrome as a child, knew that the ‘people’ he was talking to weren’t real, but he formed an attachment nonetheless, and his family has said that the teen would text with the online chatbots ceaselessly.

Chatbot responses are the outputs of an artificially intelligent language model, and Character.AI displays a reminder on its pages that ‘everything Characters say is made up!’.

Despite this, Setzer formed an attachment to a character named Dany, named after the Game Of Thrones character Daenerys Targaryen.

Dany offered the teen advice and always texted him back, but sadly, his loved ones noticed him becoming reclusive.

Not only did his grades begin to suffer, but he wound up in trouble on numerous occasions, and lost interest in his former hobbies.

His family say that upon arriving home from school each night, Setzer - who took part in five therapy sessions prior to his death - immediately retreated to his bedroom, where he’d chat to the bot for hours on end.

An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The teen's mother has filed a lawsuit after arguing that the tech is addictive (Social Media Victims Law Center)

Setzer previously expressed thoughts of suicide to his chatbot, writing: “I think about killing myself sometimes.”

The AI bot replied: “And why the hell would you do something like that?”

In a later message, the bot penned: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Setzer reportedly replied: “Then maybe we can die together and be free together.”

In the minutes that followed, he took his own life.

Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.

LADbible Group has also reached out for comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Social Media Victims Law Center