Company responds after 14-year-old boy took his own life after 'falling in love' with one of its chatbots

Published 15:24 25 Oct 2024 GMT+1

Sewell Setzer III was reportedly lured in by a Game of Thrones-inspired chatbot

Tom Chapman

Featured Image Credit: Tech Justice Law Project / US District Court Middle District of Florida Orlando Division


Warning: This article contains discussion of suicide which some readers may find distressing.

There's a tragic side to our new reliance on artificial intelligence. While some of us get excited about how it can make our lives easier, or even detect deadly diseases before doctors can, there are growing concerns about how it can negatively impact our lives.

There have always been fears about the potential to fall in love with machines; 2013's Her built an entire movie around that exact premise. The idea has become shockingly relevant due to the tragic story of a 14-year-old boy who took his own life after 'falling in love' with an internet chatbot.

A lawsuit has been filed in the United States after Sewell Setzer III, from Orlando, Florida, spent months speaking to chatbots on the Character.AI platform.

Advert

Sewell Setzer III and his mother, Megan Garcia (Social Media Victims Law Center)

Setzer tragically took his own life on February 28, 2024. His mother claims he spent an excessive amount of time talking to chatbots and accuses Character.AI of promoting an addictive design, adding that her son would ceaselessly text with the online bots.

Although the Character.AI site warns users that 'everything Characters say is made up', Sewell's diary shows how he grew attached to bots he'd made himself or that others had created.

He apparently became attached to a chatbot called Dany (named after the Game of Thrones character), and despite taking part in five therapy sessions per week before his death, his family say he became increasingly withdrawn.


Character.AI has spoken out about the lawsuit, offering its condolences to Sewell Setzer III's family. It reiterates that it takes the safety of its users very seriously and is currently implementing a number of new procedures.

Community Notes attached to the response point out that while Character.AI isn't being directly blamed for Setzer taking his own life, it is being criticized for a lack of proper guidance or intervention.

Alongside promising new 'guardrails' for users under the age of 18, Character.AI has hired a Head of Trust and Safety and a Head of Content Policy, and has brought on more engineering safety support team members.

Importantly, there's a new pop-up resource that will direct users to the National Suicide Prevention Lifeline if certain phrases related to self-harm or suicide are entered.

There are also plans to alter the models available to those under the age of 18, a revised disclaimer reminding users that bots aren't real people, and a notification when someone has spent over an hour on the platform.

Character.AI vows to continue implementing new policies and features as the platform evolves, hopefully ensuring situations like the Setzer case can be prevented in the future.

If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.
