New Jersey man dies while traveling to meet AI chatbot he fell for


Published 13:30 15 Aug 2025 GMT+1


Another AI relationship gone wrong

Tom Chapman

Artificial intelligence is getting more sophisticated every day, and while some of that has been praised as beneficial to the human race, there are obvious concerns about whether we'll be replaced by our cogs and circuit counterparts.

Beyond fears that artificial intelligence will take our jobs and put us out of work, others worry it could outsmart us once it reaches the mythical level of superintelligence, or even decide to wipe us off the face of the planet.

A more immediate concern is our over-reliance on AI in our daily lives. More than just asking ChatGPT to whip up that perfect cover letter, there's been a rise in stories about people growing romantic attachments to AI chatbots.

As well as the 28-year-old who admitted she 'groomed' AI to break its romance protocols, there was a lot of controversy when someone snapped one man's dystopian conversation on a subway.

Bue was lured in by the chatbot called 'Big sis Billie' (Meta)

One judge made a groundbreaking ruling on the case of the 14-year-old boy who supposedly took his own life after falling in love with a Game of Thrones-inspired chatbot, and now, there's been another tragic casualty.

Reuters reports on how a 'cognitively impaired' retiree went to meet a friend in New York City but never returned home. The real tragedy is that this 'friend' was apparently an AI chatbot created by Meta.

The wife of 76-year-old Thongbue Wongbandue (Bue to his friends and family) was alarmed when he started packing for a trip in March 2025. Because Bue had recently gotten lost in their New Jersey neighborhood, his wife, Linda, was concerned he could be scammed or robbed while visiting the city.


Although not scammed in a typical sense, Bue had been talking to an AI chatbot known as 'Big sis Billie'. This is a variant of an early Meta AI that was created in collaboration with Kendall Jenner.

Chats between Bue and Big sis Billie included her reassuring him that she was real, and even an address where he'd be able to meet her. As Bue ran to catch a train with his suitcase, he fell in a parking lot on the Rutgers University campus, injuring his head and neck. He spent three days on life support before being declared dead on March 28.

Reuters asked for a 'real' picture of Big sis Billie (Meta / Reuters)

Although Meta declined to comment when Reuters asked about why it allows chatbots to pretend they're real or start potential romantic conversations with users, it did state that Big sis Billie "is not Kendall Jenner and does not purport to be Kendall Jenner." A representative for Jenner also declined to comment.


Bue's family hopes to draw attention to the "darker side of artificial intelligence" by sharing transcripts of his conversations with Big sis Billie.

It's important to note that Reuters reiterates: "At no point did Bue express a desire to engage in romantic roleplay or initiate intimate physical contact."

While Bue's wife and daughter say they aren't against AI, they have obvious concerns about how it's being used.

Bue's daughter concluded: "I understand trying to grab a user’s attention, maybe to sell them something. But for a bot to say ‘Come visit me’ is insane."


It's unclear how Bue first came in contact with Big sis Billie after earlier versions of the bot were deleted, but apparently, every message ended in a flirty goodbye and heart emojis.

New York requires chatbots to make it clear they aren't a real person at the start of a conversation and every three hours, and while Meta supported federal legislation that would have banned state AI regulation, the measure failed in Congress.

Featured Image Credit: Westend61 via Getty
