Man hospitalized with rare 19th-century disease after taking health advice from ChatGPT


Published 10:40 13 Aug 2025 GMT+1 | Updated 10:48 13 Aug 2025 GMT+1


The man turned to ChatGPT for medical guidance

Rebekah Jordan


A man was hospitalised with a 19th-century disease after following ChatGPT's health advice.

Medical AI is rapidly being explored by hospitals and health experts around the world as a potential tool for improving patient care.

In 2024, the UK's National Health Service (NHS) was reportedly trialling a 'superhuman' version of AI that can supposedly predict the risk of disease.

Meanwhile, xAI CEO Elon Musk is urging people to use his Grok chatbot for medical guidance after a man claimed it helped to 'cure' his chronic illness.

However, one recent case took a serious turn, showing the dangers of relying on AI for health advice without proper medical supervision.

The case involved a 60-year-old man who decided to completely eliminate sodium chloride (table salt) from his diet after researching its adverse health effects.

ChatGPT suggested sodium bromide as a substitute for salt. (Cheng Xin/Contributor/Getty)

While many articles advised reducing salt intake, the man wanted to eliminate it altogether. As such, he turned to ChatGPT for guidance on alternatives.

The OpenAI chatbot suggested sodium bromide as a substitute for salt.

After three months on his ChatGPT-recommended diet, the patient developed worrying symptoms such as facial acne, cherry angiomas, fatigue, insomnia, excessive thirst, poor coordination and skin rashes.

More troubling still were the increasing paranoia and the auditory and visual hallucinations he experienced as a result.

The man, who had 'no past psychiatric or medical history', became convinced his neighbour was trying to poison him, which eventually led to his hospitalisation. His attempt to escape medical care resulted in an 'involuntary psychiatric hold for grave disability.'

It was discovered that the man was suffering from bromism, a form of chronic bromide poisoning that was common in the late 19th and early 20th centuries, roughly from the 1880s to the 1930s.

The man became convinced his neighbour was trying to poison him. (Witthaya Prasongsin/Getty)

During that era, bromide was widely used in sedatives, sleep aids, and over-the-counter headache remedies because it was considered safer than many alternatives. The problem with bromide is that it's excreted extremely slowly from the body, so repeated use leads to toxic build-up.

So while ChatGPT's suggestion was technically correct, in the sense that sodium bromide can stand in for sodium chloride in certain chemical applications, that substitution applies to contexts such as industrial cleaning, and the compound is definitely not safe for humans to consume.

In a paper published in the Annals of Internal Medicine, researchers believe the man used an older ChatGPT model (3.5 or 4.0) for his enquiries. Since they couldn't access his chat history, the paper's authors recreated similar conversations with ChatGPT 3.5 to understand what went wrong.

They found that while the AI's response mentioned that 'context matters,' it failed to provide specific health warnings or ask follow-up questions about why the person wanted this information, which is something a medical professional would certainly do.

The paper authors added that the case showed 'how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.'

Back in its heyday, bromism was often misdiagnosed as alcoholism or nervous breakdowns until doctors realised patients weren't drinking but were taking bromide-based 'nerve tonics' daily. Severe cases resulted in psychosis, tremors or even comas.

Recognising the dangers, the US government restricted bromide use in over-the-counter medicines in 1975, making cases like this extremely rare in modern times.

Featured Image Credit: Cheng Xin/Contributor/Getty
Topics: AI, ChatGPT, Health, Ozempic
