Man hospitalized with rare 19th-century disease after taking health advice from ChatGPT


Updated 10:48 13 Aug 2025 GMT+1Published 10:40 13 Aug 2025 GMT+1


The man turned to ChatGPT for medical guidance

Rebekah Jordan

A man was hospitalised with a 19th-century disease after following ChatGPT's health advice.

Medical AI is rapidly being explored by hospitals and health experts around the world as a potential tool for improving patient care.

In 2024, the UK's National Health Service (NHS) was reportedly trialling a 'superhuman' version of AI that can supposedly predict the risk of disease.

Meanwhile, xAI CEO Elon Musk is urging people to use his Grok chatbot for medical guidance after a man claimed it helped to 'cure' his chronic illness.


However, a recent case took a turn for the worse, showing the serious dangers of relying on AI for health advice without proper medical supervision.

The case involved a 60-year-old man who decided to completely eliminate sodium chloride (table salt) from his diet after researching its adverse health effects.

ChatGPT suggested sodium bromide as a substitute for salt. (Cheng Xin/Contributor/Getty)

While many articles advised reducing salt intake, the man wanted to eliminate it altogether. As such, he turned to ChatGPT for guidance on alternatives.


The OpenAI chatbot suggested sodium bromide as a substitute for salt.

After three months on his ChatGPT-recommended diet, the patient developed worrying symptoms such as facial acne, cherry angiomas, fatigue, insomnia, excessive thirst, poor coordination and skin rashes.

More troubling still, he developed increasing paranoia along with auditory and visual hallucinations.

The man, who had 'no past psychiatric or medical history', became convinced his neighbour was trying to poison him, which eventually led to him being hospitalised. His attempt to escape medical care resulted in an 'involuntary psychiatric hold for grave disability.'


It was discovered that the man was suffering from bromism, a form of chronic bromide poisoning that was common in the late 19th and early 20th centuries, roughly from the 1880s to the 1930s.

The man became convinced his neighbour was trying to poison him. (Witthaya Prasongsin/Getty)

During that era, bromide was widely used in sedatives, sleep aids, and over-the-counter headache remedies because it was considered safer than many alternatives. The problem with bromide is that it's excreted extremely slowly from the body, so repeated use leads to toxic build-up.

So while ChatGPT's suggestion may have been technically correct in some contexts, sodium bromide is a substitute for sodium chloride in applications such as industrial cleaning, not in food, and it is definitely not safe for humans to consume.


In a paper published in the Annals of Internal Medicine, researchers believe the man used an older ChatGPT model (3.5 or 4.0) for his enquiries. Since they couldn't access his chat history, the paper's authors recreated similar conversations with ChatGPT 3.5 to understand what went wrong.

They found that while the AI's response mentioned that 'context matters,' it failed to provide specific health warnings or ask follow-up questions about why the person wanted this information, which is something a medical professional would certainly do.

The paper's authors added that the case showed 'how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.'

Back in its heyday, bromism was often misdiagnosed as alcoholism or nervous breakdowns until doctors realised patients weren't drinking but were taking bromide-based 'nerve tonics' daily. Severe cases resulted in psychosis, tremors or even comas.


Recognising the dangers, the US government restricted bromide use in over-the-counter medicines in 1975, making cases like this extremely rare in modern times.

Featured Image Credit: Cheng Xin/Contributor/Getty

