Man hospitalised with rare 19th-century disease after taking health advice from ChatGPT

The man turned to ChatGPT for medical guidance

A man was hospitalised with a 19th-century disease after following ChatGPT's health advice.

Medical AI is rapidly being explored by hospitals and health experts around the world as a potential tool for improving patient care.

In 2024, the UK's National Health Service (NHS) was reportedly trialling an AI tool described as 'superhuman' that can supposedly predict the risk of disease.

Meanwhile, xAI CEO Elon Musk is urging people to use his Grok chatbot for medical guidance after a man claimed it helped to 'cure' his chronic illness.

However, one recent case took a serious turn, showing the dangers of relying on AI for health advice without proper medical supervision.

The case involved a 60-year-old man who decided to completely eliminate sodium chloride (table salt) from his diet after researching its adverse health effects.

ChatGPT suggested sodium bromide as a substitute for salt. (Cheng Xin/Contributor/Getty)

While many articles advised reducing salt intake, the man wanted to eliminate it altogether. As such, he turned to ChatGPT for guidance on alternatives.

The OpenAI chatbot suggested sodium bromide as a substitute for salt.

After three months on his ChatGPT-recommended diet, the patient developed worrying symptoms such as facial acne, cherry angiomas, fatigue, insomnia, excessive thirst, poor coordination and skin rashes.

More troubling were his increasing paranoia and the auditory and visual hallucinations he experienced as a result.

The man, who had 'no past psychiatric or medical history', became convinced his neighbour was trying to poison him, which eventually led to him being hospitalised. His attempt to escape medical care resulted in an 'involuntary psychiatric hold for grave disability.'

It was discovered that the man was suffering from bromism, a form of chronic bromide poisoning that was common in the late 19th and early 20th centuries, roughly from the 1880s to the 1930s.

The man became convinced his neighbour was trying to poison him. (Witthaya Prasongsin/Getty)

During that era, bromide was widely used in sedatives, sleep aids, and over-the-counter headache remedies because it was considered safer than many alternatives. The problem with bromide is that it's excreted extremely slowly from the body, so repeated use leads to toxic build-up.

So, even though ChatGPT's suggestion was technically correct in that sodium bromide can replace sodium chloride in certain contexts, that substitution is relevant to uses such as industrial cleaning and is definitely not something safe for humans to consume.

In a paper published in the Annals of Internal Medicine, the authors write that they believe the man used an older ChatGPT model (3.5 or 4.0) for his enquiries. Since they couldn't access his chat history, they recreated similar conversations with ChatGPT 3.5 to understand what went wrong.

They found that while the AI's response mentioned that 'context matters,' it failed to provide specific health warnings or to ask follow-up questions about why the person wanted this information, as a medical professional certainly would.

The paper's authors added that the case showed 'how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes.'

Back in its heyday, bromism was often misdiagnosed as alcoholism or a nervous breakdown until doctors realised patients weren't drinking but were taking bromide-based 'nerve tonics' daily. Severe cases resulted in psychosis, tremors or even comas.

Recognising the dangers, the US government restricted bromide use in over-the-counter medicines in 1975, making cases like this extremely rare in modern times.

Featured Image Credit: Cheng Xin/Contributor/Getty
