Man, 60, gave himself rare condition after going to ChatGPT for diet advice

He became convinced that his neighbour had poisoned him (Picture: Getty Images)

A man gave himself a rare condition after turning to ChatGPT for dietary advice.

The unnamed man, 60, told doctors he was trying to eliminate table salt from his diet, having read about its negative effects.

After consulting the artificial intelligence (AI) chatbot, he decided to remove salt, also known as sodium chloride, from his diet completely.

He conducted a ‘personal experiment’, replacing it with sodium bromide he had purchased online, a compound used in the early 20th century to make sedatives.

The man, who had no psychiatric history, was taken to hospital after becoming convinced his neighbour had poisoned him.

A report of the man’s case, detailed in the Annals of Internal Medicine, said the patient developed bromism, caused by overexposure to bromide.

Doctors described the ‘increasingly paranoid’ man suffering from hallucinations (Picture: Getty Images)

The study said: ‘In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability.’

He also suffered insomnia, fatigue, muscle coordination issues and excessive thirst, his doctors noted.

Medics treated the man’s condition and discharged him a few weeks later.

The article’s authors said it was unclear exactly what advice the virtual assistant gave the man, as they could not access his chat log.

When they asked the app what salt should be replaced with, bromide was among the recommendations.

The bot did note that ‘context matters’, though it did not provide a health warning, ‘as we presume a medical professional would do’, the authors wrote.

ChatGPT’s owner advises against using the app as a diagnostic tool (Picture: Smith Collection/Gado/Getty Images)

When Metro asked the same question today, bromide was no longer among the suggestions; the response instead included an ‘Important Safety Note: Avoid Toxic Alternatives’.

The advice reads: ‘A recent medical case made headlines: a man replaced salt with sodium bromide, based on advice from ChatGPT, which led to bromism – a rare and dangerous condition (causing paranoia, psychosis, insomnia, skin issues). He required hospitalisation.

‘Bottom line: Never use unverified, off‑label substances like sodium bromide as salt substitutes. Always rely on safe, reputable options and seek medical guidance when in doubt.’

Excess sodium chloride has been linked to negative health effects, such as raised blood pressure, but health experts stress that some salt is part of a balanced diet.

According to the UK Health Security Agency, bromide is used in water sanitisers for pools and spas. Low-level exposure is unlikely to cause adverse health effects.

People are increasingly using ChatGPT and other AI-powered chatbots for day-to-day advice, from writing emails to planning their monthly budgets.

Researchers have found AI-powered bots may offer incorrect health advice (Picture: Jeff Moore/PA Wire)

About one in six Americans has sought medical advice from ChatGPT, according to a recent survey. In the UK, one in five GPs use AI tools.

Studies have shown that chatbots can offer incorrect health advice, sometimes citing medical reports that do not exist.

The report authors concluded: ‘While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualised information.

‘It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.’

OpenAI, the tech company that owns ChatGPT, notes in its service terms that the tool is not intended for use in diagnosing or treating a health condition.

The company’s terms of use state: ‘You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.’

OpenAI has been approached for comment.

