Claire knew her son had a parasite in his stomach. Her doctors, however, thought otherwise.
Yes, the little boy would get stomach pains after eating, but without diarrhoea, there probably wasn’t much to worry about. Give it time, they assured her.
It was while on holiday in Greece that Claire’s son started to feel unwell, and with his return to school imminent, she felt nervous about sending him back without knowing what was wrong.
Undeterred by her GP, she logged into ChatGPT, typing: ‘Kids develop tummy ache five days after a holiday – what could it be?’
The AI chatbot suggested, among other things, that he could be infected with a parasite, such as cryptosporidium, a microscopic organism spread through contaminated water.
Armed with this new information, she returned to her doctor, who insisted her son would be sicker if he had one.
To be safe, the GP offered a stool sample test, which came back positive – Claire’s son had, in fact, caught a parasite, most likely from a swimming pool.
‘I wouldn’t normally advocate for AI and diagnoses, but in this instance, it helped us,’ Claire tells Metro. ‘I felt so glad I checked, and that it was backed up by the results, even if the GP was highly doubtful.’
The mum is among the hundreds of millions of people using AI-powered tools to ask for health information and advice every week.
Despite their soaring popularity, healthcare experts are keen to warn Metro that while these models are knowledgeable enough to pass medical licensing exams, they in no way replace a doctor.
‘I don’t expect a rushed doctor to think of everything’
Every person Metro spoke to had their own reasons for turning to the general-purpose chatbot for medical advice.
Lisa Freeman, 42, believes that by using it, she’s taking the strain off the NHS, especially as hours-long queues for A&E and days-long waits for GP appointments have become all too common.
‘I don’t expect a doctor – especially a busy, overworked one – to be able to think of everything in a rushed phone call when they were talking to a toddler about chicken pox five minutes ago,’ she explains.
Another draw to AI for some people is its perceived lack of judgment.
Cybersecurity advisor Bob Gourley says he opens ChatGPT when he’s been up all night worrying about symptoms.
‘I would kind of describe what was going on with my body in a clumsy way, so “my chest tightens when I breathe in deeply” or “I’ve got a weird rash for the past two weeks,” and then ask for any ideas for what might be going on or for questions to ask my doctor,’ he explains.
‘The most comforting thing is that it doesn’t judge me about my questions. It doesn’t roll its eyes or say “that’s an absurd thing to worry about”.’
‘I tell ChatGPT to act like a medical professional’
Whether it’s a stubbed toe or a morning headache, there aren’t many ailments Edward Frank Morris hasn’t turned to ChatGPT for input on.
Edward, 33, has spent years using the app for health advice, but says it’s his friend’s recent hospital stay for a heart attack that really tells the story of why it’s so useful.
‘My friend, in his late 70s, plugged the medical report into ChatGPT after he felt a bit disenchanted by the bedside manner of the doctor,’ the AI advisor explains.
‘The app, however, was empathetic, told him exactly what the report was saying, and how severe it was in a gentle way.’
Edward, from Southampton, often gives ChatGPT the same prompt before consulting it: ‘Act like a medical professional.’
In one conversation that Edward shared with Metro, ChatGPT responded: ‘Absolutely. I’m glad you reached out — I’ll approach this as a medical professional would: calmly, clearly and with care.’
This ‘empathy’, Edward says, is why he thinks the elderly would especially benefit from AI models.
‘Having access to technology that can be kind, talk at your level, and even offer artificial comfort is huge,’ he says. ‘Especially when each sickness, chronic illness, or even a fall could lead you to death.’
It isn’t without error
Chatbots can simplify medical jargon, write questions to ask doctors or walk through a treatment plan.
But AI isn’t without problems – a man wound up in hospital with an 18th-century disease after ChatGPT allegedly gave him dangerous diet advice.
It’s not just patients embracing AI, either. More than half of GPs use it in their clinical practice, according to a survey by the Royal College of General Practitioners (RCGP) and the health organisation Nuffield Trust.
However, it’s worth pointing out that most use algorithmic systems for note-taking, rather than clinical judgement.
Dr Becks Fisher, the director of research and policy at the Nuffield Trust, says GPs like her ‘expect’ patients to come in armed with ChatGPT advice.
‘It’s very difficult to make broad generalisations about the accuracy of AI tools, partly because how useful they are will depend on what information the user prompts them with,’ she adds.
What makes ChatGPT alluring to patients is its delivery: it often dispenses information in an overconfident, sometimes sycophantic way.
This has led Deepak Aulak, a dentist at Toothfairy, to field conversations with patients convinced they’re entitled to NHS implants because a bot told them so.
‘In reality, implants aren’t routinely funded and are usually restricted to rare cases,’ he explains. ‘It can be awkward having to reset expectations when the information has been delivered so confidently online.’
In a way, the AI chatbot is the latest version of patients Googling their symptoms, says Professor Victoria Tzortziou-Brown, the chair of RCGP.
‘This isn’t all bad; it’s encouraging to see patients being curious about their health, and search engines, many of which do offer AI-generated results, are an obvious source of information,’ she says.
‘But with AI chatbots, it’s not always clear where the information is being drawn from or how accurate it is, and the results could contain content that is neither evidence-based nor trustworthy.’
The AI-powered summaries included in some Google search results, for example, have been found to include bogus health advice.
One AI Overview gave phoney information about liver tests that could leave people with serious liver disease believing they were healthy.
Every expert Metro spoke with said that people should never be discouraged from learning about their health. Rather than taking the ‘GP’ part of ChatGPT literally, they should go to reputable resources like the NHS and, above all, their doctor.
‘An AI chatbot cannot replace a conversation with a clinician who knows the patient, understands the context and can make safe, evidence-based decisions,’ adds Professor Tzortziou-Brown.