
ChatGPT helps ‘uncover rare condition’ in woman after years of misdiagnosis



ChatGPT, the generative AI chatbot, has helped identify a woman’s rare condition after years of misdiagnosis by doctors.

 

The 23-year-old, identified as Phoebe Tesoriere from Cardiff, Wales, said she was repeatedly misdiagnosed and told she had anxiety, depression, and epilepsy.

 

She also alleged she was warned she could be treated as a mental health patient if she continued returning to A&E.

 

Tesoriere said she understood the hospital’s difficulty in diagnosing her, as she had also initially believed her condition was linked to complications from her childhood.

 

 

“All my childhood I had a limp. I was born without a hip socket and had operations as a baby, so thought it was to do with that,” Tesoriere was quoted by the BBC as saying.

 

As a child, Tesoriere also experienced balance issues and was assessed for dyspraxia, a condition that affects coordination, but it was later ruled out.

 

At 19, she collapsed and suffered a seizure at work, but said doctors attributed the incident to anxiety, which was then added to her medical records.

 

 

“I had no history of anxiety, I was a really happy, bubbly person,” she said.

 

In 2022, she was diagnosed with epilepsy and placed on medication. However, in December 2024, her health deteriorated again as she struggled to keep her medication down, leading to recurring seizures.

 

Her condition worsened, affecting her ability to walk. She was later misdiagnosed with Todd’s paralysis — a temporary neurological condition that can occur after a seizure.

 

In January 2025, she fell down the stairs, resulting in a three-month hospital stay, with tests failing to provide clear answers.

 

 

Months later, in July 2025, she suffered a severe seizure that left her in a coma for three days. Following her recovery, she claimed a doctor told her she did not have epilepsy, but anxiety instead.

 

It was after the three-day coma that Tesoriere turned to the AI chatbot and entered her symptoms; it suggested several possible conditions, including hereditary spastic paraplegia (HSP).

 

HSP is a group of inherited neurodegenerative disorders characterized by progressive weakness and spasticity (stiffness) in the legs due to upper motor neuron degeneration.

 

She presented this to her doctor, who agreed it could be a “plausible reason”, and genetic testing later confirmed the diagnosis.

 

 

Tesoriere said she decided to enter her symptoms into the AI chatbot because she found the experience “really lonely”.

 

A spokesperson for the Cardiff and Vale health board said they could not provide details on the case due to patient confidentiality, but noted that Tesoriere could reach out to their concerns team to discuss her experience.

 

“As it would be inappropriate to comment on an individual patient case, we are unable to comment further,” the spokesperson said.

 

“Phoebe is welcome to contact our concerns team should she wish to discuss any aspect of the care she received at Cardiff and Vale University Health Board.”

 

 

Rebeccah Tomlinson, a general practitioner serving Cardiff and Vale of Glamorgan, said doctors face growing pressure and cannot be expected to know everything, noting that patients sharing information helps guide consultations.

 

She added that AI tools can serve as a starting point, but should be followed by proper medical consultation, stressing that care works best when both doctor and patient communicate openly.

 

“It’s difficult for GPs to know everything. With the pressure on the NHS, we have to know even more. Patients coming with information helps me understand what they are thinking and guide the discussion more clearly,” she said.

 

“It’s good as a starting talking point [AI tools] which should be followed by going to a medical professional to discuss concerns further. It’s helpful for patients to come armed with information but the GP has to be open and receptive to the patient. General practice has to be a two-way conversation.”

 

Over the years, debate has continued over the use of ChatGPT for medical purposes.

 

A study from the University of Oxford found that AI chatbots can provide inaccurate and inconsistent medical advice, posing potential risks to users.

 

The research showed that users often receive a mix of reliable and misleading responses, making it difficult to determine what information to trust.

 

In January, a new ChatGPT feature was launched in the United States, aimed at analysing patients’ medical records to provide “better answers”, according to developer OpenAI.

 
