Earlier this year, a heartwarming story emerged about a mother who turned to ChatGPT and discovered that her son was suffering from a rare neurological disorder. After more than a dozen medical professionals failed to identify the underlying issue, the AI chatbot provided crucial insights that led to the correct diagnosis and life-saving treatment for the boy.
However, not every medical interaction with ChatGPT results in such positive outcomes. A recent case highlights how relying on AI for health advice can sometimes backfire. In this instance, a person received incorrect guidance from ChatGPT, which led to bromide poisoning—more formally known as bromism—a rare condition associated with neurological and psychiatric symptoms like hallucinations and psychosis.
Trusting ChatGPT Led to a Diagnosis from the Past
A report published in the Annals of Internal Medicine details the case of a 60-year-old man who ended up hospitalized after consulting ChatGPT about his diet. Worried about the harmful effects of ordinary table salt (sodium chloride), he asked the chatbot about alternatives and then began replacing the salt in his food with sodium bromide. By the time he arrived at the emergency department, he had become convinced that a neighbor was secretly poisoning him.
He had been experiencing intense thirst yet grew increasingly paranoid about what he consumed, going as far as distilling his own water and severely restricting his diet. His condition worsened, prompting an emergency evaluation, and within the first day of hospitalization he showed escalating paranoia, hallucinations, and agitation, symptoms severe enough to warrant an involuntary psychiatric hold for grave disability.
The Risks of Relying Too Heavily on AI for Medical Advice
This unusual case of ChatGPT steering someone into danger carries several important warnings. Bromide salts were widely prescribed in the 19th and early 20th centuries for neurological and psychiatric conditions, including epilepsy, and bromism, the toxicity syndrome they cause, was correspondingly common. Their use declined after the 1970s, when regulators restricted bromide-containing medications because of their toxicity.
The clinical record shows that overconsumption of bromide damages the nervous system, producing confusion, delusions, tremors, and fatigue, and in severe cases psychosis or even coma. Although bromide has largely disappeared from mainstream medicine, bromide salts remain easy to obtain, and their misuse can still lead to dangerous outcomes.
How AI Fails Without Context
In this particular case, the medical team did not have access to the patient's ChatGPT conversations, but the report's authors ran their own test. When they asked ChatGPT 3.5 what chloride could be replaced with, the model included bromide in its answer without any warning about toxicity and without asking why the user wanted to know, the kind of follow-up a trained healthcare professional would typically make.
While there are instances where ChatGPT has directly assisted users with health-related inquiries, these depend heavily on the detail and accuracy of the information provided. Experts consistently emphasize the importance of exercising caution. AI systems lack the nuanced understanding and clinical expertise necessary to make reliable diagnoses, especially for rare or complex conditions.
The Limitations of AI in Healthcare
A recent study published in the journal Genes found that even advanced models such as GPT-4.5 struggle to diagnose rare disorders accurately. AI tools can be valuable supplemental resources, but they should never replace professional medical judgment. For genuine health concerns, consulting a qualified healthcare provider remains essential; only trained professionals can thoroughly evaluate clinical features and interpret diagnostic results reliably and safely.
In summary, while AI has made significant strides and can be a helpful tool when used responsibly, caution is vital. Its current capabilities are insufficient for autonomous medical decision-making, particularly concerning complex or uncommon health issues. Ultimately, the guidance and oversight of licensed medical practitioners are irreplaceable in ensuring safe and effective healthcare.