
AI Medical Advice Risks: Rare Condition Develops After ChatGPT Use

August 12, 2025 · Dr. Jennifer Chen · Health

ChatGPT Health Advice Led to Rare Toxicity, Journal Warns

Table of Contents

  • ChatGPT Health Advice Led to Rare Toxicity, Journal Warns
    • From Salt-Free Diet to Bromide Poisoning
    • Symptoms and Misdiagnosis
    • AI’s Potential to Spread Misinformation
    • OpenAI’s Response and the Rise of GPT-5
    • A Cautionary Tale for the Age of AI
A man developed a rare and potentially dangerous condition – bromide toxicity, known as bromism – after seeking dietary advice from ChatGPT, according to a recent report published in the Annals of Internal Medicine. The case serves as a stark warning about the risks of relying on artificial intelligence for health information.

From Salt-Free Diet to Bromide Poisoning

The 60-year-old patient consulted ChatGPT after becoming concerned about the negative effects of sodium chloride, or table salt. He asked about eliminating chloride from his diet altogether. The chatbot suggested that chloride could be replaced with bromide, though it noted the swap was “likely for other purposes, such as cleaning.”

Over three months, the man began taking sodium bromide, leading to the development of bromism. This condition, once common in the early 20th century and linked to a significant number of psychiatric admissions, had largely faded from medical awareness.

Symptoms and Misdiagnosis

The patient initially presented at a hospital with paranoia, claiming his neighbour was poisoning him, and reported multiple dietary restrictions. He exhibited excessive thirst but was suspicious of the water he was offered. He attempted to leave the hospital within 24 hours and, after being sectioned, was treated for psychosis.

Once he was stabilized, doctors identified a cluster of symptoms indicative of bromism, including facial acne, excessive thirst, and insomnia. The connection to his self-directed bromide supplementation, prompted by ChatGPT’s advice, was then uncovered.

AI’s Potential to Spread Misinformation

The University of Washington researchers who reported the case highlighted the dangers of unchecked AI-generated health advice. They found that when they asked ChatGPT directly about chloride alternatives, the chatbot again suggested bromide without offering a health warning or asking why the information was sought, a step a medical professional would take.

“This case highlights how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes,” the authors wrote. They cautioned that AI chatbots can “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

OpenAI’s Response and the Rise of GPT-5

The incident comes as OpenAI, the creator of ChatGPT, recently launched an upgraded version powered by the GPT-5 model. The company claims the new iteration is better at handling health-related queries and at proactively flagging potential concerns, such as serious physical or mental illness.

However, OpenAI emphasizes that ChatGPT is not a substitute for professional medical advice. Notably, the patient in this case appears to have used an earlier version of the chatbot, prior to the GPT-5 upgrade.

A Cautionary Tale for the Age of AI

This case underscores the critical need for caution when using AI tools for health information. While AI has the potential to revolutionize healthcare, relying on it for self-diagnosis or treatment can have serious, even dangerous, consequences. The Annals of Internal Medicine report serves as a timely reminder that professional medical guidance remains paramount.
