
60-Year-Old Develops Rare Toxidrome After ChatGPT Advice


A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice, according to a case study published Tuesday in an American College of Physicians journal.

The man, who remained anonymous in the case study, told doctors he had eliminated sodium chloride, commonly known as table salt, from his diet after reading about its negative health effects. He said he could find sources on how to reduce salt intake, but none on how to eliminate it completely.

Inspired by his nutrition studies in college, the man decided to completely eliminate sodium chloride from his diet as a personal experiment, in consultation with ChatGPT, researchers wrote. He maintained multiple dietary restrictions and even distilled his own water at home.

“For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning,” the case study read.

While excess sodium can raise blood pressure and increase the risk of other health problems, the body still needs a certain amount of it to function.

The man, who had no psychiatric history, eventually ended up at the hospital, worried that his neighbor was poisoning him. He told doctors he was very thirsty, but paranoid about the water he was offered.

“In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability,” the study read.

Doctors concluded that the man was suffering from bromism, or bromide toxicity, a condition that is rare today but was more common in the early 20th century. The researchers noted that bromide was found in several over-the-counter medicines back then and was estimated to contribute to up to 8% of psychiatric admissions at the time.

The hospital treated the man for psychosis and discharged him weeks later. His case highlights the potential pitfalls of turning to AI for medical advice.



Dr. Margaret Lozovatsky, a pediatrician, warned last year that AI often misses crucial context.

“Even if the source is appropriate, when some of these tools are trying to combine everything into a summary, it’s often missing context clues, meaning it might forget a negative,” she told the American Medical Association. “So, it might forget the word ‘not’ and give you the opposite advice.”


