Man poisons himself after taking ChatGPT’s dietary advice

  • A 60-year-old man was hospitalized for bromide poisoning
  • The man asked ChatGPT for alternatives to salt
  • Bromide toxicity was far more common in the 20th century
The logo for OpenAI, the maker of ChatGPT, appears on a mobile phone, in New York, Jan. 31, 2023. (AP Photo/Richard Drew, File)

(NewsNation) — A 60-year-old man wound up in the hospital after seeking dietary advice from ChatGPT and accidentally poisoning himself.

According to a report published in the Annals of Internal Medicine, the man wanted to eliminate salt from his diet and asked ChatGPT for a replacement.

The AI platform recommended sodium bromide, a chemical often used in pesticides, as a substitute. The man then purchased sodium bromide online and used it in place of salt for three months.

The man eventually went to the hospital, fearing his neighbor was trying to poison him. Doctors discovered he was suffering from bromide toxicity, which caused paranoia and hallucinations.

Bromide toxicity was more common in the 20th century when bromide salts were used in various over-the-counter medications. Cases declined sharply after the U.S. Food and Drug Administration phased out bromide between 1975 and 1989.

The case highlights the dangers of relying on ChatGPT for complex health decisions without sufficient understanding or proper AI literacy.

Copyright 2026 Nexstar Broadcasting, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
