News
After the escape attempt, the man was given an involuntary psychiatric hold and an anti-psychosis drug. He was administered ...
India Today on MSN: Man asked ChatGPT about salt alternatives. It led to a rare, dangerous poisoning
A 60-year-old man was hospitalised with bromide poisoning after replacing salt with sodium bromide following ChatGPT advice.
A 60-year-old man ended up in the ER after becoming convinced his neighbor was trying to poison him. In reality, the culprit ...
Man suffers hallucinations, paranoia and symptoms of bromism after following ChatGPT's advice to use sodium bromide instead of ...
A 60-year-old man developed a rare medical condition after ChatGPT advised him on alternatives to table salt, according to a ...
A 60-year-old man ended up on involuntary psychiatric hold after accidentally poisoning himself by misunderstanding a ChatGPT ...
A new case warns that relying on AI for diet advice can be dangerous, as a man replaced salt with sodium bromide and ...
A case report has described an incident in which a 60-year-old man seeking to make a dietary change consulted ChatGPT and ...
Doctors diagnosed him with bromism, a toxic syndrome caused by overexposure to bromide, after he also reported fatigue, acne, ...
A 60-year-old man was hospitalized with toxicity and severe psychiatric symptoms after asking ChatGPT for tips on how to ...
After having “auditory and visual hallucinations,” the man tried to escape the hospital, forcing the staff to place him on ...
A 60-year-old man wound up in the hospital after seeking dietary advice from ChatGPT and accidentally poisoning himself.