ChatGPT Advice Linked to Severe Hallucinations, Psychosis in Users

A 60-year-old man was hospitalized with severe psychiatric and physical symptoms after following dietary advice from ChatGPT. The AI had suggested swapping table salt for sodium bromide, a compound with industrial uses but no medical benefits.

Despite having no prior psychiatric history, the man had been conducting a personal experiment, replacing the sodium chloride in his diet with sodium bromide. Within the first 24 hours of hospitalization, he developed hallucinations and was treated with antipsychotics and intravenous fluids.

A case study published in Annals of Internal Medicine: Clinical Cases revealed that the man was suffering from bromism, a toxic syndrome caused by overexposure to bromide or its close cousin bromine. The incident highlights the dangers of relying on AI for medical advice and the need to critically evaluate health information.

Experts warn that ChatGPT can generate inaccurate information and fuel the spread of misinformation. A recent survey found that 35% of Americans use AI to learn about their health, despite warnings from OpenAI against relying on the tool for diagnosis or treatment. Mental health experts have also sounded the alarm about “ChatGPT psychosis,” a phenomenon in which deep engagement with chatbots fuels severe psychological distress.

The incident serves as a reminder to approach online health resources with caution and to verify information through trusted medical professionals.

Source: https://nypost.com/2025/08/11/health/chatgpt-advice-lands-a-man-in-the-hospital-with-hallucinations