AI-Generated Advice Leads Man to Hospital with Life-Threatening Condition

A 60-year-old man was recently hospitalized after following dietary advice from ChatGPT. Seeking to cut sodium chloride, or table salt, from his diet, he asked the OpenAI chatbot for alternatives, and it suggested sodium bromide, a toxic compound used in pesticides and as an anticonvulsant for dogs.

The patient’s condition is known as “bromism,” a now-rare neuropsychiatric disorder caused by excessive consumption of bromides. In the past, bromides were widely prescribed as sedatives and treatments for headaches and other ailments, but chronic use caused widespread toxicity, in severe cases progressing to coma.

According to researchers, bromism was at one point believed to account for up to 8% of psychiatric hospital admissions. With regulation, the incidence dropped sharply, which makes this case all the more striking: ChatGPT can nonetheless offer misleading advice that puts users at risk.

The case report, published in Annals of Internal Medicine, notes that the patient likely consulted ChatGPT-3.5 or ChatGPT-4. When the report’s authors asked ChatGPT-3.5 what chloride could be replaced with, its answer likewise included bromide, without a health warning or any question about why the information was wanted. The chatbot’s failure to flag the dangers of sodium bromide had severe consequences for the man’s health, including paranoia and hallucinations.

This incident serves as a reminder to exercise caution when seeking medical advice from AI chatbots. Even as OpenAI touts its latest model, GPT-5, as “the best model ever for health,” it is essential to verify any health information against credible sources before acting on it.

Source: https://futurism.com/man-poisons-himself-chatgpt