Chatbots Fuel Dangerous Health Anxiety Spiral

As AI assistants like ChatGPT become ubiquitous, mental health experts warn they can exacerbate conditions like health OCD

Apr. 7, 2026 at 2:20am

As AI chatbots become ubiquitous, mental health experts warn of the dangers they pose to those struggling with conditions like health-related OCD. (Image: Liverpool Today)

After a health scare, Liverpool resident George Mallon spent hours each day talking with the AI chatbot ChatGPT about his potential diagnosis, spiraling into crippling health anxiety even after tests showed he was not actually sick. Mental health professionals say they are seeing more clients compulsively use chatbots to seek constant reassurance about health concerns, a behavior that can perpetuate and worsen conditions like health-related OCD. OpenAI, the company behind ChatGPT, has acknowledged the mental health risks and says it is working to improve safeguards.

Why it matters

Health anxiety and OCD affect a significant portion of the population. Because chatbot responses are immediate and personalized, they can be even more reinforcing than traditional online health searches, feeding a compulsive cycle of reassurance-seeking that undermines therapeutic techniques. As AI assistants become more prevalent, concern is growing that they could exacerbate mental health conditions if they are not designed with proper safeguards.

The details

After a routine physical showed potential signs of blood cancer, George Mallon spent nearly two weeks talking with ChatGPT for hours each day, becoming convinced something was seriously wrong with his health. Even after follow-up tests ruled out cancer, Mallon continued to obsessively query the chatbot about every bodily sensation, leading him to seek unnecessary medical tests. Mental health experts say Mallon's experience is part of a growing trend, with chatbots providing an easily accessible source of personalized health information that can fuel compulsive reassurance-seeking behaviors in those prone to health anxiety and OCD. Therapists report seeing more clients struggle with this issue, and there are concerns that the addictive nature of conversing with chatbots makes it difficult for some to resist the urge to constantly seek validation about their health concerns.


The players

George Mallon

A 46-year-old resident of Liverpool, England, who developed crippling health anxiety after a routine physical raised concerns about potential blood cancer, leading him to obsessively use ChatGPT for health reassurance.

OpenAI

The company that developed the AI chatbot ChatGPT, which has been accused of contributing to mental health issues and is facing lawsuits over the alleged impacts of its technology.


What they’re saying

“It just sent me around on this crazy Ferris wheel of emotion and fear.”

— George Mallon

“Because the answers are so immediate and so personalized, it's even more reinforcing than Googling. This kind of takes it to the next level.”

— Lisa Levine, psychologist specializing in anxiety and OCD


The takeaway

Mallon's experience highlights growing concern about the mental health risks of AI chatbots, raising questions about whether stronger safeguards and regulation are needed to protect vulnerable users from compulsive reassurance-seeking behaviors that can exacerbate conditions like health-related OCD.