Experts Warn About Relying Too Heavily on AI Chatbots for Health Advice

Tech companies are pushing new health chatbots, but doctors say they should not replace professional medical care.

Published on Mar. 2, 2026

As AI chatbots become a more common way to answer health questions, experts are cautioning users to approach them with "a degree of healthy skepticism." While the latest chatbots can provide personalized information based on a user's medical history, they are not a substitute for consulting a doctor, especially for serious or urgent health concerns. Experts recommend using chatbots as a supplemental resource, not the sole basis for major medical decisions.

Why it matters

The rise of AI-powered health chatbots raises concerns about patient privacy, the accuracy of medical advice, and over-reliance on technology instead of professional care. As more people turn to these tools, it's important for users to understand their limitations and risks.

The details

Tech companies like OpenAI and Anthropic have introduced new health-focused chatbots that can analyze users' medical records, wellness app data, and wearable device information to provide personalized health advice. However, experts caution that these programs are not a replacement for seeing a doctor. The chatbots can sometimes provide inaccurate or incomplete information, and anything shared with them is not protected by medical privacy laws. Independent testing has found that while the chatbots excel at identifying conditions in written scenarios, they struggle more in real-time interactions with people.

  • In January 2026, OpenAI introduced ChatGPT Health.
  • In 2024, a study by Oxford University found communication problems between people and AI chatbots in health scenarios.

The players

OpenAI

An artificial intelligence research company that has introduced ChatGPT Health, a new version of its chatbot designed to provide health and medical advice.

Anthropic

An AI company that offers similar health-focused features as part of its Claude chatbot.

Dr. Robert Wachter

A medical technology expert at the University of California, San Francisco who sees potential benefits in using AI chatbots responsibly and recommends consulting multiple chatbots for a second opinion.

Dr. Lloyd Minor

The dean of Stanford University's medical school, who cautions that users should never rely solely on an AI chatbot for major medical decisions and must understand the differences in privacy standards compared to traditional healthcare providers.

Adam Mahdi

The lead author of a 2024 Oxford University study that found communication problems between people and AI chatbots in health scenarios, despite the chatbots' ability to correctly identify conditions in written form.

What they’re saying

“The alternative often is nothing, or the patient winging it. And so I think that if you use these tools responsibly, I think you can get useful information.”

— Dr. Robert Wachter, Medical technology expert, University of California, San Francisco

“If you're talking about a major medical decision, or even a smaller decision about your health, you should never be relying just on what you're getting out of a large language model.”

— Dr. Lloyd Minor, Dean, Stanford University Medical School

“When someone is uploading their medical chart into a large language model, that is very different than handing it to a new doctor. Consumers need to understand that there are completely different privacy standards.”

— Dr. Lloyd Minor, Dean, Stanford University Medical School

What’s next

Experts recommend that users approach AI chatbots with caution, use them as a supplemental resource rather than a replacement for professional medical care, and consult multiple chatbots to get a second opinion on health-related questions.

The takeaway

While AI-powered health chatbots offer the potential for more personalized and accessible medical advice, they should not be viewed as a substitute for consulting a qualified healthcare provider, especially for serious or urgent health concerns. Users must understand the limitations and privacy risks associated with these tools.