Americans Turn to AI for Quick Health Advice, Raising Concerns

Younger adults and lower-income users are using AI chatbots to bridge healthcare access gaps, but experts warn about privacy and accuracy risks.

Apr. 15, 2026 at 7:33pm

As more Americans turn to AI chatbots for quick health advice, the risks and limitations of relying on these tools for sensitive medical information come into focus. (Photo: Theodore Today)

A recent poll found that about one-quarter of U.S. adults have used an AI tool for health information or advice in the past 30 days, often seeking quick answers before seeing a doctor or to decide whether they need medical attention. While most Americans still rely on healthcare providers, some younger and lower-income individuals are using AI to bridge gaps in access to professional care driven by cost or convenience barriers. However, concerns remain about the accuracy and privacy of health information provided by AI chatbots.

Why it matters

The rise of AI-powered health advice highlights growing challenges in the U.S. healthcare system, as some patients turn to technology to supplement or replace professional medical care. This trend raises questions about the reliability of AI-generated health information and the privacy implications of sharing personal medical data with these tools.

The details

The West Health–Gallup Center on Healthcare in America poll found that roughly one-quarter of U.S. adults had used an AI tool for health information or advice in the past 30 days. Many say they turn to AI chatbots like ChatGPT to get quick answers about symptoms or lab results before deciding if they need to see a doctor. Experts note that these tools provide an 'upgraded' version of online health searches, summarizing information instead of requiring users to comb through multiple web pages.

  • The West Health–Gallup Center on Healthcare in America poll was conducted in late 2025.
  • The KFF poll referenced in the article was conducted in late February 2026.
  • The Pew Research Center survey was conducted in October 2025.

The players

Tiffany Davis

A 42-year-old resident of Mesquite, Texas who uses ChatGPT to get health advice about the weight-loss injections she is taking.

Rakesia Wilson

A 39-year-old assistant principal in Theodore, Alabama who uses AI chatbots like ChatGPT and Microsoft Copilot to research health issues and decide if she needs to take time off for a doctor's appointment.

Karandeep Singh, MD

The chief health AI officer at the University of California (UC) San Diego Health, who says AI tools are an 'upgraded version' of traditional online health searches.

Bobby Mukkamala, MD

An ear, nose, and throat doctor and the president of the American Medical Association, who says AI should be considered a tool and not a stand-in for professional medical care.

Tamara Ruppart

A 47-year-old director in Los Angeles who says she prefers to contact doctors in her husband's family for health advice rather than using AI chatbots, due to concerns about the accuracy of the information.


What they’re saying

“I'll just basically let ChatGPT know my status, how I'm feeling. I use it for anything that I'm experiencing.”

— Tiffany Davis

“It'll let me know if something's serious or not.”

— Tiffany Davis

“I just don't necessarily have the time if it's something that I feel is minor.”

— Rakesia Wilson, assistant principal

“It is an assistant but not an expert, and that's why physicians need to be involved in that care.”

— Bobby Mukkamala, MD, ear, nose, and throat doctor, president of the American Medical Association

“Healthcare is something that's pretty serious. And if it's wrong, you could really hurt yourself.”

— Tamara Ruppart, director

What’s next

Experts and policymakers will likely continue to monitor the use of AI for health advice, weighing the benefits of increased access to information against the risks of inaccurate or unreliable guidance. Potential next steps could include developing guidelines or regulations around the use of AI chatbots for medical purposes, as well as efforts to improve access to affordable, high-quality healthcare to reduce reliance on these tools.

The takeaway

The growing use of AI for health advice highlights both the promise and peril of these technologies. While AI chatbots can provide quick answers and bridge gaps in healthcare access, there are significant concerns about the accuracy and privacy implications of relying on these tools for medical information. As AI becomes more integrated into the healthcare system, ensuring the reliability and responsible use of these technologies will be crucial to protecting patient wellbeing.