About 12% of U.S. Teens Turn to AI for Emotional Support

Mental health professionals warn that general-purpose chatbots are not designed for this use, raising concerns about potential isolating effects.

Published on Feb. 25, 2026

According to a new Pew Research Center report, about 12% of U.S. teens are using AI chatbots like ChatGPT, Claude, and Grok for emotional support or advice, in addition to more common uses like searching for information and getting help with schoolwork. While some teens may find solace in talking to chatbots, mental health experts are wary of the potential isolating effects, as these general-purpose tools are not designed for such sensitive uses.

Why it matters

The growing reliance on AI chatbots for emotional support among teenagers raises concerns about the potential psychological impacts, as these tools are not equipped to provide the type of nuanced, empathetic support that mental health professionals can offer. This trend highlights the need for greater awareness and guidance around the appropriate and safe use of AI technology, especially when it comes to vulnerable populations like young people.

The details

The Pew survey found that while 57% of U.S. teens use AI to search for information and 54% use it to get help with schoolwork, 16% also use it for casual conversation and 12% use it for emotional support or advice. The survey also revealed a gap between teens' behavior and parental attitudes: only 18% of parents approve of their teens using AI chatbots for emotional support. Some tech companies have responded — Character.AI, for example, disabled its chatbot experience for users under 18 following public backlash and lawsuits related to teen suicides linked to prolonged chatbot interactions.

  • The Pew Research Center report was published on February 25, 2026.

The players

Pew Research Center

A nonpartisan fact tank that provides information on social issues, public opinion, and demographic trends shaping the United States and the world.

Dr. Nick Haber

A Stanford professor researching the therapeutic potential of large language models (LLMs).

Character.AI

A chatbot company that has disabled the chatbot experience for users under the age of 18 following public outcry and lawsuits related to teen suicides.

OpenAI

An artificial intelligence company that retired its notably sycophantic GPT-4o model, a decision that sparked backlash from people who had come to rely on the model for emotional support.


What they’re saying

“We are social creatures, and there's certainly a challenge that these systems can be isolating. There are a lot of instances where people can engage with these tools and then can become not grounded to the outside world of facts, and not grounded in connection to the interpersonal, which can lead to pretty isolating — if not worse — effects.”

— Dr. Nick Haber, Stanford professor (TechCrunch)

The takeaway

General-purpose chatbots were not built to provide the nuanced, empathetic support that mental health professionals offer, yet a meaningful share of teens now turn to them for exactly that. The trend underscores the need for clear guidance on safe AI use for young people — and for vigilance about the isolating effects experts warn these tools can have.