Instagram to Alert Parents on Teen Self-Harm Searches

Meta faces trials over addiction and child safety claims as it introduces new parental notification feature.

Published on Mar. 3, 2026

Instagram announced it will begin notifying parents when their teenage children repeatedly search for terms associated with suicide or self-harm on the platform. The alerts will only be sent to parents and teens enrolled in Instagram's parental supervision tools. The company said this is the first time it will proactively inform parents about their child's search activity for harmful material. The rollout comes as Meta faces two trials in the United States over claims its platforms deliberately addict and harm minors.

Why it matters

The new alerts are part of Instagram's efforts to address concerns about the platform's impact on teen mental health and safety. However, some advocacy groups have criticized the measure, arguing that Instagram should focus on redesigning its systems to be age-appropriate rather than shifting responsibility to parents. The trials Meta faces in the US also highlight the broader legal and regulatory scrutiny the company is under regarding the potential harms of its platforms, especially for young users.

The details

The alerts will be triggered when a teen repeatedly attempts to search for terms related to suicide or self-harm within a short period of time. When the threshold is reached, parents will receive a notification through email, text message or WhatsApp, as well as an in-app alert on Instagram. The company said it analyzed Instagram search behavior and consulted experts before selecting the threshold for alerts, aiming to avoid sending unnecessary notifications. Under existing policy, searches clearly associated with suicide or self-harm are already blocked, and users are directed to support resources.

  • The new alerts will become available next week in the United States, the United Kingdom, Australia and Canada.
  • Meta is also developing similar parental notifications related to artificial intelligence interactions, which it says it will share more about in the coming months.

The players

Meta

The parent company of Instagram, which is facing trials in the United States over claims its platforms deliberately addict and harm minors.

Adam Mosseri

The chief executive of Instagram, who has testified that while social media could cause some harm, the company carefully tests features used by young people before releasing them.

Mark Zuckerberg

The CEO of Meta, who has disputed claims that the company's platforms cause addiction.

Molly Rose Foundation

A UK-based charity established by the family of Molly Russell, who took her own life in 2017 at age 14 after viewing self-harm and suicide content on platforms including Instagram.

Fairplay

A nonprofit organization that has criticized Instagram for introducing the new alerts while Meta stands trial over claims of addicting and harming children.

What they’re saying

“Our goal is to empower parents to step in if their teen's searches suggest they may need support. We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall.”

— Meta (Blog post)

“Instagram is introducing the feature while on trial in two states for addicting and harming kids. They're shifting responsibility to parents instead of addressing flaws in how their algorithms and platforms are designed.”

— Josh Golin, Executive Director, Fairplay

“The announcement is fraught with risk and forced disclosures could do more harm than good. While every parent would want to know if their child is struggling, the notifications could leave parents panicked and unprepared for difficult conversations.”

— Andy Burrows, Chief Executive, Molly Rose Foundation

What’s next

Meta said it will continue monitoring the new alert system and gathering feedback to improve it. The company is also developing similar parental notifications related to AI interactions, which it plans to share more about in the coming months.

The takeaway

Instagram's new alerts mark a step toward addressing concerns about the platform's effect on teen mental health, but advocacy groups counter that Meta should redesign its systems to be age-appropriate rather than shift responsibility to parents. The ongoing US trials underscore the legal and regulatory scrutiny the company faces over the potential harms of its platforms, especially for young users.