New Mexico Jury Finds Meta Violated Consumer Protection Law

Landmark decision rules social media giant prioritized profits over child safety

Mar. 25, 2026 at 2:18am

A New Mexico jury found that social media conglomerate Meta, the parent company of Instagram, Facebook, and WhatsApp, violated the state's consumer protection law by prioritizing profits over the safety of children. The jury determined Meta hid what it knew about the dangers of child sexual exploitation on its platforms and their negative impacts on children's mental health.

Why it matters

This landmark decision is one of the first trials involving social media platforms and their impacts on children. It could set a precedent for future lawsuits against tech companies over child safety and mental health issues on their platforms.

The details

After a nearly seven-week trial, the jury sided with state prosecutors who argued that Meta violated parts of New Mexico's Unfair Practices Act. The jury found that Meta made false or misleading statements and engaged in "unconscionable" trade practices that took advantage of the vulnerabilities and inexperience of children. Jurors determined there were thousands of violations, each counting separately toward a potential penalty of $375 million.

  • The jury reached its verdict on March 24, 2026.

The players

Meta

The social media conglomerate that owns Instagram, Facebook, and WhatsApp.

New Mexico State Prosecutors

The state prosecutors who brought the case against Meta, arguing the company prioritized profits over child safety.


What’s next

The judge will now determine the final penalty Meta must pay for the thousands of violations the jury found.

The takeaway

This case highlights the growing legal and regulatory scrutiny social media platforms face over their impacts on children's mental health and safety. It could pave the way for more lawsuits and stricter oversight of tech companies' practices when it comes to protecting young users.