Teenagers Sue Elon Musk's xAI Over Sexually Explicit Images

Lawsuit claims AI-generated content featured real photos of minors in explicit poses

Mar. 20, 2026 at 4:48am

Three high school students in Tennessee have filed a lawsuit against Elon Musk's artificial intelligence company xAI, claiming the company's image generation tools were used to create sexually explicit images of them as minors. The lawsuit seeks class-action status to represent thousands of alleged victims and alleges xAI knew its Grok chatbot could produce such content but released it anyway.

Why it matters

This case highlights growing concern over the misuse of generative AI, particularly the exploitation of minors. It raises questions about AI companies' responsibility to implement robust safeguards and content moderation to prevent such abuses.

The details

According to the lawsuit, the three teenage plaintiffs discovered that real photos of them, including from a homecoming event and a high school yearbook, had been used to generate sexually explicit images. The perpetrator, who was later arrested, had distributed these AI-generated images on social media platforms. The lawsuit claims xAI's Grok chatbot was used to create the explicit content, and that the company was aware of this capability but released the product anyway.

  • In December 2025, one of the plaintiffs was anonymously alerted that sexually explicit images of her were being distributed online.
  • In late December 2025, the perpetrator was arrested and his phone was confiscated, revealing the distribution of explicit images of multiple minors.

The players

xAI

Elon Musk's artificial intelligence company that developed the Grok chatbot, which the lawsuit claims was used to generate sexually explicit images of minors.

Jane Doe 1, Jane Doe 2, and Jane Doe 3

Three high school students in Tennessee who are the plaintiffs in the lawsuit, seeking class-action status to represent thousands of alleged victims.


What’s next

The judge will decide whether to grant the lawsuit class-action status, which could allow it to represent thousands of alleged victims.

The takeaway

Whatever its outcome, the case underscores the urgency of stronger safeguards and content moderation at AI companies to prevent the exploitation of minors, and the lasting trauma that the non-consensual creation and distribution of sexually explicit imagery can inflict on victims.