- Today
- Holidays
- Birthdays
- Reminders
- Cities
- Atlanta
- Austin
- Baltimore
- Berwyn
- Beverly Hills
- Birmingham
- Boston
- Brooklyn
- Buffalo
- Charlotte
- Chicago
- Cincinnati
- Cleveland
- Columbus
- Dallas
- Denver
- Detroit
- Fort Worth
- Houston
- Indianapolis
- Knoxville
- Las Vegas
- Los Angeles
- Louisville
- Madison
- Memphis
- Miami
- Milwaukee
- Minneapolis
- Nashville
- New Orleans
- New York
- Omaha
- Orlando
- Philadelphia
- Phoenix
- Pittsburgh
- Portland
- Raleigh
- Richmond
- Rutherford
- Sacramento
- Salt Lake City
- San Antonio
- San Diego
- San Francisco
- San Jose
- Seattle
- Tampa
- Tucson
- Washington
Lawsuit Alleges xAI's Grok AI Generated Child Sexual Abuse Material
Three Tennessee-based plaintiffs sue Elon Musk's AI company over its image generation model
Mar. 19, 2026 at 4:03pm
Got story updates? Submit your updates here. ›
A proposed class-action lawsuit filed in federal court in San Jose, California, alleges that Elon Musk's artificial intelligence company, xAI, recklessly designed its Grok AI model to enable the creation and distribution of child sexual abuse material (CSAM). The lawsuit claims that Grok's image generation feature was used to produce sexualized images of minors, which were then shared on a Discord server. The plaintiffs, identified as Jane Does, are suing xAI for the creation and distribution of the alleged CSAM and are demanding a jury trial.
Why it matters
This case highlights the potential risks and harms associated with the development of powerful AI systems, especially when it comes to the exploitation of minors. The lawsuit raises concerns about the responsibility of tech companies to ensure their products and services are not being used for illegal and abusive purposes.
The details
According to the 44-page complaint, a teenager eventually obtained access to a Discord server where they found images and videos of at least 18 other minors, many of whom they recognized from their school. The lawsuit alleges that xAI recklessly designed Grok to enable such abuse, and then, amid public outcry, restricted the technology to paid subscribers and third-party companies rather than fix the problem. The lawsuit claims that Grok was used to generate sexualized images of minors, including one image that allegedly showed six young girls wearing micro bikinis and remained publicly available on X as of January 15, 2026.
- The new allegations were detailed in a complaint filed on Monday, February 12, 2026, in federal court in San Jose, California.
- The second plaintiff in the lawsuit said she was informed by law enforcement on February 12, 2026, that she had also been targeted.
- The alleged perpetrator was arrested in December 2025.
The players
xAI
Elon Musk's artificial intelligence company that developed the Grok AI model.
Elon Musk
The founder of xAI.
Jane Does
Three Tennessee-based plaintiffs who are suing xAI over the Grok AI model.
Vanessa Baehr-Jones
An attorney from Baehr-Jones Law, one of the firms representing the plaintiffs.
Annika K. Martin
An attorney from Lieff Cabraser, one of the firms representing the plaintiffs.
What they’re saying
“No one should have to live with the fear that these survivors now carry with them, but I am inspired by their strength and clarity of purpose in bringing this lawsuit on behalf of themselves and other minors in the Class.”
— Vanessa Baehr-Jones, Attorney, Baehr-Jones Law
“These are children whose school photographs and family pictures were turned into child sexual abuse material by a billion-dollar company's AI tool and then traded among predators. Elon Musk and xAI deliberately designed Grok to produce sexually explicit content for financial gain, with no regard for the children and adults who would be harmed by it.”
— Annika K. Martin, Attorney, Lieff Cabraser
What’s next
The judge in the case will decide whether to allow the lawsuit to proceed as a class action.
The takeaway
This case highlights the urgent need for stronger regulations and oversight of AI systems, particularly when it comes to the protection of minors and the prevention of the creation and distribution of child sexual abuse material. Tech companies must be held accountable for the potential harms caused by their products and services.




