Baltimore City sues X's AI platform Grok over sexualized deepfakes

Lawsuit alleges Grok allows users to manipulate images into explicit content, including of minors

Mar. 25, 2026 at 8:39pm

Baltimore City has filed a lawsuit against Grok, X's artificial intelligence platform, alleging that it allows users to manipulate images of real people, including minors, into sexually explicit content. The city says the practice violates its consumer protection laws and at times resembles child abuse.

Why it matters

This lawsuit is one of the first efforts by a city to take legal action against the use of deepfake technology to create non-consensual, sexualized content. It highlights growing concern over the harms of AI-powered image manipulation and calls for stronger regulations to protect individuals, especially minors, from this type of exploitation.

The details

The complaint filed by Baltimore City states that Grok's platform enables users to take images of real people and manipulate them into sexually explicit content. This includes the ability to create deepfakes of minors in pornographic situations, which the city argues violates its consumer protection laws and resembles child abuse.

  • The lawsuit was filed on March 25, 2026.

The players

Baltimore City

The local government of Baltimore, Maryland, which has filed the lawsuit against X's AI platform Grok.

Grok

An artificial intelligence platform owned by X that allegedly allows users to manipulate images into sexualized deepfakes, including of minors.

X

The parent company of the Grok AI platform that is being sued by Baltimore City.


What’s next

The judge presiding over the case will determine whether the lawsuit can proceed and if any immediate injunctive relief is warranted to stop the alleged activities on the Grok platform.

The takeaway

Whatever its outcome, the case could set an early precedent for how local governments use consumer protection laws to hold AI platforms accountable when their tools are used to exploit people's images in non-consensual and sexually explicit ways.