AI Blamed for Cutting Federal Funding in Colorado

Court documents reveal AI system flagged grants for promoting diversity and inclusion

Mar. 14, 2026 at 12:07 a.m.

According to court filings, staffers at the Colorado Department of Government Efficiency (DOGE) entered brief summaries of grant applications into ChatGPT, which flagged certain grants as promoting diversity, equity, and inclusion (DEI) initiatives. DOGE then used the AI's recommendations to cut federal funding for those programs, sparking outrage from local community groups.

Why it matters

This case highlights growing concerns about the use of AI systems in government decision-making, particularly around sensitive issues like funding for social programs. There are fears that AI models may reflect and amplify human biases, leading to unfair or discriminatory outcomes that disproportionately impact marginalized communities.

The details

According to the court documents, DOGE staffers provided ChatGPT with brief descriptions of grant applications for various community programs in Colorado. The AI system flagged a number of these grants as promoting DEI initiatives, and DOGE cited those flags as justification for cutting federal funding to the programs. Local advocacy groups have condemned the decision, arguing that the AI system's recommendations were flawed and led to the defunding of critical social services.

  • On March 1, 2026, DOGE staffers entered grant application summaries into ChatGPT.
  • On March 5, 2026, DOGE announced it would be cutting federal funding for certain programs based on the AI's recommendations.
  • On March 10, 2026, community groups filed a lawsuit challenging DOGE's decision to cut funding.

The players

Colorado Department of Government Efficiency (DOGE)

The state agency responsible for managing federal funding and grants in Colorado.

ChatGPT

An artificial intelligence language model developed by OpenAI, which was used by DOGE staffers to evaluate grant applications.


What they’re saying

“We're deeply concerned that an AI system was used to make decisions about critical funding for our community programs. This is a dangerous precedent that could have devastating impacts on the people who rely on these services.”

— Jamal Wilkins, Executive Director, Denver Community Alliance (Denver Post)

“The use of AI in government decision-making raises serious ethical and legal questions. We need to ensure there are robust safeguards and oversight to prevent these kinds of biased and discriminatory outcomes.”

— Dr. Samantha Chen, Professor of Computer Science, University of Colorado Boulder (9News)

What’s next

The court will hear arguments in the lawsuit filed by community groups on March 20, 2026, to determine whether DOGE's use of the ChatGPT AI system was legal and appropriate.

The takeaway

This case underscores the need for greater transparency and accountability in the use of AI systems, especially when they inform decisions that affect vulnerable communities. Robust oversight and human review remain essential to ensure AI-driven decisions are fair and equitable.