HHS Is Using AI Tools From Palantir to Target ‘DEI’ and ‘Gender Ideology’ in Grants
Summary
Since March 2025 the US Department of Health and Human Services (HHS) — specifically its Administration for Children and Families (ACF) — has been using AI tools from Palantir and the startup Credal AI to screen grant applications, existing grants and job descriptions for references to DEI (diversity, equity and inclusion) and what the administration calls ‘gender ideology’. The AI generates initial flags; ACF staff make the final determinations. Federal procurement records show contracts and payments to Palantir (millions of dollars) and Credal AI (around $750,000), but the public contract descriptions do not mention the DEI/gender-screening work. The effort implements Executive Orders 14151 and 14168, issued on the administration’s first day, which direct agencies to purge DEI-related language and bar the use of federal funds for it, and to define sex strictly as biological male or female. The reporting places this AI-driven auditing in the wider context of grant freezes, removals of LGBTQ and trans references across agencies and Palantir’s growing federal business, including controversial work for ICE.
Key Points
- HHS/ACF uses Palantir software to create lists of position descriptions that might violate new executive orders on DEI and ‘gender ideology’.
- Credal AI provided a ‘GenAI’ platform that helps review grant submissions and generate initial flags; humans make the final decisions.
- Procurement records show substantial HHS payments to Palantir and to Credal AI, but public contract descriptions omit the DEI/gender-screening detail.
- The work enacts Executive Orders 14151 and 14168, which ban federal promotion of DEI concepts and redefine ‘sex’ as an immutable biological classification.
- The policy and AI screening have coincided with widespread grant freezes/cuts, programme and language removals across federal agencies, and organisational changes to avoid losing funding.
- Palantir’s federal revenue has risen sharply under the current administration; the company also holds major, contested contracts with ICE and other agencies.
- HHS, ACF, Palantir and Credal AI did not respond to requests for comment in the reporting.
Context and relevance
This is part of a broader, rapid shift in how federal agencies use automated tools to enforce political directives. The story connects procurement records, AI auditing and civil-service implementation of controversial executive orders — showing how algorithmic tools can scale policy enforcement across grants and job postings, with tangible effects on funding, research and services for marginalised groups.
Why should I read this?
Because if you care about how AI is being used by government — not just to analyse data but to enforce political priorities that reshape who gets funding and which words are allowed — this is a prime example. It’s quick, worrying and shows the tech+policy pipeline in action: contracts, automated flags, then human enforcement. Worth five minutes to know what’s changing and who’s getting paid to make it happen.