HHS Is Using AI Tools From Palantir to Target ‘DEI’ and ‘Gender Ideology’ in Grants
Summary
Since March 2025 the Department of Health and Human Services’ Administration for Children and Families (ACF) has been using AI tools from Palantir and the startup Credal AI to screen and audit grants, grant applications and job descriptions for references to DEI and “gender ideology”. The systems generate initial flags and priorities; ACF staff conduct a final review. Contracts and payments, including more than $35m to Palantir from HHS and roughly $750k to Credal, appear in federal records but do not clearly describe this DEI-targeting work. The effort implements two executive orders from January 2025 (EO 14151 and EO 14168) that restrict DEI programmes and narrow federal definitions of sex and gender. The rollout has coincided with frozen grants, paused research, staff reassignments at agencies and altered missions at nonprofits across the US.
Key Points
- Palantir and Credal AI are being used inside HHS/ACF to flag language related to DEI and “gender ideology” in grants and position descriptions.
- The AI performs an initial review and flags items; ACF staff make the final determination.
- Federal payment records show significant spending on these contractors, but the contract descriptions do not explicitly mention DEI-targeting work.
- The work enacts Executive Orders 14151 and 14168, which ban federal support for DEI programmes and define sex/gender in narrowly biological terms.
- Similar policies have led to paused research, disruption of nearly $3bn in NSF/NIH funding, layoffs and nonprofits stripping DEI language from their materials.
- Palantir’s expanding federal footprint — including large contracts with ICE and other agencies — raises transparency, civil-rights and surveillance concerns.
Why should I read this?
Quick and blunt: a private AI firm is quietly helping the government police what counts as acceptable language and funding. It’s reshaping research, services and jobs with little public disclosure. If you care about who controls public money, civil liberties, or how AI gets used in government, this is worth your two minutes.
Context and relevance
This story sits at the crossroads of AI governance, public contracting and civil-rights policy. It shows how high-level executive orders are being operationalised via commercial AI, highlights gaps in procurement transparency, and illustrates real-world consequences — from frozen grants and paused studies to staff reassignments and altered nonprofit missions. It also links to broader debates over Palantir’s role in immigration and law enforcement, and the wider trend of algorithmic tools enforcing political directives.