AI is stress-testing hiring — and hurting trust

Summary

Alison Lands (Jobs for the Future) argues that AI has sped up hiring but chipped away at trust and clarity. Generative AI on the candidate side has increased application volumes, while employers use AI to triage, screen and predict fit — creating an “AI arms race” that moves fast but lacks consistent standards.

Key research from the University of Phoenix shows growing adoption of AI in hiring and widespread concern: many stakeholders feel AI affects objectivity, and only a minority audit tools for fairness. Organisations are shifting towards skills-based hiring in principle, but many lack standardised practices, training and interviewer readiness. Lands recommends operationalising skills-based hiring, auditing AI for fairness, maintaining human oversight and establishing ongoing governance to restore trust and make AI useful.

Key Points

  • AI has increased both candidate output (more applications via generative tools) and employer reliance on automated triage, accelerating hiring pipelines.
  • University of Phoenix research: significant concern about AI’s objectivity — many candidates and hiring stakeholders doubt fairness.
  • Only around a third of organisations using AI audit those tools for fairness, creating a mismatch between risk and responsibility.
  • Most employers say they’re shifting toward skills-based hiring (about 82%), but many lack standardised processes and assessor training.
  • When AI adoption outpaces governance and training, bias and inconsistency can scale rather than shrink.
  • Lands’ remedy: operationalise skills-based hiring end-to-end, audit and validate AI for fairness, keep human oversight and build continuous governance and feedback loops.

Why should I read this?

Quick and practical: if you run or design hiring, this explains why slapping AI onto old processes can make things worse. It tells you what to fix first (skills standards, training, audits) so your shiny tech doesn’t tank candidate trust. Short version — don’t blame the tool, fix the system.

Author style

Punchy. Alison Lands calls out an urgent mismatch: AI is fast, hiring systems are not. If your organisation is experimenting with AI for recruitment, this piece is a wake-up call — it pushes leaders to treat fairness, training and governance as non-negotiable or risk losing trust at scale.

Context and relevance

As AI adoption in recruitment becomes widespread, this article is relevant to CHROs, talent leaders and HR technologists wrestling with volume, fairness and legal risk. It connects two major trends: the rise of generative AI on the candidate side and the rush to automated hiring tools on the employer side. The takeaway aligns with broader industry moves toward skills-based hiring and the regulatory scrutiny of AI-driven decisions — a timely reminder that governance, transparency and human oversight matter more than ever.

Source

Source: https://www.hrdive.com/news/ai-is-stress-testing-hiring-trust-is-failing/816893/