Applying for a job in 2025 sometimes feels like yelling into the void. But now we know why: the void may be Workday’s AI, and it might be ghosting you for being too seasoned. If you’re over 40, have a solid resume, and still find yourself being passed over for jobs you could do in your sleep—well, you may have just met your match: an algorithm that thinks you peaked in 2002.
So how much can actually go wrong behind those automated rejections? Apparently, quite a bit.
A recent federal court decision just greenlit a discrimination lawsuit against Workday, alleging that its AI-powered hiring tools disproportionately reject applicants based on age, race, and disability. The plaintiff, Derek Mobley, says he applied to over 100 jobs through Workday-powered systems and didn't get a single interview. He also happens to be Black, over 40, and managing anxiety and depression, three characteristics that, he claims, the algorithm was all too quick to hold against him.
Now, it’s one thing to be told you're "overqualified." It's another to be screened out before a human even lays eyes on your resume because a machine decided your birth year made you a risk.
The case, Mobley v. Workday, Inc., is now moving forward as a collective action, potentially opening the door to thousands of other applicants who’ve been filtered out without explanation since 2020. The legal argument? That Workday's algorithm acts as an agent of the employers using it—meaning the software vendor can be held directly liable for the alleged bias.
HR professionals once promised AI would reduce bias. What we may be getting instead is bias at scale, just faster and with a nicer dashboard.
Workday’s defense is that it just provides the software, and the employers set the rules. But the court didn't buy that pass-the-buck argument. And rightly so—if you're providing the algorithm that does the filtering, you can't exactly claim you're just the pizza delivery guy when it shows up full of anchovies no one asked for.
Here’s the real problem: AI in hiring decisions isn’t just automating resume sorting; it’s potentially codifying the same old biases in a slick, high-tech wrapper. Worse, these systems often lack transparency, so rejected applicants never really know why they were passed over. It’s like being ghosted by someone you've never met.
Government agencies love efficiency and compliance—but if they’re using AI screening tools like Workday’s, they’d better be prepared for some FOIA requests and EEOC complaints. The public sector is held to high standards for equity and transparency, and lawsuits like this one put a bullseye on any HR department relying blindly on automation.
Hospitals are already understaffed, and hiring biases in AI tools could mean missing out on qualified, experienced candidates. The last thing you want is an ER where no one’s over 35 and everyone’s Googling “how to set a broken bone.”
In financial services, an industry built on regulation and risk management, biased AI could trigger not just lawsuits but regulatory investigations. HR tech isn't exempt from the compliance frameworks banks live under.
Tech companies walk a fine line. They want to push AI innovation but can’t afford to look hypocritical on DEI. Using biased hiring software while championing diversity is like selling umbrellas in a thunderstorm and claiming you just “distribute them.”
Let’s be clear: AI isn’t the enemy. Poorly designed and unaccountable AI is. Employers and vendors alike need to step up.
Audit your hiring tools. If the AI’s only consistent pattern is ghosting Black, older, or disabled applicants, it’s not "smart"—it’s a legal liability.
Keep humans in the loop. Algorithms are great at sorting data, but humans are still better at judging people. Don't hand hiring over to the bots.
Be transparent. If AI is part of your process, disclose it. Let candidates know how it works and what they can do if they believe they were unfairly screened out.
Demand vendor accountability. If you're using Workday or any similar tool, push for insight into how their models were built. "Trust us" doesn't cut it anymore.
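If you're not sure where an audit starts, one concrete first check is the EEOC's "four-fifths rule": if any group's selection rate falls below 80% of the highest group's rate, the tool deserves a closer look. Here's a minimal sketch in Python; the applicant numbers and function names are invented for illustration and have nothing to do with Workday's actual models:

```python
# Hypothetical adverse-impact check using the EEOC's four-fifths rule.
# All numbers below are invented for illustration.

def selection_rates(outcomes):
    """outcomes: {group: (selected, total_applicants)} -> {group: rate}"""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times
    the best-performing group's rate, with their impact ratios."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: round(r / best, 2) for g, r in rates.items() if r / best < threshold}

# Invented example: interview rates by age band from an ATS export.
screened = {
    "under_40": (120, 400),    # 30% advance to interview
    "40_and_over": (30, 300),  # 10% advance to interview
}

print(adverse_impact(screened))  # {'40_and_over': 0.33}
```

A 0.33 ratio is well under the 0.8 guideline, which doesn't prove discrimination by itself, but it's exactly the kind of pattern a lawyer, or the EEOC, will ask you to explain.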
This lawsuit against Workday isn’t just a legal test—it’s a cultural reckoning. It forces us to ask whether AI in hiring is helping us find the best people, or just scaling up our worst instincts.
Because if experience, resilience, and a 20-year track record are being used against you by a machine, then the system isn’t just broken—it’s been programmed that way.
And for anyone over 40 wondering why their resume keeps disappearing into the digital ether: it’s not you. It’s the algorithm.
Want help evaluating your own AI and hiring tools for bias and compliance? Let’s talk—before the lawyers do.