Workday's AI screening tool faces class action for age discrimination; class conditionally certified
A federal judge conditionally certified a class action against Workday alleging its AI-powered applicant screening tools systematically discriminated against job seekers over 40 in violation of the ADEA. Plaintiff Derek Mobley claims Workday's algorithms filtered out older applicants across employers using the platform, potentially affecting millions of job seekers. Workday processed over 1.1 billion applications in fiscal year 2025 alone. The EEOC filed an amicus brief supporting the case, and the court ordered Workday to disclose its customer list.
Workday is an enterprise software company that provides human resources and financial management tools to more than 11,000 organizations worldwide. Among its offerings is an AI-powered applicant screening system that evaluates job applications and generates recommendations about which candidates should advance in the hiring process. The system can score, sort, rank, and screen applicants without direct human involvement at the screening stage. In fiscal year 2025, Workday processed over 1.1 billion job applications through its platform.
In February 2023, Derek Mobley filed a lawsuit against Workday in the US District Court for the Northern District of California. Mobley, who is Black, over 40, and disabled, alleged that since 2017 he had applied for more than 100 jobs at companies using Workday's screening tools and was rejected every single time. His complaint alleged discrimination on the basis of race (under Title VII and Section 1981), age (under the Age Discrimination in Employment Act, or ADEA), and disability (under the ADA).
The lawsuit did not name any individual employer. Mobley's theory was that Workday itself, as the operator of the screening system that made or influenced the rejection decisions, could be held liable as an agent of the employers who subscribed to its platform. This was a novel legal argument. Traditionally, employment discrimination claims are brought against employers. Treating a software vendor as an agent subject to the same liability had not been tested at this scale.
The legal path
Workday responded with a motion to dismiss, arguing that it was not an employer and did not make employment decisions. In January 2024, Judge Rita F. Lin dismissed the case - not because Workday's software did not discriminate, but because the original complaint did not offer enough evidence to classify Workday as an "employment agency" under the relevant statutes. The court also noted gaps in Mobley's claims about what personal details he had provided and his qualifications for the positions he sought.
In February 2024, Mobley filed an amended complaint, addressing the court's concerns and building a more detailed case for treating Workday as an agent of the employers whose hiring decisions it influenced. In April 2024, the Equal Employment Opportunity Commission filed an amicus brief supporting the plaintiff, arguing that Workday's software could enable discriminatory practices by allowing employers to exclude applicants from protected categories.
The EEOC's involvement was notable. The agency does not routinely file amicus briefs in private employment litigation. Its decision to weigh in signaled that federal regulators viewed the question of AI vendor liability in hiring as significant enough to warrant their input. The EEOC argued that Workday should face claims of bias in its algorithmic screening system and that the company's tools could violate Title VII of the Civil Rights Act.
In July 2024, Judge Lin denied Workday's second motion to dismiss, allowing the claims to proceed. The court ruled that Workday could potentially be held liable as an "agent" of the employers who rejected Mobley's applications. With that ruling, the case moved past threshold questions about whether Workday could be sued at all and into substantive territory about what its AI actually did.
Conditional certification
On May 16, 2025, Judge Lin granted Mobley's motion for preliminary certification and allowed the case to proceed as a nationwide collective action on the ADEA age discrimination claim. The certified class covers all individuals aged 40 and over who applied for jobs through Workday's platform from September 24, 2020 through the present and did not receive an employment recommendation from the system.
Four additional plaintiffs joined Mobley in the motion. Each alleged having applied for hundreds of jobs through Workday's system and being rejected nearly every time without an interview, a pattern they attribute to their age.
Judge Lin's ruling focused on the central question: whether Workday's AI recommendation system has a disparate impact on applicants over 40. She held that at the preliminary certification stage, where the evidentiary burden is low, Mobley had "substantially alleged the existence of a unified policy" - specifically, the use of Workday's AI system to screen applicants. The fact that different employers used the system for different positions across different industries did not, in the court's view, prevent the class from being similarly situated. The algorithm was the common thread.
This reasoning has broad implications. If courts treat a shared screening algorithm as a "unified policy" for class certification purposes, it becomes significantly easier to bring collective action claims against AI vendors whose tools are used by many employers. Workday's scale works against it here: the more employers use the system, the larger the potential class.
Workday raised practical objections about the size of the proposed class, arguing that it could include millions of applicants and that issuing class notice would be impractical. Judge Lin responded that "allegedly widespread discrimination is not a basis for denying notice." She further suggested that if traditional notice methods were insufficient, the court could consider issuing notice via social media or through Workday's own platforms. This opened a door that future plaintiffs in similar cases may walk through.
The agent theory
The most consequential legal question in the case is whether an AI software vendor can be treated as an "agent" of the employers who use its tools, making it jointly liable for discriminatory outcomes. Traditionally, companies like Workday would argue they merely provide tools and that employers bear responsibility for how those tools are used.
Mobley's argument inverts this. He contends that Workday's AI does not just provide tools - it makes decisions. The system scores and ranks applicants, generates recommendations, and in many cases determines which candidates an employer ever sees. If the system systematically filters out older applicants before a human hiring manager reviews any applications, the discrimination happens at the software level, not the employer level.
Judge Lin's rulings have accepted this framing as at least plausible at the current stage of litigation. The court has not ruled that Workday's software actually discriminates - that is what the trial will determine. But it has ruled that the legal theory is viable: a software vendor whose product makes or heavily influences employment decisions can be held to the same anti-discrimination standards as an employer.
Customer disclosure
In a later ruling, the court ordered Workday to disclose its customer list - specifically, which employers used HiredScore, an AI recruiting tool that Workday acquired and integrated into its platform. This disclosure is significant because it identifies which employers' applicants may be members of the class and potentially exposes those employers to their own future litigation.
From Workday's perspective, revealing its customer list to opposing counsel in a discrimination case is commercially sensitive. From the plaintiffs' perspective, the customer list is essential for identifying class members and issuing notice. The court sided with the plaintiffs.
What the case means
Mobley v. Workday is the first federal collective action to treat an AI hiring vendor as an employment agent under the ADEA. That alone gives it outsized importance. If the case succeeds at trial, it establishes that companies building AI recruitment tools bear direct legal responsibility for discriminatory outcomes produced by those tools, regardless of whether the company using the tool is the one making the final hiring decision.
The case also tests whether disparate impact analysis - the legal framework for identifying discrimination through statistical patterns rather than intentional bias - applies to AI systems at the vendor level. AI screening tools are trained on historical hiring data, which often reflects existing patterns of discrimination. If a model learns from data where younger candidates were historically preferred, it may replicate that preference in its recommendations without anyone programming it to do so. The question is who bears legal responsibility for that learned bias.
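Disparate impact in this context is typically measured statistically rather than by intent. One common benchmark is the EEOC's "four-fifths rule": if a protected group's selection rate falls below 80% of the most-favored group's rate, that disparity is treated as evidence of adverse impact. The sketch below illustrates the arithmetic with entirely hypothetical numbers; it is not drawn from the case record and does not represent Workday's data.

```python
# Illustrative sketch of the EEOC four-fifths rule for adverse impact.
# All applicant counts below are hypothetical, chosen only to show the math.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who advanced past screening."""
    return selected / applicants

def adverse_impact_ratio(protected_rate: float, comparison_rate: float) -> float:
    """Ratio of the protected group's selection rate to the comparison group's."""
    return protected_rate / comparison_rate

# Hypothetical screening outcomes for applicants under and over 40.
under_40_rate = selection_rate(selected=300, applicants=1000)  # 0.30
over_40_rate = selection_rate(selected=150, applicants=1000)   # 0.15

ratio = adverse_impact_ratio(over_40_rate, under_40_rate)      # 0.50
print(f"Adverse impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Below the four-fifths threshold: evidence of potential disparate impact")
else:
    print("Within the four-fifths threshold")
```

In litigation, a ratio like this is a starting point, not a verdict: plaintiffs would still need to tie the statistical disparity to a specific practice (here, the screening algorithm), and the defendant can respond that the practice is job-related and consistent with business necessity.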
For the more than 11,000 organizations using Workday, the case creates immediate uncertainty. If Workday's AI tools are found to discriminate, every employer that relied on those tools for hiring decisions faces potential exposure. Legal commentators have noted that while Mobley sued the vendor, employers are "likely next in line."
The case has prompted recommendations from employment law firms that companies audit their AI hiring vendors, test screening tools for potential bias, and document their oversight of automated hiring processes. The EEOC's amicus brief reinforces that federal enforcement agencies are treating algorithmic discrimination as a priority.
By February 2026, the court had authorized formal notice to potential class members, with a deadline to opt in by March 7, 2026. The class potentially includes millions of applicants aged 40 and over who applied for jobs through Workday's platform over a five-year period. The trial, when it comes, will determine whether the algorithm that processed 1.1 billion applications in a single year was quietly excluding older workers from jobs they were qualified to hold.