Recently, a federal court in the Northern District of California issued an important ruling in the closely followed Mobley v. Workday putative class action lawsuit, which alleges that Workday, a cloud-based software vendor specializing in financial management and human capital management, violated federal discrimination laws. The plaintiffs claim Workday’s AI hiring platform screens out applicants age 40 and over in violation of the Age Discrimination in Employment Act (“ADEA”).
In this case, the plaintiff and four other individuals over 40 allege that Workday’s AI-based recommendation system systematically disadvantaged older job applicants by scoring, sorting, and screening out their applications. According to the plaintiffs, Workday’s tools embed biases from training data and employer preferences, reducing the likelihood that applicants in protected classes advance past the initial screening stage. The plaintiffs allege they collectively submitted hundreds of applications through Workday’s platform and were consistently rejected, in many cases receiving automated rejection notices within minutes despite meeting the minimum qualifications.
In the decision, the Court rejected the plaintiffs’ theory that Workday acted as an “employment agency” under federal law because it did not procure employees for employers as defined in the statute. However, the Court accepted the plaintiffs’ “agent” theory of liability, finding that the First Amended Complaint plausibly alleged that employers delegated traditional hiring functions to Workday. The Court noted that “Workday does qualify as an agent because its tools are alleged to perform a traditional hiring function of rejecting candidates at the screening stage and recommending who to advance to subsequent stages, through the use of artificial intelligence and machine learning.”
Although Workday is the named defendant in this case, the decision highlights a key risk not before the court: the extent to which employers may face liability for relying on or using an AI vendor’s software tools and programs. The impact of this case could be far-reaching because over 11,000 employers use Workday, and more than 1.1 billion applications were rejected using Workday’s software tools during the relevant time period. As discovery proceeds, the employers using Workday’s AI software will likely be identified and could face liability of their own.
Practical Takeaways for Employers
- Vendor Accountability: Using a third-party platform does not necessarily insulate an employer from discrimination claims. Employers remain responsible for ensuring compliance with state and federal law, even though vendors may also face direct liability. Pay careful attention to all agreements with the vendor.
- Audit Hiring Tools: Review the AI or algorithmic tools your vendors use, and make sure you understand how applicants are scored, ranked, or filtered (see the illustrative sketch after this list).
- Maintain Human Oversight: Ensure automated recommendations supplement, rather than replace, meaningful human review in all hiring decisions.
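As a minimal sketch of what auditing a screening tool might look like in practice, the snippet below computes an adverse-impact ratio comparing pass-through rates for applicants age 40 and over against younger applicants, using the four-fifths rule as a common screening threshold. It assumes you can export applicant-level ages and screening outcomes from your vendor; the column names and the adverse_impact_ratio helper are hypothetical placeholders for illustration, not part of any vendor’s product.

```python
# Illustrative sketch only: a simple adverse-impact check on screening outcomes.
# Assumes applicant-level data can be exported from the vendor; the column
# names ("age", "advanced") and this helper are hypothetical placeholders.
import pandas as pd

def adverse_impact_ratio(applicants: pd.DataFrame, protected_age: int = 40) -> float:
    """Ratio of the pass-through rate for applicants age 40+ to the rate for
    applicants under 40; the four-fifths rule flags ratios below 0.8."""
    protected = applicants[applicants["age"] >= protected_age]
    comparison = applicants[applicants["age"] < protected_age]
    protected_rate = protected["advanced"].mean()    # share of 40+ applicants advanced
    comparison_rate = comparison["advanced"].mean()  # share of under-40 applicants advanced
    return protected_rate / comparison_rate

# Example with made-up data: 1 = advanced past screening, 0 = rejected
data = pd.DataFrame({
    "age":      [25, 31, 38, 29, 44, 52, 61, 47],
    "advanced": [ 1,  1,  1,  1,  0,  0,  0,  1],
})
ratio = adverse_impact_ratio(data)
if ratio < 0.8:
    print(f"Potential adverse impact at the screening stage: ratio = {ratio:.2f}")
```

A ratio below 0.8 is not proof of discrimination, but it is a commonly used signal that the screening stage warrants closer review, ideally in coordination with counsel and the vendor.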
This case now moves forward with additional fact-finding. While its outcome remains uncertain, it underscores that AI-driven recruiting is on the radar of the courts.