Federal Court Finds Third-Party Artificial Intelligence Software Companies May be Liable for Discrimination in Employment Decisions

The offices of Workday, a human capital vendor that uses artificial intelligence.

Kelsey Speyer

On July 12, 2024, the United States District Court for the Northern District of California held that third-party software companies can be liable under federal anti-discrimination laws for discrimination in the workplace that results from artificial intelligence, rather than human decision-making. In Mobley v. Workday, Inc., the California federal court determined that software vendor Workday can be liable under Title VII, the Age Discrimination in Employment Act of 1967 (“ADEA”), and the ADA Amendments Act of 2008 (“ADA”) when employers delegate to Workday the traditional employment function of evaluating and screening job candidates, even though Workday used artificial intelligence—rather than humans—in making those decisions.

Workday is a software company that provides human resource services to employers, including the collection, processing, and screening of job applicants. Workday’s platform incorporates AI and machine learning to determine, without human involvement, whether a candidate’s application will be rejected or move on to the next steps of the recruiting process. Although the court did not discuss Workday’s algorithm in depth, it did acknowledge that Mr. Mobley’s allegations that Workday’s tools are trained on biased data were supported by significant literature about “bias in data models and algorithms.”

Derek Mobley, the plaintiff, is a Black man, over 40 years old, who suffers from anxiety and depression. Since 2017, Mr. Mobley has applied to over 100 positions with companies that used Workday’s applicant screening and evaluation tools. Despite being qualified for the jobs to which he applied, Workday’s platform screened Mr. Mobley out and rejected him from every one of those openings.

The court recognized that federal anti-discrimination laws “prohibit discrimination not just by employers themselves but also by agents of those employers”—those to whom an employer delegates tasks typically performed by an employer, such as hiring, firing, and discipline. The court emphasized that agency liability is a critical part of the remedial schemes of these laws; otherwise, discrimination could continue unabated simply because an employer delegates its duties to a third party. The court further noted that federal anti-discrimination statutes do not distinguish between humans versus automated tools (like Workday’s AI platform) when employers delegate their functions. When an employer delegates its “traditional functions” to a vendor, and that vendor’s conduct discriminates against an employee or applicant because of their protected status, there is a cognizable claim under federal law. Therefore, the court concluded that Mr. Mobley can proceed with his discrimination claims against Workday.

The court’s ruling in Mobley demonstrates the continued importance of federal anti-discrimination laws in the modern workplace. The decision affirms that the central tenet of these anti-discrimination statutes—equal access to employment opportunities—remains intact, even as employers begin to use AI and outsource to AI vendors functions that were historically performed by humans and human resource departments. The Mobley court provided a clear roadmap for employees to hold employers and third-party agents liable for algorithmic discrimination in the workplace.

It takes courage to fight back against those who discriminate.
Contact us to see how we can help you.