Takeaways
- The new regulations apply to all employers in California and cover any automated decision system, not just advanced "AI" tools but also any system that uses selection criteria for hiring, promotions, or training.
- Employers are prohibited from using automated decision systems (ADS) or selection criteria that result in discrimination based on categories protected under FEHA, and they must provide reasonable accommodations for religious practices and disabilities.
- Employers should consider conducting bias audits of their ADS.
California’s Civil Rights Department finalized regulations to curb the discriminatory impacts of artificial intelligence and automated decision-making in the workplace. The regulations apply to all employers in California and take effect on Oct. 1, 2025.
The regulations define an automated decision system (ADS) as any computational process that makes or assists in making employment decisions, such as hiring, promotions, selection for training programs, or similar activities. The regulations apply beyond “machine learning” artificial intelligence and cover systems that involve the use of “selection criteria.”
Among other uses, businesses may use regulated ADS to:
- Screen resumes for particular terms or patterns;
- Direct job advertisements or recruiting materials to targeted groups;
- Assess applicants’ or employees’ skills through questions, puzzles, games, or challenges; and
- Analyze audio or video recordings to evaluate, categorize, or recommend applicants or employees.
The regulations prohibit employers from using ADS or selection criteria that discriminate against applicants or employees based on protected categories defined under the Fair Employment and Housing Act (FEHA). Employers may also need to provide reasonable accommodations consistent with FEHA’s religious and disability protections.
The regulations emphasize the value of bias audits and similar efforts to avoid unlawful discrimination. In discrimination cases, courts and agencies may consider the quality, scope, recency, and results of any bias testing, as well as the employer's response to it. Conversely, the absence of such evidence may weigh against employers that chose not to evaluate their ADS.
The regulations also impose data collection and retention requirements:
- Employers must preserve ADS-related records, including dataset descriptors, scoring outputs, and audit findings, for four years.
- These records are essential for demonstrating compliance and responding to any regulatory or legal challenges.
Employers can consider the following compliance checklist:
- Audit AI tools used in employee screening, hiring, promotions, selection, and evaluation;
- Ask vendors about their anti-bias testing protocols and data-use practices, and confirm their understanding of ADS-related liability;
- Although not required, adopt bias-testing routines, both pre- and post-deployment, with legal guidance;
- Update recordkeeping policies to securely store ADS-related data for four years;
- Ensure human oversight over AI-facilitated decisions; and
- Train HR and management teams on new definitions and legal responsibilities under FEHA.