California Adopts New Employment AI Regulations Effective October 1, 2025

The California Civil Rights Council (CRC) recently announced that it has finalized regulations that clarify how California’s anti-discrimination laws apply to the use of artificial intelligence (AI) and automated decision systems (ADSs) in employment decision-making (the “Regulations”). The Regulations provide that the use of an ADS (including AI) in making employment decisions can violate California law if such tools discriminate against employees or applicants – either directly or due to disparate impact – on the basis of protected characteristics (including race, age, religious creed, national origin, gender, and disability).

LEGAL FRAMEWORK

Effective October 1, 2025, the Regulations amend the existing regulatory framework applicable to the California Fair Employment and Housing Act (FEHA) and will apply to all employers in California that use “artificial intelligence, machine-learning, algorithms, statistics, and/or other data processing” to facilitate human decision-making with respect to the recruitment, hiring, and promotion of job applicants or employees. In announcing the issuance of the final regulations, the CRC explained that, while “these tools can bring myriad benefits, they can also exacerbate existing biases and contribute to discriminatory outcomes.” As a result, the regulations aim to:

  • Make it clear that the use of an ADS may violate California law if it harms applicants or employees based on protected characteristics, such as gender, race, or disability.
  • Ensure employers and covered entities maintain employment records, including automated decision data, for a minimum of four years.
  • Affirm that ADS assessments, including tests, questions, or puzzle games that elicit information about a disability, may constitute an unlawful medical inquiry.
  • Add definitions for key terms used in the regulations, such as “automated-decision system” and “proxy.”

DEFINITION OF AUTOMATED DECISION SYSTEM AND ARTIFICIAL INTELLIGENCE

The regulations broadly define an ADS as any “computational process that makes a decision or facilitates human decision making regarding an employment benefit” that “may be derived from and/or use artificial intelligence, machine-learning, algorithms, statistics, and/or other data processing techniques.” 2 Cal. Code Regs. § 11008.1(a). The definition of ADS expressly includes AI, which is also defined broadly in the regulations to include “[a] machine-based system that infers, from the input it receives, how to generate outputs,” which can include “predictions, content, recommendations or decisions.” § 11008.1(c).

The regulations provide illustrative examples of the types of tasks an ADS may perform, including:

  • Using computer-based assessments or tests, such as questions, puzzles, games, or other challenges, to (i) make predictive assessments about an applicant or employee; (ii) measure an applicant’s or employee’s skills, dexterity, reaction time, and/or other abilities or characteristics; (iii) measure an applicant’s or employee’s personality traits, aptitude, attitude, and/or “cultural fit;” and/or (iv) screen, evaluate, categorize, and/or recommend applicants or employees;
  • Directing job advertisements or other recruiting materials to targeted groups;
  • Screening resumes for particular terms or patterns;
  • Analyzing facial expressions, word choice, and/or voice in online interviews; or
  • Analyzing employee or applicant data acquired from third parties.

PROHIBITED DISCRIMINATION DUE TO USE OF AN ADS AND REASONABLE ACCOMMODATIONS

The regulations prohibit employers from using an ADS or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against applicants or employees based on protected categories defined under the FEHA. § 11009(f). The term “proxy” is newly defined as “[a] characteristic or category closely correlated with” a protected category under FEHA. § 11008(l). The new regulations further make clear that the use of facially neutral ADS selection tools that have an “adverse impact” on applicants or employees based on a protected characteristic is impermissible under FEHA unless the employer or covered entity can show that the selection practice is “job-related and consistent with business necessity.” § 11017(e). Thus, as with other selection criteria used in making employment decisions, even if an employer does not intentionally use an ADS (including AI) to discriminate among applicants or employees, it can be liable for violating FEHA if its use of the ADS creates a disparate impact.

The regulations caution that the use of an ADS that, for example, measures an applicant’s skill, dexterity, reaction time, and/or other abilities or characteristics may discriminate against individuals with certain disabilities or other protected characteristics. § 11016(c)(5). Similarly, an ADS that, for example, analyzes an applicant’s tone of voice, facial expressions, or other physical characteristics or behavior may discriminate against employees or applicants based on race, national origin, gender, disability, or other protected characteristics. § 11016(d)(1). Accordingly, to avoid unlawful discrimination, employers may need to provide reasonable accommodations to an applicant or employee consistent with the FEHA’s religious creed and disability protections. §§ 11016(c)(5) and (d)(1).

ANTI-BIAS TESTING AS AFFIRMATIVE DEFENSE

The Regulations provide that, to defend against a discrimination claim based on the use of an ADS, employers can show that they performed “anti-bias testing or similar proactive efforts to avoid unlawful discrimination” prior to and after adopting an ADS. § 11009(f). The regulations identify six relevant aspects of such testing, including its quality, efficacy, recency, and scope, as well as the results of the testing or other due diligence and the employer’s response to those results (e.g., whether and how the employer addressed any issues the testing revealed).

EXTENDED RECORD RETENTION REQUIREMENTS

The Regulations also amend the existing record retention rules, requiring employers and covered entities to preserve personnel and other employment records subject to the following requirements:

  • Records must be retained for at least four years from the later of (a) the date the record was made, or (b) the date of the personnel action – an increase from the previous requirement to preserve records for two years;
  • Records subject to this requirement include selection criteria, automated decision system data, applications, personnel records, membership records, employment referral records, and other records “created or received by the employer or other covered entity dealing with any employment practice and affecting any employment benefit of any applicant or employee;” and
  • “Automated-decision system data” includes (a) any data used in or resulting from the application of an ADS, such as data provided by or about individual applicants or employees, or data reflecting employment decisions or outcomes, and/or (b) any data used to develop or customize an ADS for use by a particular employer or other covered entity.

THIRD-PARTY LIABILITY FOR “AGENTS”

The regulations extend liability for ADS-driven discrimination to an employer’s “agent,” which is defined as anyone “acting on behalf of an employer, directly or indirectly, to exercise a function traditionally exercised by the employer or any other FEHA-regulated activity,” such as applicant recruitment, screening and hiring, promotion, or decisions regarding pay, benefits, or leave, “including when such activities and decisions are conducted in whole or in part through the use of an automated decision system.”

KEY TAKEAWAYS

Employers that use ADSs in making employment decisions and AI vendors whose products are used in the employment arena should take concrete steps to prepare for the Regulations’ October 1, 2025 effective date, including:

  • Identifying the ADSs used in employment decision-making;
  • Reviewing and amending record retention policies to ensure records are retained for at least four years;
  • Performing anti-bias testing, establishing a plan outlining the frequency and nature of such testing, and documenting the testing process/criteria, results, and steps taken to address results, as appropriate; and
  • Updating anti-discrimination and reasonable accommodation policies to address the use of ADSs.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Mayer Brown

Written by:

Mayer Brown
