California is the latest mover in a world of evolving AI regulation, amending the regulatory framework of the Fair Employment and Housing Act (FEHA) to address the use of artificial intelligence in employment-related decisions. The amended regulations take effect on Oct. 1, 2025, so now is the time to understand what the regulations do and don’t require.
TL;DR: The Regulations in a Nutshell
While the FEHA and its implementing regulations already prohibited discrimination based on protected characteristics — including discrimination effectuated by artificial intelligence — the amendments now explicitly state that existing anti-discrimination protections apply to discrimination occurring through the use of an Automated Decision System (ADS).
An ADS is defined as a computational process that “makes a decision” or “facilitates human decision making” regarding an employment benefit. The definition of “employment benefit” is broad and includes, for example, hiring, promotions and selection for training programs.
The regulations provide examples of what may be considered an ADS, including:
- Using computer-based assessments or tests to screen, evaluate, categorize or recommend applicants for employment.
- Directing job advertisements or recruiting materials to targeted groups.
- Screening resumes for particular terms or patterns.
- Analyzing employee or applicant data acquired from third parties.
Do the Regulations Apply to Me?
The regulations apply to any “covered entity,” including employers, employment agencies and labor organizations. While the definition of “employer” has always included any “agents” of the employer, for the first time, the regulations explicitly define what it means to be an “agent.”
An agent is defined as one who “exercise[s] a function traditionally exercised by the employer or any other FEHA-regulated activity, which may include applicant recruitment, applicant screening, hiring, promotion or decisions regarding pay, benefits or leave, including when such activities and decisions are conducted in whole or in part through the use of an automated decision system.”
While the agency concept may seem novel, the regulations’ definition largely adopts the definition established by the California Supreme Court in Raines v. U.S. Healthworks, 15 Cal. 5th 268 (2023). Accordingly, while the regulations codify this definition of “agent,” they likely do not materially expand the FEHA’s reach to previously unregulated entities.
Do the Regulations Require Bias Testing?
No, the regulations do not require bias testing. However, if a covered entity faces a discrimination claim related to its use of an ADS, the regulations provide that evidence (or lack of evidence) of anti-bias testing or similar proactive efforts to avoid unlawful discrimination is relevant to assessing the validity of the claim. So, while not required, bias testing may be helpful for employers and other covered entities seeking to defend against a discrimination claim.
Keep in mind, however, that the regulations do not define how bias testing should be conducted. As with other types of audits conducted to assess legal risk or determine legal compliance — for example, pay equity studies — employers should consult with counsel before undertaking any such efforts.
What About Recordkeeping Requirements?
The regulations also create new recordkeeping requirements. Specifically, covered entities must preserve:
- Any data used in or resulting from the application of an ADS, such as data provided by or about individual applicants or employees, or data reflecting employment decisions or outcomes.
- Any data used to develop or customize an ADS for use by a particular employer or other covered entity.
The data must be retained for four years.
Anything Else I Should Know?
How the regulations will be interpreted and enforced remains to be seen. By their plain language, the regulations do not materially expand a covered entity’s obligation not to discriminate under California law. Instead, they codify existing judicial interpretation of the FEHA while imposing new recordkeeping requirements.
Nonetheless, employers and other covered entities should expect that government enforcement agencies and plaintiffs alike will be watching closely and ready to challenge any alleged AI-related discrimination. Because of that, it is more important than ever to ensure your automated decision systems comply with the law.