The rules are focused on three key matters: (1) the use of ADMT to make a significant decision affecting consumers, with increased transparency and new consumer and worker rights; (2) cybersecurity audits and risk assessments; and (3) greater accountability through expanded reporting requirements to the CPPA. The regulations also revise, and attempt to clarify, certain terms, such as multifactor authentication, automated decisionmaking, privileged accounts, and sensitive personal information, and train the spotlight on nuanced areas, such as employment and insurance.
The rulemaking package will now be submitted to the California Office of Administrative Law (OAL). The OAL has 30 business days to review and approve the regulations. If submitted before August 31, 2025, and subsequently approved, the regulations are expected to take effect on October 1, 2025. Implementation deadlines for certain key provisions are expected to follow a later timeline as noted below.
Automated Decisionmaking Technology (ADMT)
Compliance deadline: January 1, 2027
The rules aim to increase transparency and accountability in the use of algorithms and automated systems to make certain significant decisions affecting consumers, a category that in CCPA land includes workers (both employees and independent contractors) and applicants.
What is ADMT?
A key concern is that, instead of providing a standalone definition of ADMT, the CPPA brings into scope any technology that “processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking.” Essentially, any technology used to make a business decision about consumers without human involvement could fall within the ambit of the ADMT definition. This raises the question of whether, in addition to typical artificial intelligence technologies, atypical technologies, such as algorithms, spreadsheets, and even calculators, could be considered ADMT if they effectively replace human decisionmaking and lead to important outcomes for consumers.
The ADMT rules also apply to profiling activities and to the use of personal information to train or improve ADMT, further expanding the scope and covering technology that may still be in development.
What Constitutes a “Significant Decision”?
The term “significant decision” has been a focal point of debate throughout the rulemaking process. The final regulation defines a significant decision as one that materially affects a consumer’s access to, or the terms of, critical life opportunities or services, including:
- financial or lending services;
- housing;
- employment;
- educational opportunities;
- contracting opportunities or compensation; and
- healthcare services.
This definition reflects concerns raised during public comment periods, particularly from labor and consumer advocacy groups, about the real-world consequences of opaque or biased automated decisions.
Consumer Rights
The new rules also introduce extensive consumer rights with respect to ADMT (including opt-out rights, the right to appeal significant decisions, and broader access rights), subject to certain limited exceptions.
Businesses using ADMT will need, subject to some exceptions, to provide consumers with a “Pre-Use Notice.” The notice must disclose the business’s use of ADMT, clearly explain the purpose of the ADMT and how it functions (including key factors affecting its output), and inform consumers of their rights to opt out and to access information about the ADMT. The notice must be easy to read, accessible, and provided before the ADMT processes the consumer’s personal information. The Pre-Use Notice can be included in the privacy policy. If data has already been collected for one purpose and the business later decides to use it for a different purpose, a new Pre-Use Notice or updated privacy policy with the requisite information must be provided.
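For teams operationalizing the notice, the required disclosures can be thought of as a structured record that is later rendered into consumer-facing text. The following is only a rough sketch; every field name is hypothetical rather than regulatory text:

```python
from dataclasses import dataclass, field

@dataclass
class PreUseNotice:
    """Illustrative container for the disclosures a Pre-Use Notice must cover."""
    purpose: str                      # specific purpose of the ADMT (not boilerplate)
    how_it_works: str                 # plain-language explanation of how the ADMT functions
    key_factors: list[str] = field(default_factory=list)  # key factors affecting the output
    opt_out_instructions: str = ""    # how the consumer may opt out
    access_instructions: str = ""     # how the consumer may access information about the ADMT
```

Maintaining the disclosures as structured data rather than free text can make it easier to reuse the same content in both the standalone notice and the privacy policy.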
Businesses are also expected to provide consumers, upon request, with information about how ADMT was or is being used concerning them, including its purpose and outputs and how those outputs were or will be used in decisionmaking. Notably, the rules require a plain-language explanation of the specific purpose for which ADMT was used, explicitly prohibiting vague justifications like “to improve our services.” More critically, businesses must disclose “information about the logic” of the ADMT, a requirement that is both broad and technically complex. The ambiguity around what constitutes sufficient “logic” disclosure may lead to inconsistent interpretations and increased regulatory scrutiny.
On the litigation front, this could also lead to disclosure of proprietary information that would otherwise only be available through discovery, particularly in the employment space, where discrimination claims run rampant. Further, it may put the CPPA at odds with the current administration’s “AI Action Plan,” which contemplates that any state rulemaking that “overregulates” AI could lead the Administration to withhold federal funding, putting pressure on states like California to weigh the potential consequences of federal defunding against the merits of their AI regulations.
Further, businesses are required to compile the number of ADMT-related requests received in a calendar year and the actions taken in response.
These requirements are likely to pose significant operational challenges, as businesses have to integrate these consumer rights and business obligations into the development and deployment of ADMT.
Risk Assessment: Going Beyond ADMT
Compliance Deadline: December 31, 2027, to complete assessment; April 1, 2028, to submit assessment report to CPPA
The existing regulations require businesses to conduct and document risk assessments for data processing activities that present a significant risk to consumer privacy. The expectation has always been that the use of ADMT would trigger risk assessment requirements. However, the new regulations blow the lid off that expectation: beyond ADMT itself, any meaningful automated processing used to monitor workers or to infer consumer characteristics based on location may require a risk assessment, whether or not it meets the definition of ADMT. This creates a possible maze for employers and for industries that use location data to draw consumer inferences.
As noted above, with respect to ADMT, the risk assessment requirement applies not only when ADMT is actively deployed to make significant decisions, but also when a business processes personal information to train ADMT for such purposes. The obligation extends to the training of facial-recognition, emotion-recognition, identity verification, or profiling technologies. Importantly, this duty is not limited to current use. A business must conduct risk assessments even if it intends to use the technology in the future, allow others to use the technology (e.g., through licensing or partnerships), or market the technology for use by others.
Additionally, businesses must ensure that service providers and contractors using ADMT on their behalf also comply with these regulatory requirements.
Adding to the quagmire, the rules provide that employees whose job duties include participating in the processing of personal information that would be subject to a risk assessment must be included in the business’s relevant risk assessment process. For businesses with larger and more complex operations, this could turn out to be a significant stumbling block.
Vendors and other third parties that make ADMT available to another business may also be required to provide that business with the facts necessary to facilitate its risk assessment, creating an inherent tension between compliance and the disclosure of information among parties who may also be competitors.
The regulations envision that businesses will submit a risk assessment report to the CPPA. The report must include certain key information, such as the number of risk assessments performed, whether the risk assessments involved the processing of certain categories of personal information, an attestation, and a point of contact. Reports for risk assessments conducted in 2026 and 2027 must be submitted to the CPPA by April 1, 2028; reports for assessments conducted after 2027 are due by April 1 of the following year. Risk assessments must be reviewed and updated at least once every three years.
For processing activities already underway before the regulations' effective date, businesses have until December 31, 2027, to complete their assessments. Material changes to a processing activity requiring a risk assessment necessitate an update within 45 days. When submitting to the CPPA, businesses must also identify the time period covered by the report.
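The submission cadence described above reduces to a simple rule. Here is a minimal sketch, assuming the first reporting cycle covers assessments conducted in 2026 and 2027 and annual submissions follow thereafter (the helper is hypothetical; the regulation text controls):

```python
from datetime import date

def risk_assessment_report_due(assessment_year: int) -> date:
    """Illustrative CPPA submission deadline for a risk assessment
    conducted in the given calendar year."""
    if assessment_year in (2026, 2027):
        # Reports for assessments conducted in 2026 and 2027 are both due April 1, 2028.
        return date(2028, 4, 1)
    # Thereafter, reports are due April 1 of the year after the assessment.
    return date(assessment_year + 1, 4, 1)
```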
Cybersecurity Audits
Compliance deadline: April 1, 2030 (or as early as April 1, 2028, for businesses that have $100 million or more in annual revenue, and April 1, 2029, for businesses that have $50 million or more in annual revenue)
The regulations impose significant obligations on businesses to evaluate and document their data protection practices. Importantly, they introduce greater reporting requirements, and businesses have to submit certain documentation to the CPPA, as opposed to merely retaining records for their own use.
Businesses whose processing of consumers' personal information presents a "significant risk to consumers' security" are required to complete cybersecurity audits. The first audit report must be completed by April 1, 2030. However, businesses with a gross annual revenue greater than $50 million may have earlier deadlines (as early as 2028).
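To make the tiering concrete, here is a minimal sketch, assuming the revenue thresholds noted above map directly to the first report's due date (the function name and the use of gross revenue as the sole input are simplifications):

```python
from datetime import date

def first_audit_report_due(annual_gross_revenue: float) -> date:
    """Illustrative mapping of annual gross revenue to the deadline
    for a business's first cybersecurity audit report."""
    if annual_gross_revenue >= 100_000_000:   # $100 million or more
        return date(2028, 4, 1)
    if annual_gross_revenue >= 50_000_000:    # $50 million or more
        return date(2029, 4, 1)
    return date(2030, 4, 1)                   # all other covered businesses
```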
Importantly, the regulations require that, by no later than April 1 following any year in which a business is required to complete a cybersecurity audit and audit report, the business must submit a certification of compliance to the CPPA. The certification must be completed by a member of the executive management team with the requisite authority and knowledge and must include an attestation of compliance with the audit requirements.
Per the regulations, cybersecurity audit reports must include, among other things:
- a description of the policies, procedures, and practices assessed;
- the criteria used in the audit; and
- the evidence examined, including documentation, testing, and interviews.
The regulations also tighten standards concerning who can audit by requiring qualified, objective, and independent professionals, using standards accepted in the auditing profession. The audit must comply with procedures and standards provided or adopted by the American Institute of Certified Public Accountants, the Public Company Accounting Oversight Board, the Information Systems Audit and Control Association, or the International Organization for Standardization. The rules also provide instructions on safeguards that must be in place if an internal auditor is used.
Businesses and auditors are required to retain audit records and all supporting documents under the new rules. This marks a shift in accountability as, in the past, businesses may have tasked auditors with record retention.
The regulations direct service providers to assist the business with completion of its cybersecurity audit obligations.
As a reminder, a “significant risk to consumers’ security” (i.e., the trigger for the audit requirements) is found if either of the following applies (the and/or structure is sketched in code after this list):
- the business derived 50% or more of its annual revenue from selling or sharing consumers’ personal information in the preceding calendar year; or
- the business had annual gross revenues in excess of $25,000,000 in the preceding calendar year and, in that year, either:
  - processed the personal information of 250,000 or more consumers or households; or
  - processed the sensitive personal information of 50,000 or more consumers.
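Reduced to its boolean structure, the trigger might look like the following sketch (parameter names are illustrative, and all inputs are prior-calendar-year figures):

```python
def presents_significant_security_risk(
    share_of_revenue_from_selling_or_sharing: float,  # e.g., 0.5 means 50%
    annual_gross_revenue: float,                      # in dollars
    consumers_or_households_with_pi_processed: int,
    consumers_with_spi_processed: int,
) -> bool:
    """Illustrative and/or structure of the cybersecurity audit trigger."""
    if share_of_revenue_from_selling_or_sharing >= 0.5:
        return True
    return annual_gross_revenue > 25_000_000 and (
        consumers_or_households_with_pi_processed >= 250_000
        or consumers_with_spi_processed >= 50_000
    )
```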
Notice and Consent
Consent for selling and sharing personal information is not new to the CCPA. However, the new rules delve into and highlight the importance of meaningful consent. For example, the regulations clarify that presenting the option to participate in a financial incentive program as selected by default, or featuring it more prominently (e.g., larger in size or in a more eye-catching color) than the choice not to participate, would call consent into question. Similarly, choices driven by a false sense of urgency (e.g., “time is running out to consent to this data use and receive a limited discount”) could be seen as misleading.
With respect to sensitive information, the regulations highlight the importance of notice, particularly in connection with data collected through connected devices, such as a smart watch or augmented or virtual reality, and data collected offline (such as in a brick-and-mortar store).
The regulations also clarify that, while there are exceptions regarding a consumer’s right to limit use of sensitive personal information, those exceptions may come with their own parameters. For example, the right to limit is not intended to interfere with a business’s ability to collect and use the biometric information of its employees to authenticate them for access into secured areas of their business and to prevent access by unauthorized persons. However, the business would not be able to retain the biometric information indefinitely or use it for unrelated purposes, such as the development of commercial products.
Insurance
Despite concerns from industry stakeholders regarding clarity and feasibility, the CPPA made no substantive changes to the insurance-related provisions in Section 12 of the final rules. The rules clarify that the CCPA only applies to personal information collected by insurance companies if the information is not subject to the California Insurance Code. This includes information collected from website visitors not applying for insurance, or employee information, which is not covered by existing insurance regulations. However, the CPPA did not clarify that all current exemptions in the CCPA continue to apply to insurance companies. Nonetheless, the CPPA has acknowledged that the new rules cannot supersede the text of the CCPA, whose exemptions will ultimately govern any interpretation of the provisions in Section 12.
Next Steps
As California organizations try to wrap their brains around the full expanse of the rules and prepare for the future deadlines, it is important that, as a preliminary matter, companies and employers:
- understand their use cases, data maps, and data collection points, as well as the automated technologies that they or a vendor, service provider, or contractor may use;
- ensure that recordkeeping practices and retention schedules have been updated;
- revisit their cybersecurity audit and risk assessment frameworks;
- update their vendor commercial agreements (particularly if vendors support automated decisionmaking);
- review privacy policy updates and internal policies; and
- confirm that systems are set up to support the new consumer rights.
Further, given the increased access requirements, employers should consider training human resources personnel and individuals tasked with responding to rights requests so they are aware of potential employment claims that could arise from disclosure.