California’s Privacy Regulator Had a Busy November, Automated Decisionmaking Edition: What Does It Mean for Businesses?

Sheppard Mullin Richter & Hampton LLP
[co-author: James O'Reilly*]

In the second installment of our series on California's new CCPA regulations, we look at proposed rules for the use of automated decisionmaking technology. As a reminder, the CCPA discusses these technologies in relation to profiling, namely “any form of automated processing of personal information” to analyze or predict people’s work performance, health, and personal preferences, among other things.

The law called on the California Privacy Protection Agency (CPPA) to promulgate rules giving consumers the ability to opt out of the use of these technologies and to access information about how the tools are used when making decisions about them. The first set of proposed rules was met with some concern, some of which has been addressed in this newest version. Highlights of the changes are below:

  • Narrowing the definition of “automated decisionmaking technology:” The law does not define this term, and in 2023 the agency had proposed that it broadly cover any system that “in whole or in part” facilitates human decisionmaking. The term has now been narrowed to technology that either replaces humans or substantially facilitates their decisionmaking — in other words, technology that is a “key factor” in the human’s decision. The rule gives an example: using a tool’s score as a primary factor in making a significant decision about someone.
  • Automated decisionmaking and risk assessments: As part of the new rules for risk assessments, the agency has included specific provisions on profiling. First, companies would need to conduct risk assessments themselves. Second, the proposed rule imposes obligations on entities that make automated decisionmaking or AI technologies available to others, if those technologies are trained on personal information. In those cases, the company would need to give the other entities the information they need to conduct their own risk assessments. That information would need to be provided in “plain language.”
  • Automated decisionmaking that results in a “significant decision:” If a “significant decision” will be made, the rules contemplate a “pre-use” notice. This was also contemplated in the 2023 version of the rules. However, in the 2023 version, the obligation arose if there was a “legal or similarly significant” impact (the language of the CCPA). Under the proposed rules, the agency instead discusses “significant decisions” impacting an individual, and gives examples including education and employment opportunities. Also included are extensive profiling and the training of automated decisionmaking technology that might, among other things, identify someone or make a significant decision about them.
  • Changes to company privacy policies: The rule as revised would require companies to state in the privacy policy (in the rights section) that an individual can opt out of having their information used by automated decisionmaking that results in a “significant decision.” The policy would also need to explain how someone can access information about that automated decisionmaking.

*James O’Reilly is a Cybersecurity and Privacy Fellow in the firm’s Chicago office.

Putting It Into Practice: The California privacy agency has addressed some of the concerns raised in the initial automated decisionmaking rules. However, the obligations continue to be expansive, and may impact many organizations’ uses of AI tools, especially in the HR space. That said, the obligations outlined in the rule should look familiar to those who already fall under NYC’s AI law.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Sheppard Mullin Richter & Hampton LLP
