[co-authors: Stephanie Kozol*, Nick Gouverneur**]
As one of her last acts in office, on December 24, 2024, Oregon Attorney General (AG) Ellen Rosenblum issued guidance for businesses deploying artificial intelligence (AI) technologies. The guidance highlights the risks associated with the commercial use of AI and underscores that, despite the absence of an AI-specific law in Oregon, a company's use of AI must still comply with existing laws.
Rosenblum’s Guidance
The guidance highlights the AG's concerns with the unpredictability of AI outputs, noting that this unpredictability can affect fairness, obscure accountability, and undermine trustworthiness. It also identifies privacy as a significant risk, given AI's reliance on vast amounts of personal data. On bias and discrimination, the guidance notes that AI systems trained on data sets that reflect existing biases may perpetuate social inequalities, and that the lack of transparency in AI decision-making can make it difficult for humans to identify, understand, and correct biased decisions.
With the foregoing risks in mind, the guidance reminds businesses that AI is regulated under the Oregon Unlawful Trade Practices Act (OUTPA), the Oregon Consumer Privacy Act (OCPA), the Oregon Consumer Information Protection Act (OCIPA), and the Oregon Equality Act (OEA) and discusses the ways AI developers and deployers may violate these statutes.
First, the OUTPA prohibits misrepresentations in consumer transactions. The guidance reminds companies developing, selling, or deploying AI technology to ensure their tools provide accurate information to consumers, and notes that misrepresentations may be actionable even if they are not made directly to a consumer, or are made to a consumer by an AI system rather than a person. As a result, Rosenblum made clear that AI developers or deployers could be liable to downstream consumers for harms caused by their products. The guidance identifies various examples of OUTPA violations, such as misrepresenting the characteristics, uses, benefits, or qualities of AI products; using AI to falsely claim nonexistent sponsorships, approvals, or connections (such as fabricated celebrity endorsements); and using AI-generated voices in robocalling campaigns.
Second, the OCPA guarantees a consumer's right to control the distribution of their personal data. Per the AG's guidance, such control is particularly relevant for generative AI systems trained on such data. Thus, Rosenblum suggests that AI developers should disclose whether personal data was used to train any particular AI model and obtain consumers' express consent to that use if the data qualifies as "sensitive data" under the act. Notably, the guidance emphasizes that affirmative consent is required to use the sensitive data of Oregon consumers, and that retroactive or passive modifications to privacy notices or terms of use will not bring a company into compliance. Consumers must also be able to opt out of AI profiling when AI is used to make significant decisions, such as those pertaining to housing, education, or lending.
Third, the guidance notes that the OCIPA requires AI developers who possess personal information to safeguard that information through reasonable cybersecurity measures.
Finally, the guidance highlights that the OEA prohibits discrimination based on protected classes and bars discrimination resulting from AI use, particularly in the context of housing and public accommodations.
Why It Matters
This latest guidance from Oregon aligns with similar commentary by other state AGs, including those in Texas and Massachusetts. Based on such announcements, it appears that AGs across the U.S. intend to apply existing consumer protection laws to the use and application of AI. In the absence of AI-specific legislation, the trend of using traditional enforcement tools to address the rapid proliferation of AI will likely continue, and it demonstrates how AGs can adapt quickly to a changing environment while legislators struggle to keep pace.
*Senior Government Relations Manager
**Associate