On July 10, the Massachusetts Attorney General (AG) announced a settlement with a student lender over alleged unfair and deceptive acts and practices in violation of consumer protection and lending laws. The alleged conduct included, among other things, disparate impact from misusing AI tools to underwrite Massachusetts consumers’ loan applications, making “arbitrary human-based loan assessments,” and employing a “‘Knock-Out Rule’ to automatically deny applications based on immigration status.”
The AG brought the case under the state’s Consumer Protection Act (G.L. c. 93A), which prohibits unfair or deceptive practices. Specifically, the AG alleged the lender violated G.L. c. 93A, § 2 by failing to prevent disparate outcomes in underwriting Massachusetts consumers’ applications for credit in both its “Algorithmic Underwriting” and “Judgmental Underwriting” processes, and by committing other fair lending violations. The lender denied the AG’s allegations and any violation of Massachusetts or federal law.
Under the settlement, the student loan lender agreed to pay $2,500,000 and to develop governance systems to prevent fair lending abuses and assess the risks of using AI models.