CFPB: Black-box modeling no cover for credit denials

Even if their credit models rely on complex algorithms, anti-discrimination law requires companies to explain the specific reasons for denying an application for credit or taking other adverse actions, the Consumer Financial Protection Bureau said in a circular published last week.

The Equal Credit Opportunity Act requires companies to provide applicants with adverse action notices containing the specific and accurate reasons for the adverse action.

Many financial institutions rely on complex, wide-ranging datasets gathered from consumers’ publicly available information in their credit decisions, including algorithm-based decisions. Some algorithms, however, function as black boxes: the internal reasoning behind their outputs is unknown even to the models’ users, which can make compliance with ECOA’s adverse action notice requirements impossible.

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”

The CFPB document reiterates that companies must meet ECOA requirements regardless of what technology they use in decisions. They “cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new,” the CFPB said in a press release.

Earlier this year, the CFPB outlined ways to prevent algorithm-driven bias in home valuations. 

“It is tempting to think that machines crunching numbers can take bias out of the equation, but they can’t,” Chopra said at the time. “This initiative is one of many steps we are taking to ensure that in-person and algorithmic appraisals are fairer and more accurate.”

Fredrikson & Byron Law