QUESTION
We have a question that involves appraisals and AVMs. Our concern is about the new proposal coming out of the CFPB concerning computer models that influence home valuations. As we understand it, the Bureau’s view is that these models could cause fair lending violations.
We’ve read whatever we could find about this proposal. However, it is still a bit confusing, because computer models are simply not something we know anything about. This feels like a potential blind spot that could expose us to fair lending risk.
It would be great if you’d explain the CFPB’s proposal in layman’s terms.
How could the AVMs and computer models addressed in the CFPB’s proposal cause fair lending violations?
ANSWER
I know how you feel. Sometimes it seems like you have to be a computer whiz to stay current with the latest and greatest digital tools used to originate loans. I’ll explain the CFPB’s proposal. Hopefully, it will give you a better understanding.
The proposal involves potential requirements to prevent “algorithmic” bias in home valuations. An “algorithm” is just a set of concise rules that must be followed, for instance, in doing calculations. The term has come to be associated with computer science, but it is actually used in many disciplines.
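If it helps to see what “a set of concise rules” looks like in practice, here is a minimal, purely illustrative sketch in Python (not anything drawn from the CFPB’s proposal) of an algorithm every lender already relies on: the standard fixed-rate amortization formula.

```python
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Standard fixed-rate amortization formula: a concise set of rules,
    followed in order, that turns three inputs into one answer."""
    r = annual_rate / 12           # periodic (monthly) interest rate
    if r == 0:
        return principal / months  # no-interest edge case
    return principal * r / (1 - (1 + r) ** -months)

# $300,000 loan at 6% for 30 years -> roughly $1,798.65 per month
print(round(monthly_payment(300_000, 0.06, 360), 2))
```

Nothing about that calculation is mysterious; an AVM is the same idea applied, with far more rules and far more data, to estimating a home’s value.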
Let’s refer, then, to the proposal as involving “Algorithmic Bias,” which, in fact, is the CFPB’s terminology. The Bureau's proposal is entitled Consumer Financial Protection Bureau Outlines Options to Prevent Algorithmic Bias in Home Valuations.[i]
Categorically, the proposal involves compliance management, ECOA (Regulation B), fair lending, fintech (financial technology), and real estate appraisals. All potentially affected entities will have the opportunity to comment to the CFPB once the new AVM rules are formally proposed.
In essence, the CFPB announced an initiative to ensure that computer models used to help determine home valuations are accurate and fair. Thus, the Bureau outlined the options it is considering in connection with future rulemaking on quality control standards for automated valuation models (AVMs).
First, some background and then an explication of where this all goes.
According to the CFPB,[ii]
“When underwriting a mortgage, lenders typically require an appraisal, which is an estimate of the home’s value. While traditional appraisals are conducted in person, many lenders also employ algorithmic computer models. These models use massive amounts of data drawn from many sources to value homes. The technical term for these models is automated valuation models. Both in-person and algorithmic appraisals appear to be susceptible to bias and inaccuracy, absent appropriate safeguards.”
The CFPB claims that
“AVMs can pose fair lending risks to homebuyers and homeowners. [It] is particularly concerned that without proper safeguards, flawed versions of these models could digitally redline certain neighborhoods and further embed and perpetuate historical lending, wealth, and home value disparities.”
This claim leads the CFPB to conclude that “computer models and algorithms…[used in] AVMs can pose fair lending risks to homebuyers and homeowners.”
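To make the idea of an AVM concrete, here is a drastically simplified, hypothetical sketch. Real AVMs draw on far larger datasets and much more sophisticated statistical models; the point is only that the valuation comes from data and rules rather than an in-person inspection. The function name and the sales figures below are invented for illustration.

```python
from statistics import median

def toy_avm_estimate(subject_sqft: float, comparable_sales: list[dict]) -> float:
    """Hypothetical, drastically simplified 'AVM': price the subject property
    at the median price per square foot of recent nearby sales."""
    price_per_sqft = [sale["price"] / sale["sqft"] for sale in comparable_sales]
    return subject_sqft * median(price_per_sqft)

# Invented sales data, for illustration only
comps = [
    {"price": 350_000, "sqft": 1_600},
    {"price": 410_000, "sqft": 1_850},
    {"price": 385_000, "sqft": 1_700},
]
print(round(toy_avm_estimate(1_750, comps)))  # value implied by the comparables
```

Because the estimate is driven entirely by whatever data is fed in, any bias embedded in that data flows straight into the valuation, which is exactly the concern the CFPB is raising.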
The CFPB’s proposed oversight of these computer models covers several factors, including:
· Ensuring a high level of confidence in the estimates produced by automated valuation models;
· Protecting against the manipulation of data;
· Seeking to avoid conflicts of interest;
· Requiring random sample testing and reviews (see the sketch after this list); and
· Accounting for any other such factor that the agencies determine to be appropriate.
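The proposal does not spell out how “random sample testing and reviews” would be performed. As a hedged sketch, assuming an institution keeps each AVM estimate alongside a benchmark value such as a later appraisal or sale price, a review might look something like this; the field names are hypothetical.

```python
import random

def sample_review(records: list[dict], sample_size: int, seed: int = 0) -> dict:
    """Randomly sample AVM records and summarize how far each estimate missed
    a benchmark value (e.g., a later appraisal or sale price).
    The field names are hypothetical; adapt them to your own data."""
    rng = random.Random(seed)
    sample = rng.sample(records, min(sample_size, len(records)))
    errors = [abs(r["avm_estimate"] - r["benchmark_value"]) / r["benchmark_value"]
              for r in sample]
    return {
        "records_sampled": len(sample),
        "mean_abs_pct_error": sum(errors) / len(errors),
        "worst_abs_pct_error": max(errors),
    }
```

A compliance team might run a review like this on a recurring schedule and escalate when the error measures start to drift upward.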
Where is this all going?
It is going to the Fifth Factor.
The CFPB is considering including an AVM nondiscrimination quality control factor, referred to as the “Fifth Factor.” Under this option, entities would be required to adopt policies and procedures specifically designed to mitigate fair lending risk in the use of AVMs. This would be an obligation independent of the preexisting obligation to comply with federal nondiscrimination requirements.
There are two alternative compliance approaches the CFPB is considering.
Under the first approach, entities would have the flexibility to design the relevant fair lending policies, practices, and control systems in a manner that is tailored to their business models and commensurate with the institution’s risk exposures, size, and business activities.
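As one hedged illustration of what such a tailored control might look like under this first approach (this is not a method the CFPB prescribes), an institution could compare the AVM’s average valuation error across geographic areas, since a model that consistently undervalues homes in particular neighborhoods is exactly the “digital redlining” concern quoted above. The “area” and other field names below are hypothetical.

```python
from collections import defaultdict

def error_by_area(records: list[dict]) -> dict:
    """Group AVM records by a hypothetical 'area' field (e.g., census tract)
    and compute the average signed valuation error in each area.
    A persistently negative average in some areas would warrant closer review."""
    by_area: dict = defaultdict(list)
    for r in records:
        signed_error = (r["avm_estimate"] - r["benchmark_value"]) / r["benchmark_value"]
        by_area[r["area"]].append(signed_error)
    return {area: sum(errs) / len(errs) for area, errs in by_area.items()}
```

A disparity flagged this way does not by itself establish a violation, but it gives the compliance team a concrete place to start asking why the model behaves differently in those areas.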
Under the second approach, the CFPB is considering whether compliance with applicable nondiscrimination laws for AVMs is already encompassed within the first Four Factors; specifically, the factors requiring:
Factor 1: A high level of confidence in the estimates produced by AVMs;
Factor 2: Protection against the manipulation of data;
Factor 3: Avoidance of conflicts of interest; and
Factor 4: Random sample testing and reviews.
Jonathan Foxx, Ph.D., MBA
Chairman & Managing Director
[i] Consumer Financial Protection Bureau Outlines Options to Prevent Algorithmic Bias in Home Valuations, Consumer Financial Protection Bureau, Newsroom, February 23, 2022, https://www.consumerfinance.gov/about-us/newsroom/cfpb-outlines-options-to-prevent-algorithmic-bias-in-home-valuations
[ii] Idem, this and following quotes.