Four ways we can inject real racial equity in mortgage lending

Teddy Flo
December 09, 2021

About a month ago the Federal Housing Finance Agency (FHFA) announced that the two biggest players in the mortgage lending industry, Fannie Mae and Freddie Mac, will have to submit three-year plans to identify and address barriers to sustainable housing opportunities and advance equity in housing finance. As part of the planning process, the FHFA issued a call for comment to inform these long-term plans.

At Zest AI, our mission is to make fair and transparent credit available to everyone, especially when it comes to mortgages — which are so crucial to building generational wealth. We jumped at the chance to inform the government’s thinking on the issue. Here’s our comment letter outlining four meaningful actions the FHFA and the GSEs (Fannie and Freddie) can take today to narrow a historically high racial homeownership gap.

_____________________________________

October 25, 2021


To the Office of the Director, Federal Housing Finance Agency

Re: Input On Enterprise Equitable Housing Finance Plans

Zest AI appreciates the opportunity to comment on the Federal Housing Finance Agency’s request for input on the Enterprises’ equitable housing finance plans. Bold, comprehensive programs to restore justice to the housing finance market are long overdue: sadly and shockingly, America’s racial homeownership gap is bigger today than it was in 1960 — a time when it was legal to refuse to sell someone a home because of their race. How do we explain this lack of progress to our children and grandchildren?

In our opinion, the Enterprises’ equitable housing finance plans need to start by addressing the inadequacy of the tools and techniques currently used to underwrite mortgages. We applaud the FHFA for identifying as core objectives of the plans reducing racial or ethnic disparities in acceptance rates for the Enterprises’ automated underwriting systems (AUS) and reducing racial or ethnic disparities in the share of loans acquired by the Enterprises compared to the overall mortgage market. We think those two objectives are linked: if the Enterprises change the technology that powers their automated underwriting systems, they can dramatically shrink both the racial approvals gap and the racial disparities in the share of loans they acquire.

Since Zest AI started in 2009, we’ve been on a mission to make fair and transparent credit available to everyone. Today, dozens of banks, credit unions, fintechs, and other financial institutions use our software to build, adopt, and deploy powerful AI-based credit models swiftly and efficiently. We believe that AI-driven underwriting, when used properly, offers the Enterprises a once-in-a-generation opportunity to reduce racial disparities in mortgage lending while at the same time improving their safety and soundness.

For instance, last year, Zest entered a partnership with Freddie Mac to make the dream of homeownership a reality for tens of thousands of minority borrowers in the coming years. The results of our work with Freddie Mac — repeated and validated dozens of times over with other lenders — show that ML-based models can consistently produce more accurate and less discriminatory credit outcomes for consumers. For example, one Zest AI client saw approval rates for women jump by 20 percent after using a Zest AI-powered model. Another generated a model that shrank the approval rate gap between Black and white borrowers by 50 percent.

We explored AI’s positive impact on fair lending in greater detail in our December 2020 comment letter in response to the CFPB’s Request for Information on ECOA, our June 2021 response to the Interagency Request for Information on Financial Institutions’ Use of AI, and our September 2021 comment letter on the FHFA’s Policy Statement on Fair Lending. The following are four meaningful actions drawn from those letters and adapted specifically to inform the Enterprises’ plans to deliver more equity in housing finance.

Meaningful action no. 1: Make explicit that the Enterprises should search for and adopt less discriminatory alternative models and scores

Despite decades of focused effort by banks, legislators, and regulators, America’s consumer lending industry still struggles to develop predictive models that minimize disparate impact. Why? Because the data and the way they are used are flawed. Tens of millions of Americans lack sufficient credit history to compute a traditional credit score. Millions more have artificially depressed scores due to the way credit is scored today.

Generations of systemic racism and bias are baked into the data and traditional credit scores, and legacy players have been unable, and unmotivated, to remove them. As a result, only a fifth of Black households have credit scores over 700, compared to half of all white households. This leads lenders to deny mortgages for Black applicants at a rate 80 percent higher than for white applicants. The share of Black households with a mortgage would increase by nearly 11 percentage points if their credit score distribution were the same as the distribution for white households.

The most effective way to close homeownership disparities is to ensure that the Enterprises are using the fairest models they can to make decisions. Software-based techniques now exist to modify underwriting models so that they cause less disparate impact with only a minimal loss of accuracy. Zest AI’s patented fair lending tools, for example, are already in use by dozens of lenders — including Freddie Mac — to optimize underwriting models for both accuracy and fairness.

A central element of the GSEs’ equity plans should be the mandate to do more rigorous searches for less discriminatory alternative models. Under disparate impact discrimination analysis, an ECOA violation may exist if a creditor uses a model that unnecessarily causes discriminatory effects. Thus, if a lender’s model causes disparities and a less discriminatory alternative (LDA) model exists, the lender must adopt the alternative model.

Responsible lenders already perform this type of testing but, unfortunately, they often fail to use the most effective techniques available to do so, leading them to settle for second-best models when it comes to fairness.

Zest AI, however, has built automated LDA search into our software to make it easier for lenders to identify models optimized for both accuracy and fairness. In model after model, Zest’s automated LDA search identifies alternatives that shrink approval rate gaps for protected classes while maintaining model performance. We see these improvements both over models originally optimized only for performance and over traditional, FICO-based scoring models. Zest AI’s techniques explore the space of potential models more exhaustively, making use of the massive computing resources now available, to find models that are fairer for consumers while still maintaining underwriting accuracy and profitability for lenders.
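To make the idea concrete, here is a minimal sketch of what an automated LDA search can look like: generate candidate models, measure accuracy and an approval-rate gap for each, and keep the fairest candidate that stays within an accuracy tolerance. The data is synthetic, the reweighting strategy is just one of many possible candidate-generation methods, and every variable and threshold is hypothetical; this is an illustration of the pattern, not Zest AI’s production method.

```python
# A minimal, self-contained sketch of an automated LDA search (synthetic data,
# hypothetical thresholds); not Zest AI's production method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic applicants: two credit features plus a protected-class flag.
# The flag is used only for measurement and reweighting, never as a model input.
n = 20_000
group = rng.integers(0, 2, n)                    # 1 = protected class
x1 = rng.normal(0.0, 1.0, n) - 0.4 * group       # feature correlated with group
x2 = rng.normal(0.0, 1.0, n)
y = (rng.random(n) < 1 / (1 + np.exp(-(1.2 * x1 + 0.8 * x2)))).astype(int)  # 1 = repaid
X = np.column_stack([x1, x2])

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.5, random_state=0)

def approval_gap(scores, groups, approve_rate=0.5):
    """Approval-rate difference between groups at a fixed overall approval rate."""
    cutoff = np.quantile(scores, 1 - approve_rate)
    approved = scores >= cutoff
    return approved[groups == 0].mean() - approved[groups == 1].mean()

# Candidate generation: vary a fairness-reweighting strength that up-weights
# protected-class applicants with good outcomes, so the model leans less on
# the proxy-correlated feature. Record accuracy (AUC) and disparity for each.
candidates = []
for w in [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]:
    weights = 1.0 + w * ((g_tr == 1) & (y_tr == 1))
    model = LogisticRegression().fit(X_tr, y_tr, sample_weight=weights)
    scores = model.predict_proba(X_te)[:, 1]
    candidates.append((w, roc_auc_score(y_te, scores), approval_gap(scores, g_te)))

# Selection: among candidates within a small accuracy tolerance of the
# unweighted baseline, adopt the one with the smallest approval-rate gap.
baseline_auc = candidates[0][1]
tolerance = 0.005
viable = [c for c in candidates if c[1] >= baseline_auc - tolerance]
w_best, auc_best, gap_best = min(viable, key=lambda c: abs(c[2]))
print(f"baseline AUC={baseline_auc:.3f}; chosen weight={w_best} "
      f"AUC={auc_best:.3f} approval gap={gap_best:+.3f}")
```

The essential pattern scales to any model class: the richer the set of candidates and the more exhaustive the search, the more likely it is that a materially fairer model exists at little or no cost in predictive power.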

The FHFA should expect the Enterprises’ equitable finance plans to improve on and enhance their LDA searches. In addition, the FHFA should make clear that third-party model developers must do the same. The FHFA’s Credit Score Validation Rule (and the accompanying Joint Enterprise Credit Score Solicitation) already requires third-party model applicants to certify fair lending compliance. But neither the FHFA Rule nor the Enterprise Solicitation explicitly states that third-party model developers or the Enterprises are expected to look for and adopt LDAs to these third-party models, even though it would violate the federal fair lending laws to use a model when such an alternative exists. Consistent with the principles articulated in its Policy Statement on Fair Lending, the FHFA should clarify that it expects the Enterprises to require such analyses or conduct them themselves. That expectation would move the Enterprises a significant step toward fulfilling the promise of their equitable finance plans.

Meaningful action no. 2: Encourage the GSEs to study and approve true alternatives to generic credit scores

When Congress passed the Economic Growth, Regulatory Relief, and Consumer Protection Act of 2018, the law’s Section 310 required that if the GSEs condition the purchase of a mortgage on a borrower’s credit score, that credit score model must have been validated and approved by the GSE. In August 2019, the FHFA issued a rule establishing standards for GSE validation and approval. The GSEs then published a description of their validation and approval processes, consistent with the FHFA’s rule.

Unfortunately, the process that the FHFA and GSEs used has effectively prohibited consideration of alternative credit scores or scoring systems unless they were developed by one of a handful of legacy score providers. Among the many requirements in the rule that restrict competition and innovation: simply applying to both GSEs required an upfront fee of $400,000, plus a slew of other fees that could add hundreds of thousands of dollars in costs. Few innovators could afford to apply.

And so, as one might have expected, in November 2020 the GSEs approved the Classic FICO credit score for continued use, while other scores remain under consideration and certain entities were discouraged from applying at all. These FHFA and GSE requirements do not address repeated concerns, including from consumer advocates, that the FHFA and GSEs unreasonably privilege Classic FICO, despite the fact that Classic FICO unnecessarily restricts access to credit, especially for the millions of Americans who have no credit history or are unscorable because their histories are too thin.

To ensure that some of the most consequential uses of model scores in consumer financial markets do not unnecessarily exclude people from access to safe credit, the FHFA and GSEs should, as part of their equitable housing finance plans, level the playing field for new credit scoring models that can increase access to safe credit for the millions of consumers locked out by FICO. Among other things, the GSEs should test (and, if needed, pay for) alternative solutions alongside FICO to evaluate portfolio performance and fairness, including whether a sufficiently performing less discriminatory alternative model exists.

Meaningful action no. 3: Improve existing race estimation techniques so lenders and regulators can tell whether credit models are fair to people of color

Advancing equity in housing finance will require improving race estimation as performed by the GSEs in their fair lending analysis. Monitoring racial gaps in outcomes in mortgage lending is easier than in most consumer loan products thanks to the requirement that lenders collect applicants’ race and ethnicity information. But race/ethnicity data isn’t always clean and goes missing on 30 percent of applications. That forces lenders to estimate applicants’ races when conducting fair lending analysis. Currently, the CFPB and many lenders use a method called Bayesian Improved Surname Geocoding (“BISG”) to do so. That method can be greatly improved with better data science.
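For readers less familiar with BISG, the sketch below shows the core Bayesian update it performs: surname-based race probabilities are combined with geography-based probabilities to produce a posterior estimate for each applicant. The lookup tables are small hypothetical stand-ins for the Census surname list and block-group counts, and the numbers are invented for illustration.

```python
# A simplified sketch of the BISG update: P(race | surname, geography) is
# proportional to P(race | surname) * P(geography | race). The tables below
# are hypothetical stand-ins for the Census surname list and block-group data.
RACES = ["white", "black", "hispanic", "api", "other"]

# P(race | surname) from a (hypothetical) Census surname table.
P_RACE_GIVEN_SURNAME = {
    "GARCIA":     [0.053, 0.004, 0.921, 0.014, 0.008],
    "WASHINGTON": [0.052, 0.873, 0.025, 0.005, 0.045],
    "MILLER":     [0.858, 0.107, 0.020, 0.006, 0.009],
}

# Residents of each (hypothetical) block group, counted by race...
BLOCK_GROUP_COUNTS = {
    "bg_12345": [900, 450, 120, 30, 40],
    "bg_67890": [300, 60, 700, 90, 50],
}
# ...and (hypothetical) national totals by race, used to form P(geo | race).
NATIONAL_COUNTS = [197_000_000, 42_000_000, 62_000_000, 24_000_000, 8_000_000]

def bisg(surname: str, block_group: str) -> dict:
    """Return the posterior race/ethnicity probabilities for one applicant."""
    prior = P_RACE_GIVEN_SURNAME[surname.upper()]
    geo_counts = BLOCK_GROUP_COUNTS[block_group]
    p_geo_given_race = [g / n for g, n in zip(geo_counts, NATIONAL_COUNTS)]
    unnormalized = [p * q for p, q in zip(prior, p_geo_given_race)]
    total = sum(unnormalized)
    return {race: u / total for race, u in zip(RACES, unnormalized)}

print(bisg("Washington", "bg_12345"))  # surname signal dominates here
print(bisg("Miller", "bg_67890"))      # geography shifts the estimate
```

Because the method leans entirely on surnames and neighborhood composition, its accuracy suffers wherever those signals are weak, which is exactly the gap better modeling can close.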

How bad is the problem of flawed race estimation? In a test using Florida voter registration data (one of the few publicly available datasets that include ZIP code, name, and race/national origin), a Zest AI-built race estimation model identified Black consumers with 30 percent more accuracy than BISG at the 80 percent confidence threshold. It also cut the number of white consumers misidentified as non-white by 70 percent. In a test conducted by the Harvard Computer Society’s Tech for Social Good program using North Carolina voter data, Zest AI’s algorithm was more than twice as accurate at identifying Black individuals and 35 percent more accurate at identifying Hispanic individuals than BISG.

Multiply those numbers nationwide, and we’re talking about reclassifying the race and ethnicity of tens of millions of Americans. With a more accurate count, we would better know the scope of the equity problem and the efficacy of our solutions. We believe that the FHFA should require the GSEs to use better race estimation techniques. We’re happy to give ours away and would be honored to collaborate on improving it.

Meaningful action no. 4: Create or expand programs allowing the GSEs to approve more loans each year for people likely to repay who may not fit the existing credit box

No one wants to see a repeat of the housing crisis, which came about due to an inordinate amount of systemic risk in the US market. But some responses to the crisis have inhibited the Enterprises’ ability to say yes to good borrowers who may not meet all of the existing criteria for conforming loans. The crisis was driven by poorly designed loan products, not by lending to creditworthy borrowers outside the traditional credit box; in other words, the real problem was product risk, not credit risk.

To some extent, the FHFA and Enterprises have recognized that creditworthy borrowers should not be penalized for those product risks, and accordingly have proven they can innovate to achieve more inclusion. For example, Freddie Mac’s Home Possible program allows first-time homebuyers without a credit score to qualify for a mortgage, provided certain loan-to-value ratios do not exceed 95 percent and borrowers meet a handful of other requirements in its lending guide.

The FHFA and the Enterprises need to strengthen existing programs (like Home Possible) to prove that we can safely extend the benefits of homeownership to more Americans. Less stringent underwriting requirements can be backstopped by more accurate AI-driven risk models trained on a richer data set that includes cash-flow history and rental payments. These programs can also include wrap-around financial education and credit counseling. By relaxing rigid “conforming criteria” cutoffs and focusing instead on proving good repayment outcomes over, say, a five-year period, we can open new avenues to wealth creation for millions of Americans. The inability to afford a 20 percent down payment, as opposed to 10 percent, is often the result of generational wealth disparities; those disparities should not be perpetuated when we have the ability to identify and lend to people who can and will repay. The FHFA and the Enterprises have the power to facilitate these methods of equity creation.
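As an illustration of the kind of cash-flow and rental-payment signals such a richer data set could contribute, here is a hypothetical feature-engineering sketch. The transaction fields, categories, and summary statistics are ours for illustration only; they are not any Enterprise’s or Zest AI’s actual feature set.

```python
# Hypothetical cash-flow and rental-payment features derived from raw
# transaction history; field names and categories are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class Transaction:
    posted: date
    amount: float        # positive = inflow, negative = outflow
    category: str        # e.g. "payroll", "rent", "utilities"

def cash_flow_features(txns: list[Transaction]) -> dict:
    """Summarize an applicant's cash-flow behavior by calendar month."""
    by_month = {}
    for t in txns:
        by_month.setdefault((t.posted.year, t.posted.month), []).append(t)

    months = sorted(by_month)
    net_by_month = [sum(t.amount for t in by_month[m]) for m in months]
    rent_months = [m for m in months
                   if any(t.category == "rent" and t.amount < 0 for t in by_month[m])]

    return {
        "months_observed": len(months),
        "avg_monthly_net_inflow": sum(net_by_month) / max(len(months), 1),
        "months_with_negative_net": sum(1 for n in net_by_month if n < 0),
        "share_of_months_with_rent_paid": len(rent_months) / max(len(months), 1),
    }

# Tiny usage example with made-up transactions.
txns = [
    Transaction(date(2021, 1, 3), 3_200.00, "payroll"),
    Transaction(date(2021, 1, 5), -1_400.00, "rent"),
    Transaction(date(2021, 2, 3), 3_200.00, "payroll"),
    Transaction(date(2021, 2, 5), -1_400.00, "rent"),
]
print(cash_flow_features(txns))
```

Features like these give a model visibility into repayment behavior that never reaches a credit bureau, which is what makes it possible to underwrite applicants with thin or missing credit files.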

We look forward to engaging with the FHFA on all these issues.

___________________________________

 

Teddy Flo — Chief Legal Officer 

Teddy lives in Thousand Oaks, California with his wife, two kids, and cat, and is a passionate advocate for financial inclusion. As CLO at Zest AI, he ensures that the company’s technology not only enables fairer lending outcomes but also raises the standard for what it means to lend equitably. Teddy works across Zest AI to manage risk and leads the company’s legal, compliance, and policy teams.
