Your data in action

Teddy Flo
January 24, 2024

Responsibly increasing economic equity through technology

Know your credit history, part three

“Know your credit history” is a three-part series that tells the stories behind the financial structures that shape how we make, lend, and borrow money in the United States today. Zest AI is making lending smarter, more efficient, and more accessible. Knowing where our systems of credit have come from is the first step in addressing their shortcomings and moving toward fairer, more inclusive credit for all.

We live in a day and age where everything is personalized. Companies advertising on streaming services let you choose between three ads, AI can add new but similar songs to your playlists, and many restaurants can accommodate just about any dietary restriction a person could have. Obviously, some of these personalizations are a little more important than others.

Then there’s the other side of the coin, too. Personalization is great, but some levels of personalization can feel invasive due to the data they rely on. You see it pop up when you download a new app on your phone: a request to track your activity outside the app to ‘better personalize your experience.’ It makes one wonder, “What data are they gathering about me, and exactly how are they using it?”

Data collection, analysis, and use can be done right. In the financial services industry, we’ve set up a plethora of laws and regulations around how to properly use data — and how technology uses that data — to protect consumers from data misuse. Let’s talk about why and how that’s been done.

Legal limitations and allowances concerning data in credit scoring

For credit underwriting, lenders should rely only on accurate data sources or data sources subject to rigorous compliance review. We know that the Equal Credit Opportunity Act (ECOA) prohibits using data like race, ethnicity, national origin, religion, sex, and marital status to make lending decisions. Nor is it permissible to use proxies for these variables. In addition, various courts and civil rights organizations have weighed in on other features like criminal history or medical debt that are believed to introduce unlawful bias into decision-making at financial institutions.

Consumer Reporting Agencies (CRAs) tell us the basics about what variables they consider in their proprietary credit scoring calculations: the number and types of accounts someone has, credit usage percentages, length of credit history, collections activity, and payment history. Sometimes, “alternative” data sources may include things from public records like bankruptcies, foreclosures, utility payments, and other features.

Where alternative data should come into the picture

For those with enough credit history, we’ve found that adding more data isn’t always helpful. For these individuals, a more accurate credit risk assessment usually comes from running their existing data through a better credit model, like AI-automated underwriting.

Here’s an example of what that looks like: rather than just noting the delinquency documentation on someone’s file — which is about the most that a traditional credit risk assessment can handle — AI-automated underwriting processes all the information lying underneath the delinquency: the time between events if there are multiple, the length of time since the most recent event, and even patterns of delinquency further back in the past. When AI-automated underwriting looks at an application with delinquencies, it can recognize that they all happened in a short timeframe with none before or after — signaling that the borrower may have hit a temporary setback. This is something that AI models can help someone overcome. Using math to fill these gaps can address many credit inequities that traditional risk assessments can’t.
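To make this concrete, here is a minimal sketch of the kind of delinquency features such a model might derive. The function, feature names, and thresholds below are purely illustrative assumptions, not Zest AI’s actual underwriting logic:

```python
from datetime import date
from typing import Dict, List


def delinquency_features(delinquency_dates: List[date], as_of: date) -> Dict[str, float]:
    """Derive illustrative features from a borrower's delinquency dates.

    These features (count, months since the most recent event, average gap
    between events, and whether the events cluster in one short window) are
    hypothetical examples of the signal an AI underwriting model could use.
    """
    if not delinquency_dates:
        return {"has_delinquency": 0.0}

    events = sorted(delinquency_dates)
    months_since_latest = (as_of - events[-1]).days / 30.44

    # Gaps (in days) between consecutive delinquencies, if there is more than one.
    gaps = [(later - earlier).days for earlier, later in zip(events, events[1:])]
    avg_gap_days = sum(gaps) / len(gaps) if gaps else 0.0

    # If every event falls inside a single 12-month window and the borrower has
    # been clean for at least a year since, flag a possible temporary setback
    # rather than a chronic repayment problem (both thresholds are assumptions).
    span_days = (events[-1] - events[0]).days
    temporary_setback = span_days <= 365 and months_since_latest >= 12

    return {
        "has_delinquency": 1.0,
        "delinquency_count": float(len(events)),
        "months_since_latest_delinquency": round(months_since_latest, 1),
        "avg_gap_between_events_days": round(avg_gap_days, 1),
        "looks_like_temporary_setback": float(temporary_setback),
    }


# Three delinquencies clustered in early 2021, nothing before or after.
print(delinquency_features(
    [date(2021, 1, 15), date(2021, 2, 15), date(2021, 3, 15)],
    as_of=date(2024, 1, 24),
))
```

Run on an application with three delinquencies clustered in early 2021 and a clean record since, the sketch surfaces a likely temporary setback, the sort of nuance a single “has a delinquency” flag would miss.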

But not all folks have a credit history. Credit-invisible and “unscorable” individuals make up about 19 percent of U.S. adults. Research on credit invisibility also shows that 26 percent of Hispanic consumers and 28 percent of Black consumers are credit invisible or unscorable, compared to 16 percent of White and Asian consumers. The trend continues with credit scores themselves — Black and Hispanic consumers tend to have lower credit scores than White or Asian consumers — but this disparity can be narrowed by removing the bias still baked into traditional credit scores.

Alternative data comes into the picture when necessary. For the almost 50 million people in the U.S. without a usable credit history, alternative data can fill in the gaps while they build one up. Lenders should be careful when using this data: it carries a higher risk of being inaccurate or incomplete, has undergone less compliance testing, and draws more regulatory scrutiny. Still, there are ways to use it compliantly that benefit consumers greatly. If used appropriately, alternative data can be a helpful tool in expanding access to credit.

As a final note, the CFPB — alongside state and other federal agencies — is looking to further protect consumers from inequitable data practices, both in lending and more broadly. We’re excited to see the emphasis on protecting individuals and their privacy, and you’ll hear more from us about how to balance improving equity in lending decisions with giving consumers stronger privacy protections.

The upshot — responsible AI, coupled with responsible data sources, can increase financial equity in America

From custom AI credit models to supplementing credit decisions with alternative data, the importance of personalization has extended into credit and how we use consumer data to assess risk in lending. While we have to be careful about how we use alternative data in our scoring structures so that we don’t unintentionally introduce bias into our decisions, it has proven to be a helpful modern tool for bringing greater equity to lending.

Purposeful AI and responsible data sources lead to explainable and transparent credit decisions. These decisions are the opportunity we, as leaders in the financial services industry, must seize to increase access to equitable lending in America and enable people to lead better, richer, and fuller lives.

Read the prior two blogs of Know Your Credit History here: Part 1 – Who’s keeping score?; Part 2 – Drawing the line

 

___________________________________

 

Teddy Flo — Chief Legal Officer 

Teddy lives in Thousand Oaks, California with his wife, two kids, and cat and is a passionate advocate for financial inclusion. In his role as CLO at Zest AI, he ensures that the company’s technology will not only enable fairer lending outcomes, but also boost the standard of what it means to lend more equitably. Teddy works across Zest AI to manage risk and leads the company’s legal, compliance, and policy teams.
