Race and Economy

How mortgage algorithms perpetuate racial disparity in home lending

David Brancaccio and Rose Conlon Aug 25, 2021
A new investigation finds that people of color are more likely to have their mortgage applications rejected than similar white applicants. Glenn Hunt/Getty Images

Lenders are more likely to deny mortgage loans to people of color than to white people with comparable financial profiles, according to new reporting by the investigative news outlet The Markup. Racial bias was present even after reporters controlled for factors like income and neighborhood, as well as factors that lenders previously said would explain the disparities: debt-to-income ratio and combined loan-to-value ratio.

The reporters could not control for credit scores due to public data limitations, but government regulators have determined that credit scores alone do not account for racial disparities in lending.

After analyzing more than two million conventional mortgage applications from 2019, reporters Emmanuel Martinez and Lauren Kirchner found that, nationwide, lenders were 40% more likely to reject Latino applicants, 50% more likely to reject Asian and Pacific Islander applicants, 70% more likely to reject Native American applicants and 80% more likely to reject Black applicants than financially comparable white applicants. And the disparities were even more dramatic in cities like Waco, Texas, where Latino applicants were over 200% more likely to be denied than their white counterparts.
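Figures like "80% more likely to be denied" are the kind of result a logistic regression reports as odds ratios, with race coefficients estimated after the financial controls are held constant. As a rough, hypothetical sketch of that general technique (not The Markup's actual code), here is how one might estimate such disparities on HMDA-style application data; the file name and column names below are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical HMDA-style dataset: one row per conventional mortgage
# application, with a 0/1 "denied" outcome, the applicant's race, and
# the financial controls named in the article. The file and column
# names are invented for illustration.
df = pd.read_csv("applications_2019.csv")

# Logistic regression of denial on race, controlling for income,
# debt-to-income ratio, combined loan-to-value ratio and loan amount.
# Treatment('white') makes white applicants the reference group, so
# each race coefficient compares that group with financially similar
# white applicants.
model = smf.logit(
    "denied ~ C(race, Treatment('white')) + income"
    " + debt_to_income + combined_loan_to_value + loan_amount",
    data=df,
).fit()

# Exponentiating a race coefficient gives an odds ratio: a value of
# 1.8 reads as "80% more likely to be denied than a comparable white
# applicant."
print(np.exp(model.params))
```

In this framing, "controlling for" debt-to-income and combined loan-to-value simply means including them as regressors, so the race terms compare applicants whose financial profiles look the same.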

In an interview with Marketplace’s David Brancaccio, Martinez said that biased algorithms are one crucial driver of the disparities.

“There’s still certainly a human element to all of this, but more and more, [lending] is becoming more algorithmic-driven. Loan officers collect all the financial characteristics of a prospective borrower, and then input that into an algorithm. But when you consider things like wealth and assets, research shows that white families have eight times the amount of wealth as Black families. And so, when you use that as a barrier to entry, it’s going to disproportionately affect Black families and families of color,” said Martinez.

The following is an edited transcript of the interview.

Emmanuel Martinez: I found that people of color are more likely to be denied than their white counterparts, even when they look financially the same, and [that holds even for] factors that the lending industry said would explain the disparities away. For context, I’ve been looking at this topic for the last four years, and I’ve been working with this data extensively. The first time I published this analysis, I found the same thing. Lenders told me that they weren’t denying people of color because of their race; they were denying them because of things like debt-to-income ratio and combined loan-to-value ratio. And now, with this analysis, I have those two ratios — and I still find that people of color are denied at higher rates, even when including those two important financial characteristics.

David Brancaccio: Just so we’re clear: people who look just about the same on paper, where the one difference is whether they’re white or people of color — that seems to skew the chances that you’ll get a loan or be denied?

Martinez: Yes. I found that Latinos nationally are 40% more likely to be denied. And the worst is Black applicants, they are 80% more likely to be denied than their white counterparts, even though they have the same financial characteristics.

Regional variations in the data

Brancaccio: You also found regional variations: that Latino applicants in Waco, Texas, were 200% more likely to get their applications rejected.

Martinez: Yeah, I found that Waco, Texas, was the worst place, according to my statistics, for Latinos. But that differs from, say, Boston, where they’re 70% more likely to be denied. So it just depends on where the person of color is. In some places, they’re closer to parity, and in other places — like Waco, Texas — there’s a wide disparity between Latinos and their white counterparts.

Brancaccio: Yeah, you found that in Chicago, Black borrowers were 150% more likely to get denied a loan, so it really does vary.

Martinez: Yeah, it varies significantly by region. When looking at the biggest cities in America, Chicago was the worst for Black applicants — and Chicago is especially interesting because a lot of the redlining history goes back to Chicago. But then when you look at places like Denver, Colorado, or Sacramento, California, [Black applicants are] 60% more likely to be denied. So there are definitely places with disparities, but then there are places where the disparity is a lot worse.

Brancaccio: Now, I saw you cited the American Bankers Association. They said the data that you could get still had limitations, and they don’t believe you’ve made the case that the system discriminates.

Martinez: Right. Their updated statement is that it’s still about credit scores and credit histories; that, if I had those, they would explain the disparities. But when the CFPB, the government’s consumer watchdog, looked at that particular metric, it found that, holding credit score constant, people of color are still denied at higher rates than their white counterparts. So credit scores don’t completely explain away the disparities.

What’s driving the disparity?

Brancaccio: You’ve been studying this for years now. What’s your sense of what is causing this? Is this loan officers who are personally biased? Is it something else? Is it many factors?

Martinez: It’s many factors. There’s still certainly a human element to all of this but, more and more, it’s becoming more algorithmic-driven. Loan officers collect all the financial characteristics of a prospective borrower and then input that into an algorithm. But when you consider things like wealth and assets, research shows that white families have eight times the amount of wealth as Black families. And so, when you use that as a barrier to entry, it’s going to disproportionately affect Black families and families of color. Other variables that have a disproportionate impact that Freddie and Fannie look at in their algorithms are the gig economy and who lists that as a primary source of income. People of color are more likely to list the fact that their primary source of income comes from the gig economy. And lenders don’t like that. They like stable income. That is another factor that has a disproportionate effect on people of color.

Brancaccio: Now, some critics of the system think this is a structural racism device: that how you engineer and how you implement the algorithms can exacerbate long-standing inequities.

Martinez: Exactly. For lenders, the decision revolves around risk, and they don’t want to lend to risky borrowers. And so that’s the philosophical debate that advocates are trying to introduce: That risk should not govern everything; that the conversation should be more nuanced.

Brancaccio: And advocates for change are not suggesting lenders throw out this notion of risk. Isn’t it that they’re trying to figure out innovative ways to assess who’s likely to pay back and who’s not, without these factors that play into historical discrimination and historical bias?

Martinez: Yeah, and I think that’s an important aspect to capture. They’re not saying we should throw these out; they’re saying we should consider more nuanced perspectives. For example, if someone can make their rent and be a consistent, on-time rent payer, wouldn’t that also signal that that person is probably going to pay back their mortgage? But rent payment isn’t something that a lot of these algorithms considered until recently. Once we started reporting on this aspect, Fannie announced that it is starting to consider rent payment histories as part of its decision-making process. That starts next month, in September. And so that is the kind of nuanced conversation: that we should consider more variables that are fairer to people.

The mystery around credit scores

Brancaccio: So, some possibilities for change. But Fannie Mae, with all its power, still very much has a tradition of using what’s called the classic FICO credit score. And that’s something that’s been around for a long time, and it’s a little unclear exactly what goes into your credit score. They don’t tell you.

Martinez: Yeah, there’s a lot of mystery around credit scores. For example, the credit score that’s available to you and me through a banking app or through the credit bureaus isn’t necessarily the one that’s going to be used when a lender decides to approve or deny you a mortgage. There’s another formula that’s used to calculate your mortgage credit score. But Freddie and Fannie have stuck to using this 15-year-old algorithm, even though fairer ones exist. The government, Congress, advocates and even FICO itself have tried to get Fannie and Freddie to use fairer credit scoring algorithms. Even FICO has newer, updated credit scoring models, and it has urged Freddie and Fannie to use those new ones, but Freddie and Fannie have resisted.

Brancaccio: But the reason we don’t really know what goes into your credit score is that the companies say they’re worried that we, the borrowers, will start gaming the system to our advantage and mess up their predictive quality.

Martinez: Exactly. Even with the algorithms used by Freddie and Fannie, that decision is recorded and collected by the government but kept out of the public data. So I can’t see what decisions these algorithms are making. And the reason Freddie and Fannie told the government it should keep that out [of the public data] is that they didn’t want anyone reverse-engineering their decisions. So there’s a lot of mystery surrounding these things. There is some information on how FICO constructs its algorithm and what weight income versus credit or debts are given, but we only know it at a very cursory level. There’s no detailed public information about how these things work.
