Apple’s credit card may discriminate, just like lots of banking algorithms
Regulators in New York are investigating the new Apple-Goldman Sachs credit card for allegedly discriminating against women. Critics say this is just the latest example of bias in the algorithms that power so many financial decisions these days. Even when you sit down with a banker to get a mortgage, algorithms still likely affect whether you get a loan and how much interest you’ll pay.
Adair Morse is an associate professor of finance at University of California, Berkeley, and studies the bias in those algorithms. She said they weigh thousands of variables, including where you went to school.
“The name of your high school is correlated with wealth,” Morse said, “but it may over-penalize particular ethnic or racial groups.”
Morse found that discrimination, directly from people but also from those computer models, cost African Americans and Latinos an extra $765 million a year in mortgage interest.
“What these studies tell us is that algorithms are not immune from discriminatory actions or from creating clearly discriminatory outcomes,” said Nikitra Bailey, executive vice president of the Center for Responsible Lending.
But there is promising news: Morse and her colleagues found that lenders relying solely on algorithms discriminate 40% less than loan officers making decisions face to face.