Consumer rights aren’t guaranteed in a digital world, warns Consumer Reports CEO
In today’s increasingly technology-driven economy, consumer rights are becoming harder and harder to enforce due to digital tools that regulations haven’t yet addressed.
That is the message in “Buyer Aware: Harnessing Our Consumer Power for a Safe, Fair, and Transparent Marketplace,” a new book by Marta Tellado, president and CEO of the nonprofit Consumer Reports, which does product testing and other consumer advocacy work. As the use of algorithms and artificial intelligence increases, she says, so does the risk of inequitable practices that consumers don’t know about.
“You have to make decisions every day, based on algorithms and things that you can’t see, feel or touch,” Tellado said in an interview with Marketplace’s David Brancaccio. “And that’s incredibly challenging for everyday consumers.”
The following is an edited transcript of their conversation.
David Brancaccio: Some people may think that your job is to review robot vacuum cleaners or to remind us yet again that Toyotas are good. But you see yourself, at some level, as a civil rights crusader?
Marta Tellado: Well, that’s right. I think a lot of folks do know us because of the ratings. But a lot of what we do is really to shape the marketplace, to make it fairer and safer. And in the book “Buyer Aware,” what I really tried to do is tell a larger story about our democracy: that it can only flourish if we have a marketplace that does just that, one that is fairer and more just for all consumers.
Brancaccio: All consumers — no one wants to be exploited by companies with which we interact. But additionally, and I think crucially, reading your book, you see an equity issue. The exploitation is worse for some people than for others.
Tellado: Well, that’s right. And I think for me, the seed was planted as a young immigrant child coming into the United States after [the] revolution in Cuba, seeing my parents have to rebuild their economic life, and having a very firsthand experience — and a tremendous gratitude at being able to come to a democracy. But you also see firsthand that economic freedom is a civil right, and that you can’t have a fair market if there is inherent bias. And when our economic power and our agency are undermined, so is our power to function as free and equal members of our democracy. So the book tries to pull out some examples, some really egregious examples, that demonstrate just how powerful that is, and the strong link between having a free and fair democracy and economic opportunity in the marketplace.
The challenge of digital consumer rights
Brancaccio: And when you think about civil rights and equity as it applies to consumer rights, I mean, I think some people talk about redlining of loans or access to broadband internet, which is so uneven in, for instance, Native American tribal lands. But it goes much deeper. I mean, you worry a lot about emerging technology, artificial intelligence, and the biases that are built in?
Tellado: Well, absolutely. We’ve been so proud of all the work we’ve done over 86 years to codify the laws and the rules for fairness and justice in the marketplace. But the unfortunate thing is that a lot of those rules don’t apply to this new digital landscape, where we don’t have transparency. And you have to make decisions every day based on algorithms and things that you can’t see, feel or touch. And that’s incredibly challenging for everyday consumers.
Brancaccio: I spoke with the chair of a computer science department at an engineering school. And he pointed out to me that with machine learning, even computer scientists can’t reverse engineer their own system to fully understand why a machine decided to do one thing and not another. And you can just see how that could lead to abuses and perhaps discrimination.
Tellado: That’s exactly right. So when you think about machine learning: bad data in, bias out. And I devote a whole chapter to really digging into what it means when an algorithm discriminates against you. Sometimes it’s a life-and-death situation. Let’s think about how medical bias shows up in an algorithm. Think about something as serious as going to the doctor and finding out that you have end-stage kidney disease. That means you’re going to need a transplant, and we know that transplants are in short supply; you have to get on a national waiting list. How does that happen? Well, for all of us, you have to qualify. And the way you qualify is you need to get a score of 20 or below on a test, based on your medical data, that demonstrates how fast your kidneys are filtering blood. But here’s the glitch: if you are Black, your score is adjusted. There is a race-adjusted coefficient based on a faulty study done in the ’90s — bad data. And that assumes that Black people might have different kidney function. So let’s call this patient Eli. He goes in for the test. He doesn’t get a 20 or below, and he does not make the cut. And that’s a life-and-death situation: an algorithm that is not transparent to the patient is making a decision about that patient’s ability to access medical care, and in this instance, life-saving medical care.
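To make the arithmetic Tellado describes concrete, here is a minimal sketch of how a race-based multiplier can push a score past a qualifying cutoff. The cutoff matches the interview (a score of 20 or below qualifies); the coefficient value and the patient’s base score are illustrative placeholders, not the actual clinical formula.

```python
# Minimal sketch of the eligibility arithmetic described above.
# WAITLIST_CUTOFF follows the interview; RACE_COEFFICIENT and the
# base score are hypothetical placeholders, not a clinical formula.

WAITLIST_CUTOFF = 20.0
RACE_COEFFICIENT = 1.16  # hypothetical upward adjustment

def eligibility_score(base_score: float, race_adjusted: bool) -> float:
    """Return the score used for the waiting-list decision,
    with the optional race-based multiplier applied."""
    return base_score * RACE_COEFFICIENT if race_adjusted else base_score

base = 19.0  # a kidney-function score that would qualify on its own

for race_adjusted in (False, True):
    score = eligibility_score(base, race_adjusted)
    verdict = "qualifies" if score <= WAITLIST_CUTOFF else "does not qualify"
    print(f"race_adjusted={race_adjusted}: score={score:.1f} -> {verdict}")

# Output:
# race_adjusted=False: score=19.0 -> qualifies
# race_adjusted=True: score=22.0 -> does not qualify
```

The point of the sketch is that the same underlying measurement lands on opposite sides of the cutoff depending only on whether the adjustment is applied, and the patient never sees that step.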
Brancaccio: Now, I know you looked for this in your book — I don’t think you found it. What federal rule governs fairness in artificial intelligence?
Tellado: Well, unfortunately, there is no federal rule. As proud as we are of the work we’ve done on consumer rights and protections, those rules and regulations and protections for consumers do not migrate to the digital landscape. That rears its head in many ways, and artificial intelligence is just one of them. And we can have terrific leaders in our agencies right now, but we don’t have the tools, the capabilities or the guidelines that provide more fairness and transparency. And you can’t trust something that is not transparent to you. And you certainly can’t hold it accountable.
Potential discrimination … hidden in algorithms
Brancaccio: And Consumer Reports gets involved in the public policy process — you make your organization’s voice heard on a matter like that?
Tellado: Well, so many folks come to us because they’re making individual choices. And what we look at is, “How do those choices ladder up to the marketplace more generally?” We do have a digital lab that looks precisely at bias. And we’ve also looked at car insurance. You assume your car insurance is based on your driving record: the fact that you don’t have tickets or you haven’t had that fender bender. But in fact, the algorithm is also looking at non-driving factors about you: where you live, what your income is, what your education level is. And what we discovered is that the price you pay for your car insurance has more to do with your zip code than with your driving record; whether that neighborhood happens to be Black, white or Hispanic determines what you’re paying on that premium. And so a Black or Hispanic neighborhood is paying a higher premium than a white neighborhood based on that bias.
Brancaccio: I mean, there are even language barriers, where companies may not communicate crucial information in languages that people are conversant in.
Tellado: And as you say, David, in some of these examples, the stakes are really high. They’re life and death. And I’ll give you an example of something that’s not quite an algorithm, but it’s a product, a product that everybody is familiar with. Anytime you go into a hospital and you see that little clip on somebody’s finger, that’s a pulse oximeter. But it doesn’t work as well on darker skin tones. And what we know is that it’s three times more likely to miss low oxygen levels in people of color than it is if you are white. And the implications are pretty astounding, because you show up at the ER, and they test that, and if you don’t have the score, you’re turned away at the ER. That was really remarkable given what we just went through in the pandemic. And we know that many people of color were turned away, with disparate impacts. So once again, fairness by design is something that we also look at at Consumer Reports.
Brancaccio: You can see the engineers for these devices saying, “We never thought to check or to think that way.” And you’re demanding, “You got to start thinking that way, companies.”
Tellado: Well, that’s right. Another area where we see a lot of bias, and we’ve known this for a while, is that women are injured much more severely in car accidents. And we know the reason is that our biology, our bone structure, is very different from a male physique. But the test dummies are not anatomically correct. They are based on men and how men are impacted by forces in an accident. That, to us, is the battle we still have to fight. But that’s part of the work that we do — we test these products because we have to change and eliminate the bias and create products for all.
Brancaccio: Marta, do you ever get pushback on some of this? Maybe readers who want you to stick to telling them if the blender is good or not?
Tellado: We get that all the time. I think people say, “Well, wait a minute, you know, stay in your lane, you know, just tell me what kind of car to get or what kind of blender.” But the stakes are too high, David. What people don’t know is that, as you said, we’re a nonprofit organization. We’re not a private publishing company. We are powered by our members, much like public radio. And so we are a public good. We collect data because we want to make the world and the marketplace a better, more fair and transparent place for consumers. And so our work really helps power and create the foundations of a lot of the safety requirements that you see and you take into your home. The burden is still on us in terms of the hardware products we bring home. And the burden is really on the consumer in the digital landscape. So now we’re living in a world where our privacy and our data safety is a setting, not a right.