Algorithms can be biased. Could auditors lend a hand?
Jun 24, 2022

Most auditors favor publicly releasing audit data in the interest of transparency, according to new research.

This week, the Department of Justice settled a lawsuit with Meta, Facebook’s parent company, over the use of algorithms the government said were discriminatory. The algorithms served housing advertisements to users who, in the company’s words, “look alike.” You can probably see the problem.

Meta said it will change its targeted ad tool and subject it to what’s called algorithmic auditing.

Sasha Costanza-Chock is a researcher with the Algorithmic Justice League who just co-authored a paper on algorithmic audits. They say audits can be done within companies, by contractors or by outside parties like researchers and journalists. The following is an edited transcript of their conversation with Meghan McCarty Carino.

Sasha Costanza-Chock (Courtesy Caydie McCumber)

Sasha Costanza-Chock: Many companies are building these first-party teams internally because they would like to do the best job with their systems and catch problems before they harm people in the real world. First-party auditors have a lot of access to all of the data and the systems that are being used. So in theory, you know, they should be able to do a really excellent job. But the problem is, if they find something wrong, they’re not disclosing that to the public, and so we have no idea what they’re finding and whether the companies are acting to correct the problems they found or not. The second-party auditors almost always sign non-disclosure agreements that, again, don’t let them publicize or reveal problems that they found. And so the only ones doing audits where they’re going to share the results with the world would be third-party auditors, but third-party auditors have the least access to the systems they’re investigating.

Meghan McCarty Carino: Now, another problem that you identified is that there aren’t exactly standards in the methodology of these assessments. I mean, how much variation did you find?

Costanza-Chock: We found a very wide range of variation. Most of the auditors look at things like accuracy or fairness on training and sample data. But only 60% of them say they’re looking at the security and privacy implications of the algorithmic system. And only about half of them say they’re checking to see whether companies have quality systems to report harm. Say an algorithmic decision actually causes harm to somebody: someone is unfairly denied the ability to rent a home because the algorithm that screens tenant applications is racially biased, which is a real thing that’s happening in the world now. Well, only half of the auditors right now say they’re looking for a way for people who’ve been harmed to report that harm back to the company. That’s something we think should be important to look at in any audit, but it’s not happening all the time.

McCarty Carino: Right, to what extent does it look like these audits are actually having their intended effect of sort of improving transparency and accountability?

Costanza-Chock: It’s extremely difficult to tell right now because of the lack of disclosure that we talked about before. That could change. There are now increased calls for regulators to carefully audit systems. So, for example, the Federal Trade Commission can potentially evaluate algorithms as to whether they are discriminatory. And if they find that an algorithmic system was developed without consent, they can actually order the company to destroy the data set and the algorithm. So there are cases where a government regulator has power and access, but we need a lot more of that.

McCarty Carino: I mean, looking to the news this week, we see Microsoft announced its own responsible AI standard, saying it will limit features of its facial recognition technology. I mean, should these kinds of decisions be in the hands of the companies themselves?

Costanza-Chock: We absolutely can’t rely on companies to just do the right thing, even the ones that want to do the right thing. They’re in a competitive environment. If they spend a lot of time, energy and resources on auditing their systems and ensuring that they’re as harmless as possible, then they’ll be at a competitive disadvantage. And so even the companies themselves are asking for regulators to step in and level the playing field. Of the auditors we talked to, almost none were able to actually share audit methods or results with us, but almost all of them said they thought that should be a legal requirement.

Sasha Costanza-Chock co-authored the paper with AJL founder Joy Buolamwini and Deborah Raji. It’s called “Who Audits the Auditors?” and identifies several areas to improve auditing effectiveness, including better systems for reporting the harms of biased algorithms in the real world and gathering feedback from communities at risk of harm early in the process.

Or, if visual learning is more your thing, the Algorithmic Justice League has produced a helpful companion video to the paper, narrated by Buolamwini, who was on this show last year talking about bias in facial recognition software.

The Algorithmic Justice League video summary of “Who Audits the Auditors?” narrated by Joy Buolamwini.

Speaking of which, we’ve got more on Microsoft’s recent decision to restrict the use of some of its facial analysis tools, like one that purported to identify a person’s emotional state by analyzing facial expressions, but came under criticism for bias and inaccuracy.

I mean, come on. If it were that easy to tell what someone was feeling just by looking at their face, the whole genre of romantic comedy would cease to exist!


The team

Daniel Shin, Producer
Jesús Alvarado, Associate Producer