New online age-verification tools could exclude lots of adults
Aug 21, 2024

Washington Post tech reporter Drew Harwell says a new cohort of tech vendors assesses age with the help of users’ images. If those images don’t conform to typical physical features, will users be shut out? And what exactly do these vendors do with the info?

We might all agree that the internet can be a dangerous place for kids. Earlier this year, the U.S. surgeon general called for social media platforms to carry a warning label about their risks to minors, and a growing number of states are requiring age verification for certain sites. Now, many platforms are adding a safeguard that comes with its own trade-offs: facial scanning.

These systems use artificial intelligence to analyze visual clues, sometimes in conjunction with a government-issued ID, to keep users they deem too young from accessing a site. In the process, they collect all kinds of identifying data, and, like any tool, they are susceptible to errors. In this case, errors could bar adults from parts of the internet.

Tech reporter Drew Harwell recently wrote about the proliferation of these systems and the risks that come with them for The Washington Post. He told Marketplace’s Meghan McCarty Carino that they’re getting hard to avoid for internet users. The following is an edited transcript of their conversation.

Drew Harwell: A lot of the most popular websites on the internet, including Facebook, Instagram, TikTok, delivery services, gaming sites, social media sites, porn sites — all started running new users through this age-verification check. And what it looks like, basically, is if you’re creating an account, or trying to change your birthday, or trying to access something that only adults should be able to use on the site, it’ll ask you to look into your webcam or your phone camera and move the phone around a little bit so it can get a 3D view of your face. And then it says whether you are an adult or underage.

Meghan McCarty Carino: And then, of course, there are an increasing number of states that have passed laws that require age verification. How is that kind of figuring into these?

Harwell: Yeah, so there’s 19 states now that have enacted laws that basically require lots of different websites to run age-verification checks. Some of them say any site where more than a third of the content is adult material has to run this age check, so that wraps in porn sites but other kinds of sites as well. But in some states, including Florida, they’re doing it for more than just porn. They’re doing it for social media, where they say, anybody who wants to create an account, we want to make sure that they’re 15 or over. And if they fail that age check and they look like they’re underage, then we’re going to force them out or require some kind of parental consent. So it’s very different from how the internet has always run, and it’s really become something a lot of people are talking about.

McCarty Carino: So this has spawned kind of a little cottage industry of companies popping up to do this, right?

Harwell: Yeah, and these are companies you’ve never heard of. There’s Yoti, Incode, VerifyMyAge. They’re all basically middlemen: companies like Facebook and Instagram and TikTok pay these contractors to run the age check. So they’re the ones who run the AI that scans your face. They’re the ones who capture your face, or who ask you to hold up your ID so they can capture both your face and the ID and do the analysis. But I think one thing that people are a little unnerved by is that these companies that are not household names are now this very important, you know, verification layer, where they’re the ones getting a scan of your face or your ID, and you don’t really know exactly what they’re doing with the data besides what they promise. And so it adds an extra layer of unease to the process.

McCarty Carino: You highlight an example of someone who had a problem accessing TikTok. Her name is Violet Elliot. Can you tell me about her experience?

Harwell: She’s a 25-year-old woman with a pretty popular TikTok account. She’s a woman with dwarfism, and so she talks about, you know, life with the disability, the discrimination she faces online. It’s really an interesting account. And all of a sudden, one day she found that her account was banned. She got an email saying, you know, the account was going to be deleted because TikTok believed that she was not over the age of 13. And TikTok treats under-13 accounts differently. So, you know, this 25-year-old woman had, like, hundreds of videos disappear. This is an account she had spent a lot of time on, and she got really worried because, you know, TikTok was starting to run these age checks, and she felt like, you know, I’m a woman with dwarfism. I look different than everybody else my age. Is TikTok treating me like a kid, even though I’m a full-grown adult?

And I actually went back and forth with TikTok to figure out what happened here, and TikTok says, well, actually this was human error. But the truth is, this is one of many platforms that is running these age checks, and Violet said she had other friends who also have dwarfism and who had gone through the same issues. So her feeling was, are these checks going to be looking for the stereotypical image of an adult and penalizing anybody who doesn’t look like that? I just found that really interesting because the truth is, even for people who don’t have dwarfism or some other disability, we all look different. Some of us look younger than we really are. Some of us look older. And when you turn on a system like this that is going to keep people from accessing tools that are really core to our society, you’re going to be potentially punishing people for doing nothing wrong.

McCarty Carino: What’s at stake when these systems make errors, or even when, just by raising privacy concerns, they drive people away from using these tools?

Harwell: I think, you know, when we’re thinking about social media, it’s a big deal. Obviously, the backdrop of a lot of these laws is the anxiety that lawmakers and parents have about how young people are using the internet. And there’s a feeling of, if only we had some little bouncer to keep kids away from these websites, everything would be OK. All of these issues that have unnerved us about kids on Instagram and TikTok would go away. But I think what we’re seeing is that these checks raise their own problems, and they may not really solve all the problems we activated them to solve in the first place, because, you know, social media contains multitudes, right? There’s a lot of bad stuff on social media, but for 15-year-olds in this country, it’s also a place where they talk to their friends, where they learn about their world, where they catch up on the news, where they, you know, get involved with politics and organizing. And so when you shut down all of those systems that were open to young people to begin with, and keep them out just by virtue of how old they look — not even necessarily their age, just how old they look — you start to have a fragmented internet that treats different people in different ways.

McCarty Carino: And even if these systems work as intended, are they enough to shield children from harmful content online?

Harwell: Yeah, that’s the big question, right? And you know, these laws have been challenged, and part of the issue has been how the laws have been crafted. If a state lawmaker is trying to keep kids away from porn, the state law might not be written in a way that would really block all porn sites, right? And some judges have said, well, these kids could still, you know, Google for porn without ever visiting a site the law covers, because the law was written in such a general way that it didn’t really keep everybody out.

More on this

Last year on the show, we discussed various methods for age verification, beyond just facial scanning, with Matt Perault and Scott Brennen from the University of North Carolina Center on Technology Policy. They’ve got a whole research brief on the issue and the trade-offs involved.

And next year the Supreme Court will likely weigh in on some of these issues. SCOTUS agreed to review a challenge to a Texas law that requires age verification and warning labels for porn sites. A trade association for adult content providers sued to stop the law, saying it infringes on their First Amendment rights. The court’s next term begins in October.

The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer