What a privacy organization and Big Tech’s lead lobbying group think about internet regulation
Mar 27, 2024


Megan Iorio of the Electronic Privacy Information Center and NetChoice's Carl Szabo discuss what they think the ideal federal law to protect minors online would look like.

When you look at the lawsuits aimed at blocking attempts to regulate tech, it’s usually not companies like Meta or Snap doing the suing. Oftentimes, it’s a group called NetChoice, which has emerged as Big Tech’s top lobbying force from Capitol Hill to the courts.

Today, a conversation with NetChoice General Counsel Carl Szabo and Megan Iorio, senior counsel at the Electronic Privacy Information Center, a nonprofit focused on privacy. They occasionally agree, but very often they do not.

Case in point: the California Age-Appropriate Design Code Act, which requires websites that children are likely to visit to provide privacy protections by default. It was set to take effect in July, but so far, Szabo’s group has successfully blocked it in court.

Marketplace’s Lily Jamali sat down with Szabo and Iorio and asked about how their groups interact. The following is an edited transcript of their conversation.

Carl Szabo: When it comes to protecting kids online, I think we all agree raising a kid in the 21st century is challenging. And at the end of the day, we want to find something that works and that is constitutional. So yeah, we’re constantly talking at NetChoice to people on all sides of the issue, because we want to reach the right outcome, something that works for everyone.

Lily Jamali: And Megan, what would you say to that? Can you give us some insight into what the relationship is between your group and Carl’s, and do you talk?

Megan Iorio: Well, at EPIC we think that one of the best ways to protect kids would be to pass comprehensive federal privacy legislation. So in this case before the Supreme Court, we may have some similar views on the constitutionality and impact of the Texas and Florida laws. But we are on opposite sides when it comes to the California Age-Appropriate Design Code [Act].

Chris Marchese, center, who directs the NetChoice Litigation Center, speaks outside the U.S. Supreme Court on Feb. 26 about a key case concerning Texas and Florida’s social media laws. (Andrew Caballero-Reynolds/Getty Images)

Jamali: Well, let’s talk about that law. This was a law that was meant to require data safeguards for underage users. Carl, you took California to court over this law, arguing that it violates a company’s free speech. Can you lay out your argument for us?

Szabo: Yeah. It’s a great question because what we saw out of California was an overly broad, unconstitutional attempt to control speech online. One of the fundamental problems with the internet is captured by the old meme: “On the internet, no one knows you’re a dog.” Well, the same thing is true with age verification. In order to identify that people are adults, it would require massive data collection on a level we’ve never seen before, so that businesses can perfectly prove that you are who you say you are and you are as old as you say you are. So from a privacy perspective, it’s terrifying. From a constitutional perspective, there are clear prohibitions on what can and cannot be shown online. Essentially, it says businesses or websites may not show content that is “not age appropriate to someone under the age of 18.” Well, we’ll start off with the term “age appropriate.” I talked to my wife, she’s a child therapist, and she said, “In the clinical world, we don’t use the term ‘age appropriate,’ because age really doesn’t mean anything. You have a lot of 13-year-olds who are very mature, and a lot of 17-year-olds who are really immature. Instead, it’s ‘developmentally appropriate.'” That’s the term they use in the psychology field. So right there, you’re putting a chill on free speech. So we brought a case under the First Amendment, and we won at the district court level before a Democratic-appointed judge. It is now before the Ninth Circuit, and we are not alone in this.

Jamali: Well, yeah, you’re not. Megan, tell me about EPIC’s approach. Your group has a different way of interpreting the passage of this law, which is now in the appeals process.

Iorio: Right. So a number of states have passed laws that require social media companies to verify the age of their users. The California law is very different from those laws. First of all, it doesn’t require any kind of age verification. Essentially, companies just need to use the data they already collect, and already use to estimate users’ ages for ad delivery, to also give those users heightened privacy protections. That’s not a big ask. That is not unconstitutional.

Jamali: All right. And Carl, where do you see things moving next with the California Age-Appropriate Design Code Act?

Szabo: So right now, it’s before the Ninth Circuit, and they’re going to look at it. Based on 200 years of Supreme Court precedent, the case over video game violence that the state of California lost about 20 years ago, and the fact that you have all of these groups who are typically not friends with the tech industry, like the news industry, siding with the tech industry against it, I think it’s pretty clear how the court is going to rule: that the law is a violation of the First Amendment. And that gets into the fact that when it comes to banning or controlling speech, the government has to be crystal clear, because if the government is going to forbid us from speaking, we need to know what we can and cannot say. And even then, the law may still be unconstitutional. But unfortunately, the law in California doesn’t even achieve the specificity needed to begin banning speech online. And that’s essentially what this is all about.

Jamali: Well, let’s move on to the state of Utah, which was also taken to court by your group, Carl, NetChoice, to halt regulation there that would require social media companies to verify users’ ages and get parental consent for minors. Megan, what would you argue are some privacy concerns with this state law?

Iorio: So all of the state laws that require age verification are definitely problematic. And that’s because there is no privacy-protective way to do age verification at this point. And so EPIC does not support these laws.

Jamali: And this is an area where, Carl, you and your organization might agree, it seems?

Szabo: Oh, absolutely. I mean, in order to verify you’re over 18, it requires massive amounts of data collection.

Jamali: So I guess the question, and I’ll start with you on this, Carl, is what should we do?

Szabo: I mean, our parents had similar problems with video games and cable TV. And what we need to move away from is this attitude that “parents can’t.” Telling parents they can’t do something makes it harder for them to believe in themselves and empower themselves to do the right thing. So —

Jamali: Carl, let me interject because parents will say that Silicon Valley companies that you represent are putting the onus on parents.

Szabo: Absolutely. And it’s a parent’s responsibility to decide what is best for their kids and their family. At the same time, more states should follow in the footsteps of Virginia, [which] has enacted a law requiring digital education as part of the school curriculum.

Jamali: Yeah, Carl, I do want to return to this issue of the role that parents play. Certainly, parents have some responsibility for their kids; I think we can all agree on that. But people like Sen. Amy Klobuchar, the Democrat from Minnesota who sits on the Senate Judiciary Committee and who grilled a number of tech CEOs about two months ago on the issue of kids’ safety online, will say that the Silicon Valley companies you represent are trying to wriggle out of any responsibility by placing it, in large part, on parents. Let me play a clip from a recent interview we did with the senator.

Amy Klobuchar: And that is everything from hooking kids on these products to targeting the kids with algorithms to exposing them to content that is damaging and that may actually result in them either taking their own lives or getting a drug overdose. I think those are pretty straightforward desires from the parents in this country.

Jamali: Can I just get your response to that?

Szabo: Yeah, those stories are absolutely horrible and depressing. And that’s why NetChoice ends up bringing these lawsuits, because an unconstitutional law will protect absolutely zero children. And that’s why we’re constantly working with state lawmakers, federal lawmakers and parents to find approaches that can work and do work.

Jamali: Thoughts from you, Megan?

Iorio: Yes. So NetChoice exists to fight for its Big Tech members. And it does that by pushing for very broad, sweeping immunity from regulation. And that’s what you’re hearing here. What Carl is saying is, look, my members are not responsible for doing anything to address these harms; it’s all on parents. Parents, in addition to their day jobs and to taking care of their kids when they get home from work, have to research how these platforms work, even though the companies don’t disclose exactly how they work, and figure out how to prevent their kids from encountering the harms. But that’s not how things should work. The government should be able to pass regulations to prevent harms to kids. NetChoice is out there arguing that the First Amendment pretty much makes its members immune from many of these regulations, from privacy regulations in particular. In the NetChoice v. [California Attorney General Rob] Bonta briefing, NetChoice is very clear that it doesn’t think any privacy law that actually does something to protect users from privacy harms, one with data minimization requirements and use restrictions, would pass constitutional muster. And that is disturbing. They also think that any regulation of design-mediated harms, like addictive algorithms and dark patterns, would be unconstitutional. So it really is a push to ensure that the internet remains this lawless place where kids and parents are just scrambling to figure out what to do, while tying the government’s hands from doing anything to protect Americans.

Mark Zuckerberg (left), CEO of Meta, speaks to victims and their family members during a Senate Judiciary Committee hearing on Jan. 31. The committee heard testimony from the heads of the largest tech firms on the dangers of child sexual exploitation on social media. (Anna Moneymaker/Getty Images)

Jamali: Is there a future in which NetChoice and EPIC, your two organizations coming at this from very different places, in which you work together on these issues? Is that on the table at all?

Szabo: I mean, I absolutely would love to work with anyone and everyone who wants to make it easier for parents, to make it easier for teens to get access to the information that they want, and to empower parents to make the decisions that are best for their families and their children. And one of the things that was alluded to is creating a comprehensive privacy law that applies to the entire nation. Because today, when you travel from, let’s say, Connecticut to New York to New Jersey, you’re going to be subject to three different privacy regimes. And that seems kind of absurd given the way Americans move. So there’s a lot there. The last thing I would note is there’s a piece of legislation out there right now that doesn’t get enough attention. It’s called the Invest in Child Safety Act, from Sen. Ron Wyden [Democrat from Oregon]. What it does is provide law enforcement the tools and resources it needs to actually lock up child predators who engage in really bad actions online. So that’s a quick, easy solution to addressing one of the most egregious actions we see online, and it’s a way to enforce the laws online as they apply offline.

Jamali: And Megan, is there anything you want to say?

Iorio: Look, there are a lot more harms happening to kids than [child sexual abuse material]. EPIC supports strong privacy and consumer protection laws; NetChoice so far has not. And I think it’s pretty clear from what Carl said, but I just want to unpack it a little bit: his solution is education for kids, education for parents, and no obligations for platforms. That just won’t do it.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer