How two cases headed to the Supreme Court could change the internet
Jan 27, 2023

SCOTUS may hear two cases on social media regulation and free speech as early as next year.

Earlier this week, the Supreme Court announced it was putting off deciding if it will hear a pair of highly anticipated cases that could fundamentally change social media as we know it.

The cases concern laws in Florida and Texas, pushed by conservatives in those states, that basically make it illegal for social media platforms to block or hide content — say from a former president — even if the post violates the companies’ terms of service.

Both laws have been blocked from taking effect while the rest of the country waits for the high court to weigh in.

Marketplace’s Meghan McCarty Carino spoke with Daphne Keller, director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center, about why these cases could be consequential.

The following is an edited transcript of their conversation.

Daphne Keller: The Texas and Florida laws both have what are known as “must carry” provisions. They are built on the idea that there are certain kinds of content that platforms may not take down, even if their terms of service prohibit it or even if it is, say, hate speech or horrible things that are legal but nobody wants to see online. Both laws also have transparency mandates: platforms must publish more detailed information about what their speech rules are, what decisions they’ve made moderating content and how much content has been taken down. So there’s this interesting secondary issue about when the government can compel platforms to disclose that really interesting and important information.

Meghan McCarty Carino: So who has been fighting against this, and what are their arguments?

Keller: Well, the platforms themselves, arguing that the laws violate the platforms’ own First Amendment rights, the rights to set editorial policies and determine what speech they do and do not want to carry. A number of independent amicus briefs have come in on both sides, supporting the platforms and supporting the states. But really the key argument that would go to the Supreme Court is this First Amendment question about whether platforms have an editorial right to moderate content that makes it impossible for Texas and Florida to enact these mandates.

McCarty Carino: So what could change if SCOTUS does consider and rule on these cases?

Keller: Well, anything could happen. I mean, it’s pretty remarkable that we have not had cases on these questions in the 25 or so years since the big laws governing internet platforms were enacted in the 1990s. In the U.S., we have a Supreme Court most of whose members have never encountered any version of these questions, with a few exceptions, and who don’t have much background about what the issues are, what models have been tried before, and whether and why they failed. So we don’t have a record from which to guess the predilections of individual justices on most of these things, and we also have reason to expect that they won’t have deeply informed opinions in the way that they do about other kinds of topics that often reach them. And that, I think, sets us up for extremely unpredictable outcomes.

McCarty Carino: You noted that there are questions about whether states like Florida and Texas have the power to create regulations like this, but nevertheless, states are taking regulatory approaches to tech companies. Without a ruling from SCOTUS on this issue anytime soon, what do you think that means for this landscape as we get closer to a big election year?

Keller: So I think if we really did have a world where states can make laws like this, we would have a profoundly different internet, because platforms would have to apply different speech rules in different states. And if the plaintiffs win in Gonzalez v. Google, the case about liability for terrorist content, they will create a situation where every state sets different speech rules for all of the content you see in your news feed or in other ranked content online. That’s difficult for platforms to enforce, because geotargeting each of the 50 states perfectly is hard; they can do it better than they used to, but the technology still isn’t perfect. More importantly, their incentive would not be to try to figure out detailed state-by-state variations in speech laws; their incentive would be to figure out the lowest common denominator for what constitutes defamation, for example, and just apply that rule across the board to simplify things.

McCarty Carino: What’s the likelihood that they will hear these cases maybe next year during the middle of an election?

Keller: I think there is a very high likelihood that they will hear these cases next year. It was already high because of the importance of the issues and the fact that two federal circuit courts disagreed profoundly on questions about whether states could compel platforms to carry content. The most likely timeline would give us a ruling from the Supreme Court sometime between January and June of 2024, right at the beginning of an election year. And the later they issue the opinion, the farther into that election year we will be. It’s hard to predict what that means, in part because it’s hard to predict what they’re going to say.

McCarty Carino: Right. But I could imagine the decision itself becoming kind of a political football, and its effects having a strong impact on elections, as social media has tended to do.

Keller: Absolutely. If the court said Texas and Florida can enforce these laws, that they can compel platforms to change their content moderation policies, one scenario is that the platforms just give those states what they asked for and open the floodgates for a tidal wave of barely legal garbage to hit Florida users and Texas users. If that happened, on the one hand, it would be a very good educational moment about what it is these lawmakers were actually seeking to compel. On the other hand, I’m sure it would make people mad at the platforms in a new and different way. And the people who suffer because of that bad content, the people in Texas and Florida who are the victims of harassment or hate speech, would suffer much more in that scenario, and that’s both bad in itself and potentially something with political consequences. Also, if Texas and Florida win, then the floodgates are much more open to electoral disinformation remaining on the platforms, and the platforms not having much of a choice about whether to take it down. So that certainly is relevant for the election.

The Supreme Court will hear two other cases in February that could have serious implications for the internet.

One of those, Gonzalez v. Google, specifically addresses algorithms that recommend content on sites like YouTube. It challenges whether those recommendations should be shielded by Section 230 of the Communications Decency Act.

That’s the statute from the 1990s that paved the way for our modern internet ecosystem by establishing that these sites are different from publishers and aren’t liable for the content users post on their platforms.

We spoke to Santa Clara University law professor Eric Goldman about some of the potential outcomes of a Supreme Court decision in that case. He said that even a narrow decision, one that keeps social media broadly shielded by Section 230 but excludes algorithmic recommendations from that protection, could lead platforms to pull back from curating content as they do now.

The other case, Twitter v. Taamneh, is focused on whether platforms can be held responsible for not detecting and preventing terrorist groups from using their services.

Daphne Keller recently spoke to The New Yorker about the implications of those two cases and pointed out that these recommendations aren’t just about your next favorite beauty vlogger. A recommended post could be one person’s account of sexual harassment or a video of a police shooting that goes viral and sparks a global social movement.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer