Are platforms liable for user content? Supreme Court may reset the rules.
Oct 7, 2022

SCOTUS is considering two cases tied to Section 230 of the Communications Decency Act, a law that has shaped how online platforms curate their content.

Much of the social media ecosystem — love it or hate it — has been made possible by a federal law from 1996.

It’s called the Communications Decency Act. Section 230 of that law shields online publishers like Facebook, Twitter and YouTube from liability for much of the content posted on their platforms.

This week, the Supreme Court announced it will hear challenges to that law. One of the cases — Reynaldo Gonzalez v. Google LLC — questions whether Section 230 protects platforms that use algorithms to recommend content to users.

Marketplace’s Meghan McCarty Carino spoke with Eric Goldman, a law professor at Santa Clara University School of Law. He said there are a few ways the decision could go.

Below is an edited transcript of their conversation.

Eric Goldman: “Algorithmic recommendations” is a fancy way of saying that the services are prioritizing and curating content for their audience. The way the plaintiffs have framed the Gonzalez v. Google case invites the court to do one of three things. It could say that algorithmic recommendations are just like the other kinds of services that social media providers or other publishers offer, and therefore they’re covered by Section 230. It could say that none of that is covered by Section 230. Or, the third possibility, it could say that some of the things social media services do qualify for Section 230, but algorithmic recommendations don’t.

Meghan McCarty Carino: And what is at stake here? How could a ruling change how we interact with the Internet?

Goldman: One possibility is that the Supreme Court will say any recommendations made by social media services fall outside Section 230. In that case, services could host content without fear of liability for what their users are saying, but they couldn’t promote or curate that content. The internet would look a lot like Google Drive or Dropbox at that point. Someone who wants to share content online could upload it, and then the service’s role would stop. At that point, individuals would be responsible for trying to find their own audience.

McCarty Carino: And what could that mean for the business of tech companies? I mean, so many have really relied on algorithmic recommendations for their business model.

Goldman: If the Supreme Court does circumscribe Section 230, I think what’s likely to happen is that some of these businesses will migrate toward more professionally produced content. Instead of allowing users to talk to each other, the services will pick a small number of voices, pay them to create content and then share that content with their audience. The most likely way we’ll pay for it is with paywalls, and that will exacerbate the digital divides that already exist.

Eric Goldman follows the hundreds of cases that involve Section 230 in some way, and you can read his thoughts on several of those cases on his blog.

Now, I said the Supreme Court is considering multiple cases tied to Section 230. Another case — Twitter v. Taamneh — questions whether online platforms should be held responsible for failing to detect and prevent terrorist groups from using their services.

Twitter is appealing a decision from the 9th U.S. Circuit Court of Appeals that said it could be held liable.

Professor Goldman pointed out that Section 230’s protections already carve out federal criminal law, which presumably could include terrorism offenses.

