Could a digital New Deal rewrite tech policy?
Sep 10, 2020


One proposal would require social media platforms to check content as it begins to go viral to see whether it's spreading disinformation.

At this point, consumers, tech employees, even the CEOs of some big tech companies say there should be more regulation around online privacy, advertising and even disinformation. But what might that regulation look like?

The German Marshall Fund think tank is pushing for an initiative called the Digital New Deal. It contains a bunch of policy proposals and would ideally create more transparency into how tech companies operate and question the incentives that push disinformation. Karen Kornbluh, the director of the Digital Innovation and Democracy Initiative at the German Marshall Fund, spoke to me about it. The following is an edited transcript of our conversation.


Karen Kornbluh: One of the specific proposals we have … is a kind of circuit breaker, like the ones they have for high-speed trading on Wall Street: when things get too heated and are spreading too quickly, the platforms have to take a pause and take a look and see if it violates their terms of service. [Facebook] didn’t do this with a [Breitbart] video that was spreading conspiracy theories about how dangerous masks are and who’s spreading COVID, [depicting people calling themselves] America’s Frontline Doctors. It racked up 20 million views on Facebook before they realized they had to shut it down.
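
To make the circuit-breaker analogy concrete, here is a minimal sketch of how such a mechanism might look in code. Everything in it (the share-rate threshold, the sliding window, and the function and field names) is a hypothetical illustration, not part of the German Marshall Fund proposal or any platform's actual system.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Deque, List, Optional
import time

# Hypothetical numbers, chosen only for illustration.
SHARE_RATE_LIMIT = 5_000   # shares per sliding window that count as "going viral"
WINDOW_SECONDS = 3_600     # length of the sliding window, in seconds


@dataclass
class Post:
    post_id: str
    share_times: Deque[float] = field(default_factory=deque)  # timestamps of recent shares
    paused: bool = False  # True once amplification is halted pending review


def record_share(post: Post, review_queue: List[str], now: Optional[float] = None) -> None:
    """Count one share of a post; if the share rate trips the breaker,
    pause further amplification and queue the post for terms-of-service review."""
    now = time.time() if now is None else now
    post.share_times.append(now)

    # Discard shares that have fallen outside the sliding window.
    while post.share_times and now - post.share_times[0] > WINDOW_SECONDS:
        post.share_times.popleft()

    # The "circuit breaker": too many shares too fast means pause and review.
    if not post.paused and len(post.share_times) > SHARE_RATE_LIMIT:
        post.paused = True
        review_queue.append(post.post_id)
```

In practice a platform would weigh engagement signals far richer than raw share counts, but the shape of the idea is the same: measure velocity, trip a breaker, route the content to review before amplifying it further.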

Molly Wood: That’s so interesting, because so much of this has been built on the idea that virality is good, that something going viral is good. And so you’re saying that these platforms should start to change their thinking so that when something’s going viral, they are alarmed?

Kornbluh: I think we’ve all taken another look at that word viral lately. Viral can be bad, but viral can be good for platforms, because it means that people are excited and they’re staying online. And that means that the platforms can show you ads, and that’s how they make money. So their incentives differ from ours: we want to critically examine that piece of content that’s coming across our desk before we share it with our unwitting grandmother, who may take action based on it.

Wood: What is the reception to the idea of this level of regulation? I think it’s clear, even from Mark Zuckerberg’s interview just this week, that [the companies] don’t agree.

Kornbluh: You know, he talked about anti-vax and that he doesn’t want to suppress speech about vaccine skepticism. But if you look at social media messages urging Americans to reject vaccines, they’ve tripled just since the pandemic began. And again, he’s focused on the content and not enough on the systems that create the opportunity for conspiracy theorists to play the algorithm or for a bunch of groups to promote anti-vaccine content. And that’s because he doesn’t have the incentive. Car companies may have resisted seat belts or air bags or fuel efficiency, but once the policies were put in place, they turned on the innovation and figured out how they could still make money but also keep people safe.

Wood: A lot of the time, the pushback argument against regulating social media platforms is free speech. But it’s also based on this idea that disinformation is as old as time: “People gonna people! That’s what people do, they argue or they try to convince each other in one direction or the other.” But it seems like, increasingly, we’re realizing that not only do the platforms have an incentive to serve this content, they also exist as an incentive to create it. There’s money to be made; influencers are selling merch.

Kornbluh: That’s absolutely right, and a lot of people miss that. If I have an outlet — we call them Trojan Horse outlets that pretend to be news outlets but are just repackaging old rumors and making them look like news, or if you’re one of these carnival barker pages that tries to get people’s attention on those outlets — if you get enough eyeballs, the platforms serve up ads and you get some of that money. If you’re a YouTube channel that spews out a lot of disinformation and so people come to you, you can wind up with revenue sharing from the platform. So absolutely. And then there are a bunch of people that flat out sell fraudulent products as part of this disinformation scheme. So they may say, “Here’s something that works better than a mask to cure COVID, and buy it here. Here’s something that will get you rich quickly.” So there is an entire financial ecosystem that’s supporting this disinformation.

Wood: In a way, it’s not just that the platforms amplify and themselves profit from disinformation; they create an incentive cycle and a financial encouragement for people to create and spread it.

Kornbluh: The one other thing that I would add to that is the Federal Trade Commission has started to do work in this area, but it could use a lot more expertise, a lot more authority and a lot more resources to go after this kind of activity. And the other thing that I would say, to add to your question about “aren’t people just going to spread rumors?”, is that one of the things that makes the internet more effective at spreading rumors is this information laundering. People don’t know where the information is coming from. It looks like it’s coming from a reputable news organization, or it looks like it’s coming from a neighbor, because people aren’t aware and the platforms aren’t transparent enough. So a big part of this could be handled with transparency, which is very free-speech friendly.

Related links: More insight from Molly Wood

Fortune has a story from last week about how the majority of tech employees and would-be tech founders believe there should be more regulation in the industry.

But big tech companies are already fighting against a bill introduced Tuesday, with only GOP sponsors, that would undo some of the liability protections granted by Section 230. The bill is based on the premise that social media platforms are unfairly censoring conservative content, and it would force platforms to be more specific about the types of content they remove. Content couldn’t be taken down just for being “objectionable,” for example; it would have to be unlawful or “promoting terrorism.”

No Democrats co-sponsored the bill, and let’s be honest, as written, it seems like it would create a lot of room to spread misinformation as long as that misinformation could be characterized as what legislators described as “valid political speech.” So that one is probably not going anywhere.

One more reminder, by the way, that social media companies are certainly not censoring conservative speech. Not only is there no proof of that censorship, but you can go follow Facebook’s Top 10 on Twitter, an automated list of the top-performing links from U.S. Facebook pages every single day. You’ll see that conservative news, with an occasional splash of K-pop, is doing just fine.

But in news of some actual regulation, The Wall Street Journal reports that European Union privacy regulators are preparing to order Facebook to stop sending any data about people in the EU back to the United States. That would be a big deal. It stems from a ruling in July in which regulators found that European users have no way to challenge American government surveillance, and therefore their data just isn’t safe with us.


The team

Molly Wood Host
Michael Lipkin Senior Producer
Stephanie Hughes Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer