Parler lets anything happen on its platform — what if nobody else cares?
Nov 20, 2020

After the presidential election, Parler saw a spike in users.

The social media site Parler doesn’t fact-check, doesn’t moderate and doesn’t label or remove misinformation. Conservatives and the far right love it, and disinformation researchers are worried. But there is one other interesting element to Parler: There’s no algorithm that amplifies stories, like the kind that tends to make disinformation go viral on YouTube or Facebook. So, could that lessen disinformation’s impact?

It’s a question for “Quality Assurance,” where I take a second look at a big tech story. I spoke with Shannon McGregor, a professor studying social media at the University of North Carolina, Chapel Hill. She’s not totally buying my premise. The following is an edited transcript of our conversation.

Shannon McGregor, professor and senior researcher at the Center for Information, Technology, and Public Life at the University of North Carolina, Chapel Hill. (Photo courtesy of Steven Vargo)

Shannon McGregor: Disinformation that comes especially from very popular social media users like, say, President Trump gets amplified, not only out to his millions of followers but through the algorithms on most social media platforms because it gets engagement. So in theory, if there’s no amplification, then maybe there’s not a problem. Except if you allow anything on your website. And so I think there’s a toxic mix between the idea of having no moderation of the content at all, and then no amplification, because then there’s just going to be a lot of bad stuff. And it might not be amplified, but there’s no incentive for something to be good quality information or engaging information.

Molly Wood: What if the disinformation, though, doesn’t go anywhere? I mean, we know that Facebook and YouTube and occasionally Twitter will push information on you that’s extreme. And so if you don’t have those algorithms, and you just have a little sandbox of toxicity, but it doesn’t actually make it out into the larger world, how much harm can it do?

McGregor: I just think it would still make it out into the larger world. What we’ve seen with disinformation is that it starts in a place like Parler, Gab or 8chan, and then a slightly less wild, slightly more acceptable version of it might show up on a place like Reddit or on Instagram. And from there, it makes its way onto the super-mainstream Facebook or Twitter, and then it ends up in some version in the news or on the cable broadcast that night.

Wood: I also want to ask you about this idea of light or no moderation, because that is the promise now, and no platform has ever been able to pull that off. How long can Parler really be unmoderated?

McGregor: Probably not that long. And I think we’re most concerned about it from the political standpoint in this transition period [when] we have an outgoing and incoming president and people are feeling really uncertain. Conservatives, obviously, are flocking to an app like Parler, where they’re not going to feel that they’re being moderated — they can sort of say whatever they want. But I don’t think Parler is an app that’s going to become the new Facebook or anything. It’s going to be a really toxic place. And these platforms or these forums that aren’t moderated at all have a tendency to sort of snowball into that extremity.

Wood: Right. But it sounds like what you’re saying is, that’s bad for business. 

McGregor: Yeah, because most people don’t want that. There’s certainly a subset of people who are into that, who have these really radical views, but I think the latest numbers are 10 million downloads in the last couple of weeks. I don’t think that we have 10 million violent, radical conservatives. I think a lot of the people downloading it are curious and, frankly, are not likely to stay on the app, whether because they’re turned off by the extreme behavior or because, with no amplification and no sorting, there’s also a lot of grift if you spend any time on it. There’s a lot of selling herbal supplements and things like that. And no one likes that. Algorithmic news feeds get rid of that stuff.

Wood: So should we just not worry that much about Parler?

McGregor: I think we should worry about it right now. Like I said, I think we’re in this uncertain political moment in our country. And I think that unmoderated misinformation that we know is stoked by violent and hateful groups, that’s worrisome. But I don’t think it’s Parler that’s the problem, it’s our politics right now that’s the problem. And so in this moment, it’s being reflected by conservatives flocking to an app where there’s a promise of no moderation, because they feel like they’re being moderated in a biased way on other platforms. That’s just more reflective, I think, of our political moment than of some new frontier of zero-moderation social media in our future.

Tweets from President Donald Trump are masked with warnings imposed by Twitter stating that they may be incorrect, as vote counting continues to determine the winner of the presidential election on Nov. 5, 2020. (Robyn Beck/AFP via Getty Images)

Related links: More insight from Molly Wood

There is more reading on this topic where both of these ideas are represented: one, that Parler is right now a haven for hate groups and extremists, which is potentially dangerous, as it has always been; and two, that being a haven for hate groups and extremists is kind of bad business, as it has always been. But I think there is an ongoing conversation to be had about what social media should look like going forward, because even while Twitter and Facebook are getting more aggressive about labeling or taking down content, YouTube and Facebook are still dumping extremist links and videos into your news feed, whether you want them or not. So that’s not working to contain the kind of mind virus that is disinformation and hate, either. Building immunity to that is going to be a long road. Maybe just spend the weekend not on social media at all.

