As you’ve probably heard, Facebook has an oversight board. Last week, it upheld the company’s ban of former President Donald Trump, with caveats. The board, created by Facebook and paid for by Facebook, has received such attention and been treated so seriously that it made me wonder: Is this something other tech companies are going to try?
I spoke with Marietje Schaake, president of the CyberPeace Institute, who helped found a group of experts that calls itself the Real Facebook Oversight Board. She says the actual board misses the big picture. The following is an edited transcript of our conversation.
Marietje Schaake: There’s global focus on what this small group of people has to say about Donald Trump’s account, while the harms from Facebook come from its business model — from a number of aspects of the business model that the Oversight Board has no authority to talk about. So algorithmic amplification, groups, advertisements, they’re all outside of the remit of the Facebook Oversight Board. And so it doesn’t actually address the problem. It distracts from a discussion about what is needed to address the problem.
Molly Wood: But I do wonder, do you think other companies might embrace this model? Is it either a nice buffer or something that could potentially deal with such a big, thorny issue?
Schaake: Well, it’s a trend indeed, that we see, especially Big Tech companies embracing. For example, Microsoft has opened a mission to the United Nations, a sort of embassy that countries normally open to engage in diplomatic relations. So I think, indeed, these Big Tech companies are very happy to set up their own types of governance models to either pretend that there’s oversight or that there are rules being followed. But in a democracy, that’s not a desirable phenomenon. You really want publicly legitimate and accountable rules of the road.
Wood: It sounds like you’re saying there’s no need for a board at all. There’s just a need for rules that everybody can review and understand.
Schaake: Well, I mean, you could always invite some extra eyes to critically look at your business model. And I think one question is whether Facebook is actually going to act on the recommendations its Oversight Board has made. For example, the members of the Oversight Board asked a number of questions, several of which Facebook did not answer. So I don’t even know if Facebook, the company, really knows what it wants from the Oversight Board. In any case, it’s never a full solution. I do believe it helps to have multiple eyes on difficult processes. But we shouldn’t pretend that this is an actual court, let alone a Supreme Court, as Mark Zuckerberg once called it. And I worry when I hear people giving this self-regulatory process so much weight.
Wood: I mean, I feel like that is sort of the phenomenon that has to be discussed — that companies might look at this and look at this outcome and look at possibly the distance that was created between Facebook and its own decision and say, “This seems great. We want to do the same thing.”
Schaake: Absolutely, because it’s a great deflector. So if a board upholds or supports a decision, the company can pretend that it’s a more legitimate process. If it revokes or challenges the decision, the company can say, “Well, we’ve been scrutinized,” even if this is still all a self-organized process that I’m sure Facebook would not have engaged in if it didn’t see benefits, PR or otherwise.
Wood: Could there be a future in which oversight bodies like this exist, but are themselves regulated and have standards and are a little bit more binding or real?
Schaake: Well, I would say for that, we would need to look at the regulatory bodies that already exist: a [Federal Trade Commission], a [Federal Communications Commission]. Hopefully, a data-protection regulator in the United States with a federal data-protection law that it would oversee. The threshold should be the law, which comes about, or is adjusted or updated if necessary, through a democratic, legitimate process and has truly independent oversight. So I think even formalizing these self-regulatory processes distracts from the need to bring Big Tech, small tech, social media and other companies under the wings of democratic norms and rule-of-law-based processes. I think this is normal; we see it in a number of other industries. There is no reason why tech should be exempt. These companies deal with very sensitive aspects of our lives: data, everything we see, the manipulation of our behavior, children’s sensitivities, mental health, public health when we look at information about COVID-19. They are really at the heart of our information architecture, and information technology is power. It’s important that that power is balanced with oversight, with counter-power.
Wood: I think I hear you saying that this board wouldn’t need to exist if we have these regulatory agencies, but what they essentially do is ask companies to self-police. So if we actually had regulation in place to require transparency about this decision-making, to deal with algorithmic amplification, there wouldn’t be any need to create this kind of quasi-fictional ombuds board.
Schaake: Yes, I agree. And I think it might also help for Americans to sometimes look outside of their borders. I’m from the Netherlands, from the [European Union], myself. And there are a number of initiatives here to have [artificial intelligence] rules, rules around digital services in terms of their responsibility for the content on their platforms, rules on their roles as gatekeepers. Can Apple decide who it wants or doesn’t want in its store? Can it charge certain services more than others? A lot of work is being done; it’s just that the United States has been libertarian, hands-off for a long time, trusting that the interests of the American people and the American state were aligned with the interests of Big Tech. And I think, unfortunately, there have been some hard lessons learned over the past year. There was an online-organized storming of the U.S. Capitol, a direct attack on democracy. There have been countless lies shared about the pandemic and the vaccines. The former U.S. president challenged the elections he himself ran in, and while no formal institution agrees with his allegations, these are very, very serious tendencies that all play out with online megaphones. And it’s really time, I think, for America also to stand for what is precious: everybody’s democratic rights, public debate, a balancing of private power and private governance. And it can take inspiration both from what can go wrong in the rest of the world and from what might be done well when it comes to putting rules in place.
Related links: More insight from Molly Wood
The actual Facebook Oversight Board told us in a statement that it wasn’t designed to solve all of Facebook’s problems, but that it would not shy away from holding the company accountable. The Real Facebook Oversight Board, in its own statement, said the actual board’s ruling proved its pointlessness.
The board’s ultimate decision was that Facebook was right to ban Trump for violating its terms of service repeatedly and inciting real-world violence, but that Facebook needs to make clearer rules and also figure out for itself whether the ban is permanent. For what it’s worth, though, Evelyn Douek has a good piece in The Atlantic saying this board is a totally made-up thing and its decision sort of puts us right back where we started. But on the other hand, at least it’s not only Mark Zuckerberg making the call. As the piece puts it, the Oversight Board, while flawed, fills “an enormous legal void.” That’s also the point Marietje is making. But the truth is that coming up with the legal framework might even be more challenging than asking this weird board to embrace its role and, like it told us in its statement, use its position from the inside to push the company even harder.
Douek writes that the board is “the worst option, except for all the rest.” The counterpoint, I think, is the one I made last week and the one Marietje makes at the beginning of our interview: Facebook’s role here. This isn’t actually about speech. It’s about what happens when the speech gets airborne. Simply put: It takes two to tango. If people, including Trump, come along and post incitements to violence, lies or deliberate disinformation designed to divide, and that content is engaging and people click on it and comment on it, then Facebook’s algorithm, about which it will not answer questions even to its own so-called Supreme Court, amplifies that content. So again, it’s possible the board is a solution to a problem that’s not actually the real problem.