Facebook’s oversight board is finally here. How will it affect content moderation?
Facebook now has an oversight board that will have final say over what content stays or goes on the social media platform. Its 20 members will serve as a high court of sorts to weigh tricky decisions of free speech versus harmful content.
Facebook committed to creating an oversight board two years ago, a day after The New York Times published an investigative report about Facebook’s handling of high-profile incidents such as Russian interference in the 2016 presidential race.
The company has been criticized for past content moderation decisions, and has denied accusations that it more heavily targets conservative-leaning posts for removal.
The oversight board is starting with 20 members and will grow to 40.
Members include a former prime minister of Denmark, a former human rights judge and a Nobel Peace Prize winner.
The group will decide some of the thorniest content moderation issues on Facebook and its sibling platform Instagram, including cases involving hate speech, harassment, safety and privacy.
The board’s four co-chairs say their decisions will be final and binding. It’s funded through a $130 million trust set up by Facebook and is supposed to operate independently.