Facebook’s oversight board is finally here. How will it affect content moderation?
Facebook now has an oversight board that will have final say over what content stays or goes on the social media platform. Its 20 members will serve as a high court of sorts to weigh tricky decisions of free speech versus harmful content.
Facebook committed to creating an oversight board two years ago, a day after The New York Times published an investigative report about Facebook’s handling of high-profile incidents such as Russian interference in the 2016 presidential race.
The company has been criticized for past content moderation decisions, and has denied accusations that it more heavily targets conservative-leaning posts for removal.
The board is starting with 20 members and will eventually grow to 40. Members include a former prime minister of Denmark, a former human rights judge and a Nobel Peace Prize winner.
The group will decide some of the thorniest issues surrounding content moderation on Facebook and its sibling platform Instagram, such as hate speech, harassment, safety and privacy.
The board’s four co-chairs say their decisions will be final and binding. It’s funded through a $130 million trust set up by Facebook and is supposed to operate independently.