Facebook’s oversight board is finally here. How will it affect content moderation?
Facebook now has an oversight board with final say over what content stays on the social media platform and what is removed. Its 20 members will serve as a high court of sorts, weighing tricky questions of free speech versus harmful content.
Facebook committed to creating an oversight board two years ago, a day after The New York Times published an investigative report about Facebook’s handling of high-profile incidents such as Russian interference in the 2016 presidential race.
The company has been criticized for past content moderation decisions and has denied accusations that it disproportionately removes conservative-leaning posts.
The oversight board is starting with 20 members and will eventually grow to 40. Members include a former prime minister of Denmark, a former human rights judge and a Nobel Peace Prize winner.
The group will decide some of the thorniest issues surrounding content moderation on Facebook and its sibling site Instagram, such as hate speech, harassment, safety and privacy.
The board’s four co-chairs say their decisions will be final and binding. It’s funded through a $130 million trust set up by Facebook and is supposed to operate independently.