Social networking sites like Facebook and YouTube have a problem. In a way, it's a good one to have.
The grandma problem.
As social networking has gone mainstream – in other words, "even Grandma is on Facebook" – the seedier side of the web has become a bigger and bigger problem. Say Grandma logs in to check out new family photos or videos, only to be bombarded with everything from violent car crashes to the most vile kinds of pornography. Not a good user retention strategy.
Enter the content moderator. She makes sure the really icky stuff the Internet has to offer doesn't show up next to photos of the grandkids. She is part of a massive workforce, one that an expert estimates at more than 100,000 people around the world – or 14 times the size of Facebook's own staff.
Adrian Chen wrote about content moderators for the November issue of Wired. His reporting took him to the Philippines, where outsourcing firms pay content moderators as little as $300 per month.
"What the companies told me was that people in the Philippines, because of the cultural connection to the U.S., were better-equipped to screen content for American and Western audiences," Chen said.
But no content moderator is well-equipped for the volume of vile content that the screening process entails.
"People get a darker view of humanity," Chen said, adding, "seeing all this abnormal stuff all day gives you a twisted view of what's really going on out there."
The full article, including accounts of some of the terrifying content that moderators see, is at Wired.com.