A few months ago, media critic Soraya Chemaly noticed some pretty gross stuff on people’s Facebook pages.
“The memes, the pictures — they ran the gamut from very strange humor about rape or domestic violence to very graphic images of women brutalized, bleeding,” she recalls.
Facebook didn’t put this stuff up — users did. But Facebook moderators weren’t taking it down. When Chemaly and others alerted advertisers, including Nationwide Construction and Nissan, the companies responded, in effect: “Until this is resolved, we will not spend our money here.”
But how do ads and offensive content get paired up in the first place?
“It’s a mistake,” says Joseph Turow, a professor at the University of Pennsylvania’s Annenberg School for Communication.
Companies use algorithms that categorize websites in order to target people, he says. Those algorithms can mess up: the “formulas can have glitches,” or certain keywords might cause a photo or website to get miscategorized. Voilà: an ad ends up next to something awful, says Turow.
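To make the keyword glitch concrete, here is a minimal sketch of keyword-based page categorization; the category names, keyword lists, and `categorize` function are invented for illustration, not any real ad platform's system.

```python
# Hypothetical keyword-based page categorizer. The keyword lists and
# category names are invented for illustration.
CATEGORY_KEYWORDS = {
    "baking": {"muffin", "flour", "oven", "recipe"},
    "automotive": {"engine", "sedan", "mileage"},
}

def categorize(page_text: str) -> str:
    """Pick the category whose keywords appear most often in the page."""
    words = set(page_text.lower().split())
    scores = {
        category: len(keywords & words)
        for category, keywords in CATEGORY_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    # If no keywords match, the page is effectively uncategorized,
    # and a naive system may still serve ads against it.
    return best if scores[best] > 0 else "uncategorized"

# A page about a fatal engine fire gets filed under "automotive" because
# the classifier sees only keywords, not context, so a car ad can land
# next to a story an advertiser would never choose to sponsor.
print(categorize("graphic photos after the sedan engine fire"))  # automotive
```

A system this naive can't tell a car review from a car crash, which is exactly the sort of miscategorization Turow describes.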
Algorithms can also miss content altogether, especially on Facebook, where they rely on what you like and what you click on, not on what photos you happen to see while scrolling. You might see a horrific post in your news feed while Facebook places an advertisement for muffins beside it, because it knows from your status updates that you like baking.
The content didn’t trigger the ad; your interests did. The advertiser — and you — were the collateral damage of the offensive posting.
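The interest-based mechanism above can be sketched in a few lines; the ad inventory, profile interests, and `pick_ad` function here are hypothetical stand-ins, not how Facebook's real system works.

```python
# Hypothetical interest-based ad selection: the ad is chosen from the
# user's profile, and the content on screen is never consulted.
AD_INVENTORY = {
    "baking": "Fresh muffins, delivered daily",
    "running": "Lightweight trail shoes",
}

def pick_ad(user_interests: list[str], page_content: str) -> str:
    """Select an ad purely from user interests; page_content is ignored."""
    for interest in user_interests:
        if interest in AD_INVENTORY:
            return AD_INVENTORY[interest]
    return "house ad"

# The offensive post on screen plays no role in the choice: the baking
# fan gets the muffin ad no matter what sits next to it.
print(pick_ad(["baking"], "a graphic, violent post"))
```

Note that `page_content` is accepted and then ignored, which is the whole problem: nothing in the selection path ever looks at what the ad will appear beside.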
Turow says mismatches between ads and content have “been going on for decades.”
Think about a sitcom episode that features an aspirin overdose — even a funny one. Bayer doesn’t want to sponsor that episode. In the world of TV, companies usually get warnings about such things. In the world of social media, bad pairings are harder to catch.
“There are millions of status updates and tweets every day,” points out Deborah Williamson, a senior analyst with market research firm eMarketer. “Just the magnitude of the amount of information that is generated is staggering.”
The challenge Facebook and other sites face, she says, is “representative of the challenges of working in an environment where users are generating the bulk of information and content on a site.”
Williamson says policing this much information is a huge challenge. The content will keep coming; the question is whether it will be taken down, and how fast.