To look at media coverage this summer is to witness a season of violence in the real world. A plane shot down in Ukraine, the suicide of Robin Williams, the Ebola outbreak that is now an International Public Health Emergency, everything that's happening in Ferguson, Missouri, and the rise of ISIS are just a few of the stories on the list. Even for those of us who see news events blend together in a seemingly endless daily grind, it's tough to watch. A friend and news junkie told me this week he had "no stomach" for the story of beheaded photojournalist James Foley. Yesterday I was reading statements from Foley's family about how proud they were of him and the work he did, and I started to cry. That is not something I do very often.
It is cliché at this point to say that "the 24-hour news cycle" and "the age of the Internet" shape our perception of the volume and intensity of this violence, but it is also true. Just this week I interviewed David Carr on how tweets about Ferguson influenced coverage of the events there in a big way. Twitter is an especially compelling tool for the news media, in part because of its chronological design, which Zach Seward at Quartz just compared to regular television. But culling social media for news is tricky, because news is filtered and social media is not. After family members had to beg users to stop sharing the video of James Foley's beheading, and after Zelda Williams quit Twitter because people were assaulting her with violent imagery in the wake of her father's death, the social network announced this week that it will honor takedown requests on a case-by-case basis.
Twitter and other companies might need to do more than that. For Thursday's show we spoke with USC Professor Karen North about an interesting problem: how algorithms can reward the content we don't want to like, upvote, or even share. U.S.-born tech companies like Twitter, YouTube, and Facebook face unique challenges when it comes to takedown requests, because they don't want to censor content that should be allowed to exist. But an algorithm used to measure and organize incoming content can award points for all kinds of engagement. If you give a YouTube video a thumbs down, you're still engaging with the content, which might improve that content's standing. Watching it all the way through might help, too.
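To make the mechanism concrete, here is a minimal, entirely hypothetical sketch of an engagement-based ranking score. Real platforms' ranking algorithms are proprietary and far more complex; the function name, weights, and signals below are invented for illustration only. The point is simply that when a score counts every interaction, a thumbs down or an outraged share still pushes a video up.

```python
# Hypothetical engagement score: every interaction, positive or
# negative, adds points. The weights here are made up for illustration.
def engagement_score(views, likes, dislikes, shares, watch_seconds):
    return (
        1.0 * views
        + 2.0 * likes
        + 2.0 * dislikes       # a thumbs-down is still a signal of interest
        + 3.0 * shares         # sharing "in protest" still counts as a share
        + 0.1 * watch_seconds  # watching all the way through helps, too
    )

# Two videos with the same number of views: the one people angrily
# dislike and share can outrank the one they quietly ignore.
quietly_ignored = engagement_score(
    views=1000, likes=50, dislikes=5, shares=10, watch_seconds=2000)
angrily_shared = engagement_score(
    views=1000, likes=5, dislikes=400, shares=300, watch_seconds=9000)

print(angrily_shared > quietly_ignored)  # prints: True
```

Under a score like this, there is no "negative" engagement at all; the only way to hurt a piece of content's standing is to not interact with it.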
It's important to note that this depends a lot on the company and the algorithm you're talking about. Algorithms, like Google's mysterious search algorithm, are also changed all the time. But as Jon Lee Anderson writes in The New Yorker, "there is no longer any doubt that the Internet, with its power of contagion and usefulness for recruiting, has become a preferred, particular tool of terrorists." Put that next to the possibility that our online behavior, even when we don't want it to, might be rewarding the content of terrorists, and you don't feel so good about the Internet's impact on world events. As users, we should be thinking about how our online behavior, however passive, can have an impact. Tech companies should keep thinking about the impact not just of their takedown policies, but of the tools they use to comb through and curate the content they're hosting.