Hardly anything happens these days that isn’t caught on video. Cell phones, security cameras, drones, even doorbells have cameras built in. All of that video would seem to be evidence galore for law enforcement, except for a few problems. First: there’s so much of it. Companies and law enforcement agencies are developing algorithms and machine learning to sift through all that video, looking for patterns, places or people. Second: that technology can have all the same biases and flaws as the people who designed it. Molly Wood talks about this with Kelly Gates, associate professor at the University of California San Diego, who has studied the rise of forensic video evidence. (1/7/19)
Tech can sift through video evidence…but can it avoid bias?

Police and private security personnel monitor security cameras at the Lower Manhattan Security Initiative on April 23, 2013 in New York City. John Moore/Getty Images