Police and private security personnel monitor security cameras at the Lower Manhattan Security Initiative on April 23, 2013 in New York City.

Hardly anything happens these days that isn't caught on video. Cell phones, security cameras, drones, even doorbells have cameras built in. All that video would seem to be a trove of evidence for law enforcement, except for a few problems. First: there's so much of it. Companies and law enforcement agencies are developing algorithms and machine learning systems to sift through video, looking for patterns, places or people. Second: that technology can have the same biases and flaws as the people who designed it. Molly Wood talks about this with Kelly Gates, associate professor at the University of California San Diego, who has studied the rise of forensic video evidence. The following is an edited transcript of their conversation.

Kelly Gates: Once you have these algorithmic systems, it will become more difficult to understand the forms of bias that are designed into them. It will require a lot of expertise to understand how the algorithms work and to identify the kinds of bias being built in. And I think that's going to require a lot of oversight — and technically informed oversight.

Molly Wood: How is this expectation of constant surveillance shaping the legal system?

Gates: Well, I think there are rarely these CSI moments, for example, where technologies are applied or some kind of enhancement technique is introduced and a smoking gun appears, the decisive piece of evidence that solves the case. More often there's a lot of work that goes into using video from surveillance systems or other sources to put together timelines and establish sequences of events. And in that process there's potential not just for outright or intentional falsification of evidence, although that is a real problem, but also for all kinds of implicit or even unconscious bias that comes from the legal system, whereby the forensic analysts doing this kind of work are working under, or in very close cooperation with, prosecutors. In other words, there's a real need to resist the temptation to simply find exactly what is needed to gain a conviction.

Wood: I imagine there's a private video economy developing here, right? What can you tell us about the companies working on this?

Gates: This is not the exclusive domain of law enforcement agencies. There are specialized companies for hire that do this kind of work. Companies like Axon, formerly Taser, offer a suite of video forensics tools to law enforcement customers. And I think, again, there's a real need for these companies to make the technologies they're developing transparent, because these are technologies being used in our legal system.


Follow Molly Wood at @mollywood