Marketplace Tech Blogs

Tech can sift through video evidence … but can it avoid bias?

Molly Wood Jan 7, 2019
Police and private security personnel monitor security cameras at the Lower Manhattan Security Initiative on April 23, 2013 in New York City. John Moore/Getty Images

Hardly anything happens these days that isn't caught on video. Cell phones, drones, even doorbells have cameras built in, and security cameras watch over countless public spaces. All that video would seem to be evidence galore for law enforcement, except for a few problems. First: there's so much of it. Companies and law enforcement agencies are developing algorithms and machine learning to sift through video, looking for patterns, places or people. Second: that technology can have the same biases and flaws as the people who designed it. Molly Wood talks about this with Kelly Gates, an associate professor at the University of California, San Diego, who has studied the rise of forensic video evidence. The following is an edited transcript of their conversation.

Kelly Gates: Once you have these algorithmic systems, it will become more difficult to understand the forms of bias that are designed into them. It will require a lot of expertise to understand how the algorithms work and to identify the kinds of bias being built in. And I think that's going to require a lot of oversight, and technically informed oversight at that.

Molly Wood: How is this kind of expectation of constant surveillance in some ways shaping the legal system? 

Gates: Well, I think there are rarely these "CSI" moments, where a technology is applied or some kind of enhancement technique is introduced and a smoking gun appears, the decisive piece of evidence that solves the case. More often, there's a lot of work that goes into using video from surveillance systems or other sources to put together timelines and establish sequences of events. And in that process there's potential not just for outright or intentional falsification of evidence, although that is a real problem, but also for all kinds of implicit or even unconscious bias that comes from the legal system, whereby the forensic analysts doing this kind of work are working under, or in very close cooperation with, prosecutors. So there's a real need, in other words, to resist the temptation to simply find exactly what is needed to gain a conviction.

Wood: I imagine there’s a private video economy developing here, right? What can you tell us about the companies that are working on this?

Gates: This is not the exclusive domain of law enforcement agencies. There are specialized companies for hire that do this kind of work. Companies like Axon, formerly Taser, offer suites of video forensics tools to law enforcement customers. And I think, again, there's a real need for these companies to make the technologies they're developing transparent, because these technologies are being used in our legal system.
