When Facial Recognition Tech Is Wrong

Like a lot of tech solutions to complex problems, facial recognition algorithms aren't perfect. But when the technology is used to identify suspects in criminal cases, those flaws can have catastrophic, life-changing consequences. People can be wrongly identified, arrested, and convicted, often without ever being told they were ID'd by a computer. It's especially troubling when you consider that false identifications disproportionately affect women, young people, and people with dark skin, which is to say basically everyone other than white men. This week on Gadget Lab, WIRED senior writer Khari Johnson joins us to talk about the limits of facial recognition tech, and what happens to the people who get misidentified.

Show Notes:

Read Khari's stories about how facial recognition tech has led to wrongful arrests that derailed people's lives. Here's Lauren's story about Garmin's Fenix smartwatch. (And here's WIRED's review of the latest model.) Arielle's story about the wave of shows about Silicon Valley tech founders is here.

Recommendations:

Khari recommends hoagies. Lauren recommends Garmin smartwatches. Mike recommends the show The Dropout on Hulu.

Khari Johnson can be found on Twitter @kharijohnson. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

Om Podcasten

Welcome to Uncanny Valley, an insider look at the people, power, and influence of Silicon Valley, where each week WIRED's writers and editors bring you original reporting and analysis about some of the biggest stories in tech. On Tuesdays, WIRED's Zoë Schiffer has an urgent conversation about this week in the news. And on Thursdays, WIRED's Global Editorial Director Katie Drummond is joined by Lauren Goode and Michael Calore to break down a recent story or phenomenon bubbling up in Silicon Valley and explain its influence on our daily lives.