The Computer Got It Wrong: Facial Recognition Technology and Establishing Probable Cause to Arrest


T.J. Benedict


August 8, 2022

Facial recognition technology (FRT) is a popular tool among police, who use it to identify suspects from photographs or still images taken from videos. The technology is far from perfect. Recent studies highlight that many FRT systems are less effective at identifying people of color, women, older people, and children. These race, gender, and age biases arise because FRT is often “trained” on non-diverse sets of faces. As a result, police have wrongfully arrested Black men based on mistaken FRT identifications. This Note explores the intersection of facial recognition technology and probable cause to arrest.

Courts rarely, if ever, examine FRT’s role in establishing probable cause. This Note suggests a framework for how courts can evaluate FRT and probable cause. Case law about drug-sniffing dogs provides a starting point for assessing what role an FRT identification should play in probable cause determinations. But drug dogs are not a perfect analogue for FRT. Two important differences between these two policing tools warrant treating FRT with greater scrutiny than drug dogs. First, FRT has baked-in racial, gender, and age biases that drug dogs lack. Second, FRT is a digital policing tool, which recent Supreme Court precedent suggests merits more judicial scrutiny than non-digital police tools like dogs.

Giving FRT a closer look leads to the conclusion that an FRT identification alone is insufficient to establish probable cause. FRT relies on flawed inputs (non-diverse data) that lead to flawed outputs (demographic discrepancies in misidentifications). These problematic inputs and outputs provide complementary reasons why an FRT identification alone cannot provide probable cause.


T.J. Benedict, The Computer Got It Wrong: Facial Recognition Technology and Establishing Probable Cause to Arrest, 79 Wash. & Lee L. Rev. 849 (2022).