
Facial recognition programs have a long, troubling history of producing false matches, particularly for nonwhite populations. One recent case involves a woman who was eight months pregnant at the time of her arrest. According to The New York Times, Detroit Police Department officers reportedly arrested Porcha Woodruff and detained her for over 11 hours over a robbery and carjacking she did not commit.

The incident occurred on February 16, and attorneys for Woodruff filed a lawsuit against the city of Detroit on August 3. Although Woodruff was visibly pregnant and argued she could not have physically committed the crimes, six police officers handcuffed her in front of neighbors and two of her children, then detained her and seized her iPhone as part of an evidence search. The woman in footage of the robbery, taken on January 29, was visibly not pregnant.


Woodruff was released on a $100,000 personal bond later that night, and her charges were dismissed by a judge less than a month later due to “insufficient evidence,” according to the lawsuit.

The impacts of the police’s reliance on much-maligned facial recognition software extended far beyond that evening. Woodruff reportedly suffered contractions and back spasms, and after finally leaving the precinct she needed intravenous fluids at a local hospital due to dehydration.

“It’s deeply concerning that the Detroit Police Department knows the devastating consequences of using flawed facial recognition technology as the basis for someone’s arrest and continues to rely on it anyway,” Phil Mayor, senior staff attorney at ACLU of Michigan, said in a statement.

According to the ACLU, Woodruff is the sixth known person to report being falsely accused of a crime by police due to facial recognition inaccuracies—in each instance, the wrongly accused person was Black. Woodruff is the first woman to step forward with such an experience. Mayor’s chapter of the ACLU is also representing a man suing Detroit’s police department for a similar incident from 2020 involving facial recognition biases. This is reportedly the third wrongful arrest allegation tied to the DPD in as many years.


“As Ms. Woodruff’s horrifying experience illustrates, the Department’s use of this technology must end,” Mayor continued. “Furthermore, the DPD continues to hide its abuses of this technology, forcing people whose rights have been violated to expose its wrongdoing case by case.” In a statement, DPD police chief James E. White wrote: “We are taking this matter very seriously, but we cannot comment further at this time due to the need for additional investigation.”

Similarly biased facial scan results aren’t limited to law enforcement. In 2021, employees at a Detroit-area roller skating rink used the technology to misidentify a Black teenager as someone previously banned from the establishment. Elsewhere, public housing officials are using facial ID technology to surveil and evict residents with little to no oversight.