Rite Aid can’t use facial recognition technology for the next five years

FTC called the use of the surveillance technology 'reckless.'
Rite Aid conducted a facial recognition tech pilot program across around 200 stores between 2012 and 2020. Deposit Photos


Rite Aid is banned from using facial recognition programs in any of its stores for the next five years. The pharmacy retail chain agreed to the ban as part of a Federal Trade Commission settlement regarding its “reckless use” of the surveillance technology, which “left its customers facing humiliation and other harms,” according to Samuel Levine, Director of the FTC’s Bureau of Consumer Protection.

“Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices,” Levine continued in the FTC’s December 19 announcement.

[Related: Startup claims biometric scanning can make a ‘secure’ gun.]

According to regulators, the pharmacy chain tested a pilot program of facial identification camera systems in an estimated 200 stores between 2012 and 2020. The FTC states that Rite Aid “falsely flagged the consumers as matching someone who had previously been identified as a shoplifter or other troublemaker.” While the program was meant to deter and help prosecute retail theft, the FTC documents numerous incidents in which the technology mistakenly identified customers as suspected shoplifters, resulting in unwarranted searches and even police dispatches.

In one instance, Rite Aid employees called the police on a Black customer after the system flagged their face—despite the image on file depicting a “white lady with blonde hair,” according to FTC Commissioner Alvaro Bedoya in an accompanying statement. Another account involved the unwarranted search of an 11-year-old girl, leaving her “distraught.”

“Rite Aid’s facial recognition technology was more likely to generate false positives in stores located in plurality-Black and Asian communities than in plurality-White communities,” the FTC added.

“We are pleased to reach an agreement with the FTC and put this matter behind us,” Rite Aid representatives wrote in an official statement on Tuesday. Although the company stated it respects the FTC’s inquiry and reiterated the chain’s support of protecting consumer privacy, they “fundamentally disagree with the facial recognition allegations in the agency’s complaint.”

Rite Aid also contends that “only a limited number of stores” deployed the technology, and says its support for the facial recognition program ended in 2020.

“It’s really good that the FTC is recognizing the dangers of facial recognition… [as well as] the problematic ways that these technologies are deployed,” says Hayley Tsukayama, Associate Director of Legislative Activism at the digital privacy advocacy group, Electronic Frontier Foundation.

Tsukayama also believes the FTC highlighting Rite Aid’s disproportionate facial scanning in nonwhite, historically over-surveilled communities underscores the need for more comprehensive data privacy regulations.

“Rite Aid was deploying this technology in… a lot of communities that are over-surveilled, historically. With all the false positives, that means that it has a really disturbing, different impact on people of color,” she says.

In addition to the five-year prohibition on using facial identification, Rite Aid must delete any collected images and photos of consumers, as well as direct any third parties to do the same. The company is also directed to investigate and respond to all consumer complaints stemming from previous false identifications, and to implement a data security program to safeguard any remaining collected consumer information it stores and potentially shares with third-party vendors.