Rite Aid "covert surveillance program" falsely ID'd customers as shoplifters, FTC says

Rite Aid is banned from using facial recognition surveillance technology for five years to settle Federal Trade Commission charges that it failed to protect consumers in hundreds of its stores, the agency said Tuesday.

Rite Aid used a "covert surveillance program" based on AI to ID potential shoplifters from 2012 to 2020, the FTC said in a complaint filed in the U.S. District Court for the Eastern District of Pennsylvania. Based on the faulty system, the pharmacy chain's workers erroneously accused customers of wrongdoing in front of friends and relatives, in some cases searching them, ordering them to leave the store or reporting them to the police, according to the complaint. 

According to the FTC, the retailer hired two companies to help create a database of tens of thousands of images of people whom Rite Aid believed had committed, or intended to commit, crimes at one of its locations. Collected from security cameras, employee phone cameras and even news stories, many of the images were of poor quality, and the system generated thousands of false positives, the FTC alleges.

Rite Aid failed to test the system for accuracy and deployed the technology even though the vendor expressly stated it couldn't vouch for its reliability, according to the agency.

Preventing the misuse of biometric information is a high priority for the FTC, the agency said in its statement. 

"Rite Aid's reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers' sensitive information at risk," said Samuel Levine, Director of the FTC's Bureau of Consumer Protection. "Today's groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices."

11-year-old girl searched by Rite Aid employee

During one five-day period, Rite Aid's system generated more than 900 separate alerts in more than 130 stores from New York to Seattle, all claiming to match a single person in its database. "Put another way, Rite Aid's facial recognition technology told employees that just one pictured person had entered more than 130 Rite Aid locations from coast to coast more than 900 times in less than a week," according to an FTC blog post.

In one incident, a Rite Aid worker stopped and searched an 11-year-old girl based on a false match, with the child's mother reporting having to miss work because her daughter was so distraught, the complaint stated.

Black, Asian, Latino and women consumers were at increased risk of being incorrectly matched, the FTC stated. 

Further, Rite Aid didn't tell consumers it used the technology and specifically instructed workers not to tell patrons or the media, the agency said.

Rite Aid said it was pleased to put the matter behind it, but disputed the allegations in the agency's complaint. 

"The allegations relate to a facial recognition technology pilot program the company deployed in a limited number of stores. Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC's investigation regarding the Company's use of the technology began," stated the retailer, which is in bankruptcy court and currently restructuring. 

