Thirty-six civil rights organizations have teamed up to stand against the use of facial recognition technology in stores.

An alliance led by civil rights group Fight for the Future wants stores to stop using facial recognition technology, and it’s calling out offenders by name.

More than 35 groups — including Access Now, Data for Black Lives, Surveillance Technology Oversight Project and others — joined forces last month for “Ban Facial Recognition in Stores,” a campaign that demands that retailers end the practice. Its latest scorecard aims to pressure Macy’s, Apple, Lowe’s and grocery chains H.E.B. Grocery and Albertsons to stop scanning shoppers and workers.

“Companies say they offer facial recognition in the name of ‘convenience’ and ‘personalization,’ but their real priorities are protecting and predicting their profits, ignoring how they abuse people’s rights and put them in danger,” said Caitlin Seeley George, campaign director at Fight for the Future, in prepared remarks.

“The stores that are using or are considering using facial recognition should pay attention to this call from dozens of leading civil rights and racial justice organizations who represent millions of people,” she continued. “Retailers should commit to not using facial recognition in their stores so we can champion their decision, or be prepared for an onslaught of opposition.”

According to the coalition, tech makers accelerated the use of facial recognition during the pandemic, framing it as helpful for things like contactless payments or temperature checks. The coalition also pointed out that a recent offering from payments company Stripe lets websites add the technology with just three lines of code.

But for now, the pressure campaign is squarely aimed at the retail sector.

The groups researched or contacted major U.S. retailers to ask whether they use facial recognition tech, then created a scorecard based on the results, covering 29 companies and counting.

“We do plan to expand this list of retailers to include more of the top retailers in the country, so that people can contact these additional companies and pressure them to put their customers’ and workers’ rights, privacy and safety first,” Seeley George said in a statement to WWD.

Notably, Walmart and Target are listed as “Won’t Use,” while Macy’s, Apple and the others named above are logged as “Are Using.”

Some of the stores were tagged because of lawsuits. Macy’s was flagged due to a class-action privacy lawsuit filed in Chicago last year, which accused the company of violating Illinois’ biometric privacy law by using facial recognition software to scan shoppers. Likewise, the coalition noted that “Lowe’s is being sued for use of facial recognition in Illinois” and cited the retailer’s privacy policy, which states that it “may also use video monitoring and other tracking technologies” for security, theft protection, other crimes and “to monitor in-store traffic patterns, customer counts and interests, and perform similar analytics. However, we may or may not be able to associate such information with you.”

But Maureen Wallace, manager of corporate communications and PR for Lowe’s, denies that the retailer uses the tech. “In fact, Lowe’s does not collect biometric or facial recognition data in our stores,” she told WWD.

Apple’s status stems from yet another lawsuit, in which a plaintiff claimed that faulty facial recognition led to his being misidentified as a shoplifter — not once, but multiple times.

That scenario illustrates one of the reasons critics blast the tech, which can have an outsized impact on people of color and other marginalized communities.

“Detroiters know what it feels like to be watched, to be followed around by surveillance cameras using facial recognition,” said Tawana Petty, national organizing director at Data for Black Lives, in the announcement.

Petty explained that Project Green Light, a mass surveillance program in Detroit, operates more than 2,000 flashing green surveillance cameras at more than 700 businesses, including hospitals, public housing and restaurants. The cameras use facial recognition and feed footage to crime centers, police precincts and officers’ mobile devices 24/7.

“It’s difficult to explain the psychological toll it takes on a community, knowing that your every move is being monitored by a racially biased algorithm with the power to yank your freedom away from you,” she added.

Bias is a deeply thorny issue in the tech sector. Smartwatches, for instance, have a troubled history of accurately reading darker skin tones for sensor-driven health and fitness features, while bias in artificial intelligence has throngs of corporate committees, action groups and legislators examining the issue.

As for the scorecard, one retail giant noticeably absent is Amazon, whose Amazon Go stores are studded with cameras and sensors.

“Amazon is largely not a brick-and-mortar retailer, [so] we didn’t include them on the list, despite the fact that the company is one of the largest retailers in the U.S.,” Seeley George added. “That said, we have and will continue to fight against Amazon’s use of surveillance technology, including facial recognition, in their stores, warehouses, on workers, and that they sell.”

The e-tailer has denied using facial recognition technology at Amazon Go locations in numerous comments to the media over the years. But its use of palm scanning technology kicks up similar privacy concerns, as do other initiatives.

In particular, Fight for the Future protests Amazon’s partnership with law enforcement. “In over 1,400 cities across the country, Amazon provides police departments with warrantless access to request and store footage from thousands of Ring cameras,” its website states. “This surveillance en masse gives cops unprecedented power.” The group aims to create awareness and put public pressure on local elected officials to pull out of these partnerships.

It’s a different approach from that of the retail alliance, which isn’t putting its faith in the actions of politicians alone.

Seeley George explained that tech companies themselves have been calling for regulation, though the versions they favor are industry-friendly. “[They] don’t do enough to keep people safe,” she said. For instance, requiring consent to collect biometric data assumes that people understand “the full harms” involved, and most don’t. Some consumers also may not have the option of shopping elsewhere, depending on their location or income.

“There isn’t a way to put regulations in place that put privacy and rights first when the technology is inherently invasive,” she said. “This is why we’re calling on retailers to commit to not using facial recognition technology, and are also calling on lawmakers to pass legislation to protect people from companies that are using facial recognition.”