Simon Mackenzie, a security officer at discount retailer QD Stores on the outskirts of London, was out of breath. He had just chased down three shoplifters who had made off with several packets of laundry soap. Before the police arrived, he sat down at a desk in the back room to do something important: capture the faces of the culprits.
On an old desktop computer, he pulled up footage from security cameras, pausing to zoom in and save a photo of each shoplifter. He then logged into a facial recognition program, Facewatch, which his store uses to identify shoplifters. The next time those people walk into any store within a few miles that uses Facewatch, store staff will receive an alert.
“It’s like having someone say to you, ‘That person you caught last week just came back,’” Mackenzie said.
The use of facial recognition technology by police has received much scrutiny in recent years, but its application by private companies has received less attention. Now, as technology improves and its cost decreases, systems are becoming more and more involved in people’s lives. No longer just the province of government agencies, facial recognition is increasingly deployed to identify thieves, problematic customers, and legal adversaries.
Facewatch, a British company, is used by retailers across the country frustrated with petty crime. For as little as £250 a month, or about $320, Facewatch offers access to a customized watch list that stores near one another share. When Facewatch detects a flagged face, an alert is sent to a smartphone in the store, where employees decide whether to keep a close eye on the person or ask them to leave.
Mackenzie adds a new face or two every week, he said, mostly people who steal diapers, groceries, pet supplies and other low-cost goods. He said their financial hardship made him sympathetic, but the number of thefts had gotten so out of control that facial recognition was needed. Usually, at least once a day, Facewatch alerts him that someone on the watch list has entered the store.
Facial recognition technology is proliferating as Western countries grapple with advances brought about by artificial intelligence. The European Union is drafting rules that would ban many uses of facial recognition, while New York City Mayor Eric Adams has encouraged retailers to test the crime-fighting technology. MSG Entertainment, the owner of Madison Square Garden and Radio City Music Hall, has used automated facial recognition to deny entry to lawyers whose firms have sued the company.
Among democratic nations, Britain leads the way in the use of live facial recognition, with courts and regulators approving its use. Police in London and Cardiff are experimenting with the technology to identify wanted criminals as they walk down the street. In May, it was used to scan the crowds at the coronation of King Charles III.
But its use by retailers has drawn criticism as a disproportionate remedy for petty crime. People have little way of knowing that they are on the watch list or how to appeal. In a lawsuit last year, Big Brother Watch, a civil society group, called it “extremely Orwellian.”
Fraser Sampson, Britain’s biometrics and surveillance camera commissioner, who advises the government on policy, said there was “nervousness and hesitancy” around facial recognition technology due to privacy concerns and underperforming algorithms in the past.
“But I think in terms of speed, scale, accuracy and cost, facial recognition technology can, in some areas, literally be a game changer,” he said. “That means its arrival and deployment is probably inevitable. It’s just a case of when.”
‘You can’t wait for the police to come’
Facewatch was founded in 2010 by Simon Gordon, the owner of a popular 19th-century wine bar in central London known for its cellar-like interior and popularity with pickpockets.
At the time, Mr. Gordon hired software developers to create an online tool to share security camera footage with authorities, hoping it would save police time filing incident reports and result in more arrests.
There was limited interest, but it sparked Mr. Gordon’s fascination with security technology. He followed developments in facial recognition and came up with the idea for a watch list that retailers could share and contribute to. It was like the photos of shoplifters kept by the cash register, but supercharged into a collective database to identify the bad guys in real time.
By 2018, Gordon felt the technology was ready for commercial use.
“You have to help yourself,” he said in an interview. “You can’t wait for the police to come.”
Facewatch, which licenses facial recognition software made by RealNetworks and Amazon, is now inside almost 400 stores in Britain. Trained on millions of photos and videos, the systems read the biometric information of a face when the person enters a store and compare it with a database of flagged people.
Facewatch’s watch list is constantly growing as stores upload photos of thieves and problematic customers. Once added, a person stays there for one year before being removed.
‘Mistakes are rare but they do happen’
Every time Facewatch’s system identifies a thief, a notification is sent to a person who has passed a test to be a “super recognizer,” someone with a special talent for remembering faces. Within seconds, the super recognizer must confirm the match against the Facewatch database before an alert is sent.
But while the company has created policies to prevent misidentifications and other mistakes, mistakes do happen.
In October, a woman buying milk at a supermarket in Bristol, England, was confronted by a clerk and ordered to leave. The clerk told her that Facewatch had flagged her as a banned thief.
The woman, who asked not to be named for privacy reasons and whose story was corroborated by materials provided by her lawyer and Facewatch, said there must have been a mistake. When she contacted Facewatch a few days later, the company apologized, saying it was a case of mistaken identity.
After the woman threatened legal action, Facewatch investigated its logs. It found that the woman had been added to the watch list because of an incident 10 months earlier involving £20 worth of merchandise, about $25. The system “worked perfectly,” Facewatch said.
But while the technology had correctly identified the woman, it didn’t leave much room for human discretion. Neither Facewatch nor the store where the incident occurred contacted her to let her know that she was on the watch list and to ask what had happened.
The woman said that she did not remember the incident and that she had never stolen. She said she may have left after not realizing her debit card payment didn’t go through at a self-checkout kiosk.
Madeleine Stone, legal and policy officer for Big Brother Watch, said Facewatch was “normalizing airport-style security checks for everyday activities like buying a pint of milk.”
Mr. Gordon declined to comment on the incident in Bristol.
In general, he said, “mistakes are rare but they do happen.” He added: “If this occurs, we acknowledge our mistake, apologize, remove any relevant data to prevent it from happening again, and offer commensurate compensation.”
Privacy Office Approved
Civil liberties groups have raised concerns about Facewatch, suggesting that its implementation to prevent petty crime might be illegal under Britain’s privacy law, which requires biometric technologies to be in the “substantial public interest.”
Britain’s Information Commissioner’s Office, the country’s privacy regulator, carried out a yearlong investigation into Facewatch. The office concluded in March that the Facewatch system was permissible under the law, but only after the company made changes to the way it operated.
Stephen Bonner, the office’s deputy commissioner for regulatory oversight, said in an interview that the investigation had led Facewatch to change its policies: It would put up more signs in stores, share information between stores only about serious and violent offenders, and send alerts only about repeat offenders. That means people won’t be put on the watch list after just one minor offense, as happened to the woman in Bristol.
“That reduces the amount of personal data that is kept, reduces the chances of people being unfairly added to this type of list, and increases the likelihood that it will be accurate,” Bonner said. The technology, he said, “is not unlike having really good security guards.”
Liam Ardern, operations manager at Lawrence Hunt, who owns 23 Spar convenience stores that use Facewatch, estimates that the technology has saved the company more than £50,000 since 2020.
He called the privacy risks of facial recognition overblown. The only example of misidentification that he recalled was when a man was mistaken for his identical twin, who had shoplifted. Critics overlook that stores like his operate on thin profit margins, he said.
“It’s easy for them to say: ‘No, it’s against human rights,’” Mr. Ardern said. If theft isn’t reduced, he said, his stores will have to raise prices or cut staff.