Calls to ban use of facial recognition
Police are deploying an ‘Orwellian mass surveillance tool’ in the capital
Friday, 27th October 2023 — By Frankie Lister-Fell

Surveillance cameras used in the capital [Big Brother Watch]
THE Metropolitan Police Service is deploying an “Orwellian mass surveillance tool” in Westminster, campaigners warn.
Big Brother Watch has criticised the police for using live facial recognition (LFR) at least eight times since a pilot of the technology was launched in 2022, including last month in Soho.
The cameras scan everyone who walks past. Each face is mapped and converted into a "biometric face print", similar to a fingerprint, which is then checked for matches against a police watch list, all without the consent of those walking by.
The group’s Madeleine Stone told Extra: “It’s the same thing as having your fingerprint taken by the police in order to walk down the street.”
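In general terms, LFR systems of the kind described reduce each passing face to a numeric template and compare it against watch-list templates, raising an alert when the similarity crosses a threshold. A minimal illustrative sketch follows; this is not the Met's actual system, and all names, vectors, and the threshold are hypothetical:

```python
import math

def cosine_similarity(a, b):
    # Compare two face templates ("biometric face prints") as vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def check_against_watchlist(face_print, watchlist, threshold=0.9):
    # Flag a passer-by only if their template is close enough to an entry;
    # in deployed systems any alert is then verified by officers.
    for name, entry in watchlist.items():
        if cosine_similarity(face_print, entry) >= threshold:
            return name
    return None  # no alert raised for this face

# Hypothetical templates, standing in for real face embeddings.
watchlist = {"wanted-person-1": [0.9, 0.1, 0.4]}
print(check_against_watchlist([0.88, 0.12, 0.41], watchlist))
```

The threshold is the key operational choice: set it lower and the system raises more alerts, including more false ones; set it higher and it misses genuine matches.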
On September 9 a man was stopped after walking through a police “deployment area” in Wardour Street.
His face matched that of a man on the system who had failed to appear at a sentencing hearing. Police said the man assaulted an officer when he was stopped.
The MPS says the technology is used only to "locate dangerous individuals".
There are two types of facial recognition used by the police: retrospective recognition, where images of someone suspected of a crime are compared against custody image databases, and LFR.
LFR operates from police vans that are deployed in a specific area. Signs are displayed at the location to say that facial recognition is taking place.
Last week MPS commissioner Sir Mark Rowley said the force's new plan to use retrospective facial recognition to identify shoplifters "was pushing boundaries".
But human rights organisations have said the technology is discriminatory, inaccurate, and a "big privacy concern".
Ms Stone said: “Police targeting Westminster’s residents with this dangerous mass surveillance tool treats them like suspects, erodes public freedoms, and wastes public money.
“Live facial recognition is not an efficient crime-fighting tool, with the police’s own statistics revealing that more than eight out of 10 facial recognition matches have been inaccurate since its introduction.
“This is an Orwellian mass surveillance tool rarely seen outside of Russia and China and has absolutely no place in London.”
Ms Stone said the group has seen people, including children, pulled aside by police after an incorrect LFR match and forced to give their fingerprints or show ID to prove they are not a criminal.
She said there was an algorithmic bias, with studies showing live facial recognition is "less accurate for women and people of colour". And certain communities are over-policed and more likely to end up on watch lists, which means they are more likely to be flagged by the technology, rightly or wrongly.
Caroline Russell, chair of the London Assembly's police committee, said: "The lack of transparency about the make-up of watch lists and the purpose of deployments is unhelpful in the context of the Met trying to improve Londoners' trust and confidence in policing. LFR is a really dangerous technology and it's very difficult to see how it's actually making a difference in terms of policing."
The London Policing Ethics Panel has reported that younger people and Asian, black, and mixed ethnic groups are more likely not to attend events if they know they will be monitored with LFR.
A MPS spokesperson said: “The Met has primarily focused the use of LFR on the most serious crimes; locating people wanted for violent offences, including knife and gun crime, or those with outstanding warrants who are proving hard to find. Operational deployments of LFR technology have been in support of longer-term violence reduction initiatives and have resulted in a number of arrests for serious offences including conspiracy to supply Class A drugs, assault on emergency service workers, possession with intent to supply Class A & B drugs, grievous bodily harm and being unlawfully at large having escaped from prison.”
It added that alert rates across its deployments are between 0 per cent and 0.08 per cent, although Big Brother Watch says this figure rests on a different formula from its own, measuring the number of false matches against the total number of faces seen rather than against the number of matches made.
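The dispute over accuracy largely comes down to the choice of denominator. A short sketch with hypothetical numbers (none taken from the article) shows how a tiny alert rate and an eight-in-ten error rate can describe the same deployment:

```python
# Illustrative only: all figures below are hypothetical, chosen to show
# how the two competing accuracy measures can both be arithmetically true.
faces_scanned = 50_000  # everyone who walked past the cameras
alerts = 10             # watch-list matches the system raised
false_alerts = 8        # alerts later found to be wrong

# Measure resembling the force's "alert rate":
# false alerts as a share of all faces scanned.
alert_rate = false_alerts / faces_scanned * 100

# Measure resembling the campaigners' figure:
# false alerts as a share of the alerts actually raised.
inaccuracy_rate = false_alerts / alerts * 100

print(f"Alert rate: {alert_rate:.3f}%")                 # a fraction of a per cent
print(f"Share of matches that were wrong: {inaccuracy_rate:.0f}%")  # eight in ten
```

Because almost every face scanned is not on any watch list, dividing errors by all faces seen will always produce a figure close to zero, whatever the error rate among the matches themselves.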
Human rights organisation Liberty has launched a petition calling on the Home Office to ban facial recognition technology.
Emmanuelle Andrews, Liberty policy and campaigns manager, said: “We cannot police and surveil our way out of social problems and facial recognition is the wrong response, especially to people in need of support who are struggling to survive amid the cost-of-living crisis.
“The safest thing to do for everyone is ban facial recognition technology.”