On 21 April, the UK High Court handed the Metropolitan Police a significant victory in the battle over live facial recognition, dismissing a legal challenge to the force’s policy and clearing the way for its continued use across London. The case was brought by youth worker Shaun Thompson and Big Brother Watch director Silkie Carlo, who argued that the Met’s rules for deploying the technology were too broad, too unclear and too intrusive to satisfy human-rights law. The court rejected that claim, holding that the policy had enough clarity, foreseeability and safeguards to be lawful.

Facial Recognition Is Already in Place in the UK & Will Only Expand From Here
Live facial recognition is already being used in London streets, transport hubs and shopping areas, with police cameras scanning passers-by in real time and comparing their faces against watchlists of wanted people. The Met says the system is used to prevent and detect crime and to locate suspects. According to figures cited in the challenge, the force carried out 231 deployments in the previous year and scanned around 4 million faces. The Met’s own annual report recorded 203 deployments between September 2024 and September 2025.
The Human Rights Case Against the UK’s Facial Recognition System
Thompson and Carlo were challenging the Met’s revised 2024 policy governing the overt use of live facial recognition. They argued that the policy unlawfully interfered with rights to privacy, freedom of expression and freedom of assembly under Articles 8, 10 and 11 of the European Convention on Human Rights. They told the court that the rules gave officers too much discretion over when the system could be deployed, where it could be deployed and who could be added to watchlists, making it difficult for the public to predict when they might be subject to biometric scanning. Thompson’s own experience formed part of that case after he was wrongly flagged by the system and detained outside London Bridge station.
The court determined that the Met’s present policy meets the legal threshold. In the judgment summary, the court found that the policy “does not authorise arbitrary decision-making,” has “sufficient clarity and foreseeability,” and provides “adequate safeguards against abuse.” It also held that the limits on who can be placed on watchlists and where the technology may be used were enough to meet legal standards.
Responding to the Judgment: “Fight Against Facial Recognition is Far From Over”
Silkie Carlo, co-claimant in the case and director of Big Brother Watch, issued a letter responding to the court’s judgment. “This is a disappointing judgment but the fight against live facial recognition mass surveillance is far from over. There has never been a more important time to stand up for the public’s rights against dystopian surveillance tech that turns us into walking ID cards and treats us like a nation of suspects. Innocent people deserve clear and strict protections from live facial recognition cameras, which should be reserved for the most serious cases rather than used to scan millions of people, and that is what the appeal will seek to achieve.
“This legal challenge, which was made possible by concerned members of the public, has already led to a change in the Met’s facial recognition policy and to a payment awarded to Mr Thompson who was misidentified by the tech and threatened with arrest. He has been courageous in challenging the police, defending his rights and now standing up for the rights of millions of others in the UK.”
Co-claimant Shaun Thompson said, “I’ve considered the court’s judgment today and decided to appeal it to protect Londoners from facial recognition being used for mass surveillance and leading to situations like mine, where I was misidentified, detained and threatened with arrest. No one should be treated like a criminal due to a computer error. I was compliant with the police, but my bank cards and passport weren’t enough to convince the police the facial recognition tech was wrong. It’s like stop and search on steroids. It’s clear the more widely this is used, the more innocent people like me risk being criminalised.
“My daily work getting knives off the streets with the Street Fathers proves we can keep London safe through community action, not by spying on the public with cameras that real criminals already know how to dodge.”
What Happens Now?
The ruling leaves two immediate consequences. First, the Metropolitan Police can continue using live facial recognition under its existing policy. Second, the judgment makes wider expansion easier. In January, the Home Office announced that the number of live facial-recognition vans would increase five-fold, from 10 to 50, with availability extended across police forces in England and Wales. Ministers presented the move as part of a broader investment in AI-assisted policing and crime prevention. The direction of travel is clear enough: facial recognition is being embedded not as an exceptional tactic but as a more routine feature of public-space policing.
Live facial recognition changes the terms on which people move through public space. A street, a station entrance or a shopping district is no longer simply a place through which citizens pass anonymously. It becomes a checkpoint, where the default assumption is that everyone can be scanned first and only a few will then be stopped. The claim that most people are deleted from the system after no match does not remove the underlying shift. Their faces are still captured, converted into biometric data and checked against police databases without any individualised suspicion. The claimants described this as mass surveillance, since millions of faces are being scanned in the course of ordinary urban life.
Final Thought
Police and ministers argue that law-abiding citizens have nothing to fear and point to arrests secured through facial recognition, including of serious offenders. That angle is politically effective, and it suggests the technology may have some operational value, but it does not answer the constitutional question. The issue here is not whether the state can identify wanted criminals; it is whether a free society should accept constant biometric monitoring as a new “price” of safety. Once such infrastructure is in place, it is nearly impossible to confine. Deployment zones will inevitably widen, watchlists will expand, and technical systems built for violent offenders will be directed toward lower-level offences, protests, or broader intelligence gathering. The history of surveillance powers offers little reason to assume systematic restraint.