When Shaun Thompson walked out of London Bridge station after a shift with the community outreach group Street Fathers earlier this year, he expected to go home. Instead he says he was grabbed by a group of Metropolitan Police officers who told him, “Sorry, sir – we believe you’re a wanted man.” The stop, Thompson recounts in The Independent, followed an alert from a live facial recognition (LFR) system; officers compared his features to a watchlist photo, suggested taking his fingerprints and only backed off after he refused. He says the encounter left him shaken and convinced the technology is both fallible and liable to be used in racially biased ways.

Thompson’s account has been taken up by civil liberties groups as emblematic of a wider problem with police use of LFR. Campaigners say he was treated as a suspect rather than as someone merely “recognised”, and that officers’ handling of the stop — including the attempt to take his fingerprints — raised questions about procedure and accountability. Big Brother Watch has since supported Thompson’s bid to bring legal proceedings, arguing that routine LFR use leads to intrusive and discriminatory policing.

Thompson took legal action against the Met; the case was settled, but campaigners say the settlement does not resolve the broader legal and ethical issues. Big Brother Watch says it has been granted permission to pursue a judicial review alongside other campaigners, aiming to force clearer limits and greater transparency on how and where LFR is deployed.

Those tensions have come into sharper relief because of a recent government decision to expand live facial recognition across England. In a Home Office announcement on 13 August 2025, ministers said ten vans equipped with LFR cameras would be rolled out to seven police forces — Greater Manchester, West Yorkshire, Bedfordshire, Surrey and Sussex (joint), and Thames Valley and Hampshire (joint) — as part of a neighbourhood‑policing initiative. The department said the capability would be used only against bespoke police watchlists to target “high‑harm” offenders, and that independent testing by the National Physical Laboratory had informed the proposal; ministers also promised a public consultation and a new legal framework to enshrine safeguards and oversight.

The Metropolitan Police has already defended its use of LFR in specific contexts. For this year’s Notting Hill Carnival, the force confirmed it would operate LFR cameras on approaches to and from the event — outside the formal event boundary — to help identify wanted individuals, missing people and those subject to sexual‑harm prevention orders. The Met says alerts generated by the system are reviewed by officers, non‑matches are deleted, and additional checks are carried out before any enforcement action is taken.

But technical and academic studies underline why campaigners remain wary. The US National Institute of Standards and Technology (NIST) found, in its 2019 large‑scale evaluation of face recognition algorithms, systematic demographic differentials: many algorithms produced higher rates of false positives and false negatives for some groups, with substantially more errors for Asian and African‑American faces than for Caucasian faces. NIST cautioned that performance depends heavily on the specific algorithm and the operational setting, and that one‑to‑many searches, the mode typically used against police watchlists, carry particular risk: every passer‑by is compared against every entry on the list, so even small per‑comparison error rates compound, as the sketch below illustrates.
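To make that compounding concrete, here is a minimal back‑of‑envelope sketch in Python. It assumes independent comparisons at a fixed per‑comparison false match rate (FMR); the FMR values, the tenfold differential between groups, and the watchlist sizes are all hypothetical figures chosen for illustration, not NIST’s measured results.

```python
# Illustrative sketch: how per-comparison false match rates (FMR) compound
# in one-to-many watchlist search. All numbers below are hypothetical,
# chosen only to show the shape of the problem -- not NIST's figures.

def false_alert_probability(fmr: float, watchlist_size: int) -> float:
    """Probability that at least one watchlist entry falsely matches a
    passer-by, assuming independent comparisons at a fixed FMR."""
    return 1.0 - (1.0 - fmr) ** watchlist_size

# Hypothetical FMRs for two demographic groups, reflecting the kind of
# differential NIST reported (higher false-positive rates for some groups).
fmr_by_group = {"group A": 1e-5, "group B": 1e-4}  # 10x gap (hypothetical)

for size in (1_000, 10_000):
    for group, fmr in fmr_by_group.items():
        p = false_alert_probability(fmr, size)
        print(f"watchlist={size:>6}, {group}: P(false alert) ~ {p:.1%}")
```

The structural point is that a tenfold gap in per‑comparison error rates becomes a large gap in who triggers alerts once thousands of comparisons are run per face: under these hypothetical numbers, a 10,000‑entry watchlist falsely flags roughly 10% of group A passers‑by but over 60% of group B.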

The expansion of LFR into routine policing has been described in media investigations as a quietly growing part of the police arsenal, provoking a fraught debate between those who point to arrests and alleged public‑safety benefits and those who see a technology being embedded without adequate legal guardrails. Reporting has highlighted internal documents, the uneven pace of deployment between forces, and gaps in independent oversight that have left critics calling for tighter statutory controls.

For many campaigners and for Thompson, the issue is not merely technical accuracy but who is most likely to be stopped and where. Thompson framed his experience in historical context, recalling how discretionary powers such as the old “sus” laws disproportionately targeted young Black people; he argues that new surveillance tools risk repeating that pattern if they are concentrated in Black and brown neighbourhoods. His account also draws a stark contrast with another man stopped by officers that day: a white Eastern European man who, Thompson says, was greeted as “recognised” rather than branded “wanted”.

The government and some police forces maintain that rigorous testing, operational safeguards and forthcoming legislation will manage the risks. Campaigners, lawyers and privacy groups say those assurances fall short without binding legal limits, transparent audit trails and genuinely independent oversight. As ministers prepare a public consultation and draft a statutory framework, the debate turns on whether technical validation and policy promises will be enough to prevent fresh injustices — or whether the routine use of LFR will harden a new, technology‑driven layer of discriminatory policing.

Until those legal and oversight questions are settled, Thompson’s experience will continue to be cited by those urging caution: not as an argument against law enforcement’s desire to tackle serious crime, but as a warning that imperfect tools, deployed without sufficient checks, risk repeating long‑standing inequalities in who is treated as suspect and who is treated as a citizen.

Source: Noah Wire Services