The Metropolitan Police plans to deploy live facial recognition at this year’s Notting Hill Carnival, prompting a judicial review and renewed debate over whether biometric surveillance can be used safely and lawfully without normalising state intrusion or worsening racial bias.
The Metropolitan Police’s plan to use live facial recognition (LFR) at Notting Hill Carnival has reignited fierce debate, with critics arguing that the weaponisation of biometric technology at a celebration of culture risks normalising state intrusion, while others insist the tool is essential to keeping crowds safe. The Met, for its part, says the technology will be used “in a non-discriminatory way” and that the current algorithm “does not perform in a way which exhibits bias.” It insists LFR will be deployed as one element of a broader policing strategy, not as a blanket surveillance sweep.
The force says cameras will be positioned on the approaches to and from the carnival, outside its boundary, and integrated into intelligence-led, preventative activity rather than forming a blanket surveillance net. In an operational briefing, the Met framed LFR as one tool alongside stop-and-search, screening arches and targeted interventions designed to prevent weapons and known offenders from reaching the site.
Proponents of tougher policing argue that the scale of the challenge at Notting Hill justifies stronger measures. The Met’s figures for the 2024 carnival show 349 arrests over the weekend, including for homicide, rape, possession of offensive weapons and other violent and sexual offences. Officials say these offences threaten public safety and underpin new tactics, including preparatory arrests and searches intended to stop violence before it occurs.
Campaigners and civil-liberties groups contest that assessment. A coalition including anti-racist and rights groups argues that LFR will “exacerbate concerns about abuses of state power and racial discrimination.” They maintain the technology is prone to bias against ethnic minorities and women and contend that introducing it at a cultural festival celebrating the African-Caribbean community is especially fraught.
A legal challenge has already begun. Shaun Thompson, an anti-knife campaigner who says he was wrongly identified by LFR, has filed for judicial review; his account of being stopped, held and asked for fingerprints after a misidentification has become a central example of the technology’s harms cited by campaigners. Rebecca Vincent of Big Brother Watch warned that there is “no legislation governing live facial recognition” and asked why there is a rush to expand use of what she calls an Orwellian tool. Policing, she argued, should be rooted in policing by consent, not in a system attendees never consented to.
The Met acknowledges the controversy and points to earlier trials. The force concedes that trials at the carnival in 2016 and 2017 did not inspire public confidence, with one system erroneously flagging more than a hundred people as potential suspects, but says the algorithm has since been improved and independently tested. It presents those improvements as evidence that the balance between accuracy and operational need has shifted since the earlier deployments.
Yet questions remain about the legal and privacy safeguards surrounding biometric systems. The Met cites its duties under the Equality Act 2010, the European Convention on Human Rights and data-protection law. Guidance from the Information Commissioner’s Office warns that biometric data used for unique identification is special category personal data, requiring careful consideration of necessity, proportionality and an appropriate legal basis. The ICO makes clear that these are not mere technicalities but legal thresholds for any public authority deploying such systems.
Rights advocates warn that independent testing of algorithms does not erase broader risks. Analyses over time have consistently found higher error rates for women and people of colour, and misidentification can lead to wrongful stops and a chilling effect on freedom of expression and assembly. They point to historical misuses of data in policing as a reminder that technology alone cannot fix structural bias.
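Part of the concern is simple arithmetic: at an event attracting hundreds of thousands of people, even a very low per-scan error rate produces a meaningful number of wrongful alerts. The sketch below is purely illustrative; the scan volume, false-positive rate and number of genuine watchlist matches are hypothetical assumptions, not the Met’s published test figures, but the calculation shows why campaigners focus on false positives at scale.

```python
# Illustrative back-of-the-envelope arithmetic only: the scan volume and error
# rates below are hypothetical assumptions, not the Met's published figures.

def expected_false_alerts(faces_scanned: int, false_positive_rate: float) -> float:
    """Expected number of people wrongly flagged by an LFR system."""
    return faces_scanned * false_positive_rate


def alert_precision(true_matches: int, false_alerts: float) -> float:
    """Share of alerts that actually correspond to a watchlisted person."""
    total_alerts = true_matches + false_alerts
    return true_matches / total_alerts if total_alerts else 0.0


if __name__ == "__main__":
    faces_scanned = 500_000            # hypothetical weekend scan volume
    false_positive_rate = 1 / 10_000   # hypothetical per-scan false-positive rate
    true_matches = 20                  # hypothetical genuine watchlist matches

    false_alerts = expected_false_alerts(faces_scanned, false_positive_rate)
    precision = alert_precision(true_matches, false_alerts)

    print(f"Expected wrongful flags: {false_alerts:.0f}")        # 50 under these assumptions
    print(f"Share of alerts that are correct: {precision:.0%}")  # roughly 29%
```

Under these assumed numbers, a system that is wrong only once in every ten thousand scans would still generate around fifty wrongful flags over the weekend, and fewer than a third of its alerts would point to someone actually on a watchlist. The real-world figures depend entirely on the deployed algorithm and operating thresholds, which is why campaigners press for independently verified error rates rather than headline accuracy claims.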
This clash at Notting Hill sits within a wider debate about balancing urgent public-safety concerns with enduring civil-liberties safeguards. With a judicial review underway, campaign groups are pressing for clearer parliamentary rules, statutory oversight, and transparent criteria for any future public-space biometric deployments. For attendees and observers, the pressing question is not only whether the cameras will function as the Met claims, but whether there exists a robust legal and ethical framework to hold the force to account if they do not.
From a Reform-style, tough-on-crime perspective, the argument is that public safety must be backed by modern tools and proactive policing. On this view, communities deserve effective protection against violent crime, and police should be equipped with capable technologies to deter and disrupt threats before they escalate. At the same time, any expansion of biometric or surveillance capabilities must stand up to stringent oversight, clear legal authority and measurable safeguards to prevent unnecessary encroachment on civil liberties. The aim, on this account, is not to erode rights but to restore public confidence that the state will act decisively to protect citizens while remaining answerable to Parliament and the public.
With a high-profile deployment imminent and a judicial review in motion, the episode is likely to intensify those demands for statutory oversight and independent scrutiny of biometric systems used in public spaces. Critics will press for a legal framework that can reliably prevent misidentification and bias, while supporters will argue that, in an era of serious violence, refusing to equip the police with proven tools is a luxury the public cannot afford.
Ultimately, the Notting Hill controversy raises a fundamental question for a country seeking security without surrendering freedoms: can the state harness advanced policing technologies in a manner that genuinely enhances safety, while preserving the liberties that underpin a free society? That tension—between urgent security needs and the protections of civil rights—will define how future policing technologies are deployed, scrutinised, and governed.
Source: Noah Wire Services
- https://www.theguardian.com/culture/2025/aug/19/met-chief-rejects-calls-scrap-live-facial-recognition-notting-hill-carnival – This Guardian piece reports Metropolitan Police Commissioner Mark Rowley’s rebuttal to calls for the abandonment of live facial recognition (LFR) at Notting Hill Carnival. Rowley wrote that the technology will be used in a non-discriminatory way, that the algorithm has improved following earlier trials, and defended its role in locating a small minority responsible for serious crimes. The article outlines campaigners’ concerns — raised by 11 anti-racist and civil liberties organisations — about racial bias, lack of government legislation and a pending legal challenge by Shaun Thompson, who alleges wrongful identification by LFR. It notes previous problematic trials in 2016–17.
- https://news.met.police.uk/news/notting-hill-carnival-update-on-incidents-and-arrests-487332 – This Metropolitan Police news release gives a factual breakdown of incidents and arrests at the 2024 Notting Hill Carnival. It records eight reported stabbings, numerous assaults on officers and a total of 349 arrests over the weekend, with a full offence-by-offence table. The release details categories including possession of offensive weapons, assaults on emergency workers, sexual offences and drug offences, and specifies numbers for Sunday and Monday. The statement was issued by the force to inform the public about policing outcomes at the event and to provide transparency on the scale and type of criminality dealt with during the carnival weekend.
- https://news.met.police.uk/news/met-appeals-for-publics-help-to-keep-carnival-safe-in-2025-499483 – This Metropolitan Police announcement outlines operational plans for policing the 2025 Notting Hill Carnival and appeals for public assistance. It confirms the force will deploy live facial recognition cameras on approaches to and from the carnival, outside its boundary, as part of intelligence-led preventative activity. The statement explains preparatory arrests and searches aimed at stopping weapons and serious offenders before they arrive, and describes the use of screening arches and stop-and-search at busy entry points. The release frames LFR as one element within a broader strategy to deter violence and keep attendees safe while inviting public vigilance and cooperation.
- https://bigbrotherwatch.org.uk/press-releases/big-brother-watchs-response-to-facial-recognition-at-notting-hill-carnival/ – Big Brother Watch’s press release articulates the organisation’s objection to the Metropolitan Police’s planned use of live facial recognition at Notting Hill Carnival. Quoting interim director Rebecca Vincent, it argues that LFR is invasive, carries a documented risk of bias against minority groups and should not be deployed at a cultural celebration of the African-Caribbean community. The statement highlights the absence of a clear legislative framework, the potential for mass biometric capture without consent, and announces the group’s support for legal action and crowdfunding efforts to challenge police use of LFR, emphasising civil liberties and the need for accountable oversight.
- https://www.hrw.org/news/2019/06/06/history-shows-why-police-use-facial-recognition-tech-can-threaten-rights – Human Rights Watch examines the risks posed by police use of facial recognition technology, arguing it can threaten fundamental rights. The piece summarises research showing higher error rates for women and people of colour and highlights how misidentification can lead to wrongful stops, privacy intrusions and chilling effects on freedom of expression and association. Drawing historical parallels with past abuses of data for discriminatory policing, the report calls for stringent limits on the technology’s use and urges companies and governments to curb deployments until robust safeguards, oversight and legal frameworks are established to protect human rights.
- https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/lawful-basis/biometric-data-guidance-biometric-recognition/biometric-recognition/ – This guidance from the UK Information Commissioner’s Office explains how biometric recognition and biometric data are treated under UK data protection law. It clarifies that biometric data used for unique identification is special category personal data under the UK GDPR, outlining lawful bases and the need for a separate condition to process such sensitive information. The page discusses how and when explicit consent might be required, describes lawful processing conditions, emphasises necessity and proportionality, and provides practical advice for organisations and public authorities considering biometric recognition systems, stressing legal obligations and safeguards to protect individual rights.
Noah Fact Check Pro
The draft above was created using the information available at the time the story first emerged. We’ve since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.
Freshness check
Score: 10
Notes: The narrative is current, published on 19 August 2025, and pertains to the upcoming Notting Hill Carnival. No evidence of recycled or outdated content was found. The report is based on a recent press release from the Metropolitan Police Commissioner, which typically warrants a high freshness score.
Quotes check
Score: 10
Notes: The direct quotes from Mark Rowley, the Metropolitan Police Commissioner, are unique to this report. No identical quotes were found in earlier material, indicating original content.
Source reliability
Score: 10
Notes: The narrative originates from The Guardian, a reputable UK news organisation, enhancing its credibility.
Plausibility check
Score: 10
Notes: The claims regarding the deployment of live facial recognition technology at the Notting Hill Carnival are plausible and align with previous discussions on the topic. The narrative provides specific details, including the Met’s response to concerns about racial bias and legal challenges, which are consistent with known information.
Overall assessment
Verdict (FAIL, OPEN, PASS): PASS
Confidence (LOW, MEDIUM, HIGH): HIGH
Summary: The narrative is current, original, and sourced from a reputable organisation. The claims are plausible and supported by specific details, with no evidence of disinformation or recycled content.