Eleven civil‑liberties and anti‑racist organisations have written to the Metropolitan Police commissioner urging him to abandon plans to deploy live facial recognition (LFR) at next weekend’s Notting Hill Carnival, warning the technology is “riven with racial bias” and that its use is already subject to a High Court legal challenge. Campaigners say deploying instant face‑matching cameras at an event that celebrates the African‑Caribbean community will deepen mistrust of policing and risk reproducing discriminatory harms.
The open letter, signed by groups including the Runnymede Trust, Liberty, Big Brother Watch, Race on the Agenda and Human Rights Watch, argues LFR is less accurate for women and people of colour and that using it at Carnival “unfairly targets the community that carnival exists to celebrate”. The signatories point to the Met’s troubled history on race — including Baroness Casey’s finding of institutional racism — as a reason why introducing intrusive biometric surveillance at such an event is particularly fraught.
Independent technical studies are central to the campaigners’ case. The National Physical Laboratory (NPL), commissioned to test operational systems used by the Met and South Wales Police, found that some algorithm settings produced statistically indistinguishable performance across demographic groups, but that lowering the face‑match threshold increased false positives and produced bias against Black people. Earlier academic work — notably the 2018 Gender Shades project — also demonstrated large disparities in error rates by gender and skin tone, with the worst performance seen for darker‑skinned women while light‑skinned men experienced far lower error rates. Campaigners say those findings together show the accuracy of LFR is highly sensitive to how police configure and operate the system.
The prospect of misidentification is not hypothetical. A High Court challenge brought by Shaun Thompson, and publicised by campaigners, describes how he was stopped, detained for about half an hour and asked for fingerprints after an LFR alert, despite producing identity documents. Thompson has described the experience as akin to “stop and search on steroids”, and Big Brother Watch framed the legal action as part of a broader concern over discriminatory impacts and a lack of robust safeguards.
The Met has defended its operational plans. In a police statement the force said LFR cameras will be sited on the approaches to and exits from the carnival, outside the event’s formal boundaries, and used to identify missing persons, people wanted by the courts and those subject to sexual harm prevention orders. The force points to internal procedures intended to limit harms — including human review of any LFR alert and the deletion of biometric data where no match is found — and to arrest figures it attributes to the technology. Matt Ward, the deputy assistant commissioner overseeing policing for Carnival, told the Guardian that LFR is “a reliable and effective tool” and stressed that independent testing had found the system accurate at the thresholds the Met uses, while acknowledging public misconceptions about the technology in Black and other minority ethnic communities.
That operational defence sits against a wider government push to expand LFR. The Home Office has announced a neighbourhood policing programme that rolls out additional LFR vans to several forces, describing the tool as an intelligence‑led means of identifying high‑harm offenders and promising work on legal safeguards and oversight. The Home Secretary has signalled an intention to draw up a new legal framework for the technology, but campaigners say the promise of future rules does not address current harms or the absence of statutory limits on how and where police may deploy biometric surveillance.
Campaigners also point to examples of past police uses that appear to have strayed from the most serious‑offender framing: civil society groups have raised concerns about LFR being used on occasions such as to target ticket touts, and independent testing shows thresholds and operational settings materially affect who is misidentified. On the ground at Carnival, the Met will deploy screening arches at busy entry points and reserve stop‑and‑search powers; critics warn those powers combined with LFR could amplify disproportionate enforcement in narrow streets where large crowds gather.
With an estimated 2 million people attending the two‑day event and around 7,000 police officers and staff due to be deployed each day, the controversy is likely to intensify in the coming days. Campaigners have urged the commissioner to halt any deployment until the High Court has ruled and until clearer statutory safeguards are in place; meanwhile the Met says it will proceed with the operation it believes balances public safety with procedural protections. The outcome of the legal challenge and any ministerial moves on rules for LFR will determine whether that balance holds or is reset by the courts or Parliament.
Reference Map:
- Paragraph 1 – [1], [2]
- Paragraph 2 – [1], [2]
- Paragraph 3 – [4], [5], [1]
- Paragraph 4 – [6], [1]
- Paragraph 5 – [3], [1]
- Paragraph 6 – [7], [1]
- Paragraph 7 – [1], [4], [3]
- Paragraph 8 – [1], [6], [2], [7]
Source: Noah Wire Services
- https://www.theguardian.com/uk-news/2025/aug/16/facial-recognition-cameras-too-racially-biased-to-use-at-notting-hill-carnival-say-campaigners – Please view link – unable to access data
- https://www.theguardian.com/uk-news/2025/aug/16/facial-recognition-cameras-too-racially-biased-to-use-at-notting-hill-carnival-say-campaigners – This Guardian report describes an open letter from eleven civil liberties and anti‑racist organisations asking the Metropolitan Police commissioner to abandon plans to deploy live facial recognition (LFR) at the Notting Hill Carnival. It details campaigners’ concerns that LFR is racially biased and less accurate for women and people of colour, and notes a High Court legal challenge brought by Shaun Thompson after an alleged mistaken identification. The story cites independent testing by the National Physical Laboratory and the MIT Gender Shades study, reports Met statements on arrests attributed to LFR and records Home Secretary Yvette Cooper’s intention to draw up a new legal framework.
- https://news.met.police.uk/news/met-appeals-for-publics-help-to-keep-carnival-safe-in-2025-499483 – The Metropolitan Police news release outlines policing plans for the 2025 Notting Hill Carnival, confirming LFR cameras will be used on approaches to and exits from the event but outside the carnival boundaries. It explains operational aims — to identify wanted people, missing persons and those subject to sexual harm prevention orders — and describes screening arches, stop‑and‑search powers, and partnerships with Crimestoppers. The statement sets out procedural safeguards, including officer review of any LFR alert and deletion of biometric data where no match is found, and gives deployment figures and context for how the force says the technology will be applied.
- https://science.police.uk/delivery/resources/operational-testing-of-facial-recognition-technology/ – This police science delivery page summarises independent operational testing of the Metropolitan Police and South Wales Police facial recognition systems, referencing the National Physical Laboratory’s commissioned study. It explains the NPL methodology and findings: that for certain algorithm settings there was no statistically significant difference in demographic performance, while lower face‑match thresholds increased false positives and showed bias against Black people. The page describes distinctions between Live Facial Recognition (LFR), Retrospective FR and Operator‑Initiated FR, highlights practical caveats and links to the full NPL report for detailed technical results and recommended operational configurations.
- https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212 – MIT News summarises the 2018 Gender Shades study by Joy Buolamwini and colleagues, which evaluated commercial facial‑analysis systems and uncovered major disparities by gender and skin tone. Using a balanced dataset and the Fitzpatrick skin‑type scale, the research found error rates as high as about 34.7% for darker‑skinned women while error rates for light‑skinned men were below 1%. The piece explains the study’s methods, implications for systems trained on skewed datasets, and how the findings prompted industry efforts to rebalance training data and reassess benchmarks for fairness in facial recognition and related AI technologies.
- https://bigbrotherwatch.org.uk/press-releases/met-police-face-major-legal-challenge-over-use-of-live-facial-recognition-technology/ – Big Brother Watch’s press release announces a High Court challenge against the Metropolitan Police brought by Shaun Thompson and the campaign group, following an alleged wrongful identification by the Met’s live facial recognition system. The release recounts Thompson’s account of being stopped, detained for around half an hour and asked for fingerprints after an LFR alert, despite presenting identity documents. It frames the legal action as part of wider concerns about misidentification, discriminatory impacts on Black communities and the absence of robust legal safeguards, calling for urgent restraint on police use of LFR pending judicial resolution.
- https://www.gov.uk/government/news/live-facial-recognition-technology-to-catch-high-harm-offenders – This UK Government (Home Office) news item sets out plans to expand live facial recognition capability as part of a neighbourhood policing programme: ten new LFR vans will be rolled out across seven police forces. It describes the stated purpose of the technology — to identify high‑harm offenders such as those wanted by the courts or breaching sexual harm prevention orders — and says deployments will operate under defined rules and intelligence‑led use. The announcement also refers to planned work on legal safeguards and oversight intended to govern the expanded national deployment of LFR tools.
Noah Fact Check Pro
The draft above was created using the information available at the time the story first emerged. We’ve since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.
Freshness check
Score:
10
Notes:
The narrative is current, published on 16 August 2025, and addresses recent developments regarding the Metropolitan Police’s plans to deploy live facial recognition (LFR) at the upcoming Notting Hill Carnival. The concerns raised by civil liberties and anti-racist groups are timely and directly related to the imminent event. No evidence suggests that this content has been recycled or republished from earlier sources. The inclusion of updated data and references to recent events supports a high freshness score.
Quotes check
Score:
10
Notes:
The direct quotes from campaigners and organisations, such as the Runnymede Trust, Liberty, Big Brother Watch, Race on the Agenda, and Human Rights Watch, are unique to this report. No identical quotes have been found in earlier material, indicating that the content is original and not reused. The wording of the quotes varies from previous statements, suggesting that they are newly provided for this report.
Source reliability
Score:
10
Notes:
The narrative originates from The Guardian, a reputable and established news organisation known for its investigative journalism and commitment to accuracy. The inclusion of direct quotes from well-known civil liberties and anti-racist organisations further enhances the credibility of the report.
Plausibility check
Score:
10
Notes:
The claims made in the narrative are plausible and align with known concerns regarding the use of facial recognition technology by law enforcement. The Metropolitan Police’s plans to deploy LFR at the Notting Hill Carnival have been previously reported, and the concerns raised by campaigners about racial bias and legal challenges are consistent with ongoing debates about the technology’s use. The narrative provides specific details, such as the organisations involved and the content of their letter, which are verifiable and support the plausibility of the claims.
Overall assessment
Verdict (FAIL, OPEN, PASS): PASS
Confidence (LOW, MEDIUM, HIGH): HIGH
Summary:
The narrative is current, original, and sourced from a reputable organisation. The claims made are plausible and supported by verifiable details. No significant credibility risks have been identified, and the content appears to be accurate and trustworthy.