British Transport Police Launch Live Facial Recognition Trial at London Bridge Amid Growing Scrutiny
- Safer Highways
- Feb 24

British Transport Police (BTP) has begun a six-month live facial recognition (LFR) trial at London Bridge station, deploying AI-powered cameras to identify individuals wanted for serious offences.
The pilot, which went live on 11 February, marks the latest expansion of biometric surveillance across the UK’s transport network — and it is already drawing criticism from privacy campaigners.
The system operates within a clearly marked “recognition zone” inside the station. Cameras scan faces passing through the area and compare them against a pre-set watchlist of individuals sought in connection with serious crimes. If the algorithm flags a potential match, an officer reviews the alert before deciding whether to intervene.
BTP says the technology — powered by NEC’s NeoFace algorithm and independently tested by the National Physical Laboratory — is configured to minimise false positives and ensure fairness. According to police, biometric data belonging to individuals not on the watchlist is deleted almost immediately.
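In outline, the pipeline the police describe is a threshold-based matching loop with a human in the loop. The sketch below is purely illustrative and is not BTP's or NEC's implementation: real systems use proprietary face-embedding models, and the similarity measure, threshold value, and function names here are all assumptions made for the example.

```python
import math

# Assumed threshold; in practice this is tuned to limit false positives.
MATCH_THRESHOLD = 0.90

def cosine_similarity(a, b):
    """Similarity between two face embeddings (stand-in lists of floats)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def screen_face(live_embedding, watchlist):
    """Compare a captured embedding against a watchlist of wanted individuals.

    Returns the best candidate for officer review if it clears the
    threshold; otherwise returns None, mirroring the near-immediate
    deletion of non-match data that the police describe.
    """
    best_id, best_score = None, 0.0
    for person_id, ref_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, ref_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        # Alert only; an officer reviews before any intervention.
        return {"candidate": best_id, "score": best_score, "action": "officer review"}
    return None  # no match: biometric data discarded, no record kept
```

The key design point reflected here is that the algorithm only ever raises an alert; the decision to stop someone rests with the reviewing officer.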
Chief Superintendent Chris Casey, who is overseeing the pilot, said the trial aims to assess how the system performs in a live railway environment.
“This is a trial of the technology to assess how it performs in a railway setting,” he said. “The initiative follows a significant amount of research and planning and forms part of BTP’s commitment to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences.”
Passengers who do not wish to pass through the recognition zone are being offered alternative routes, and signage has been installed at participating stations.
Questions Over Accuracy and Oversight
Despite assurances from police, live facial recognition remains controversial.
Earlier AI monitoring deployments — including Transport for London trials focused on fare evasion — faced criticism after children walking closely behind parents were incorrectly flagged as potential offenders.
Civil liberties groups argue that such incidents demonstrate the inherent risks of automated surveillance in crowded environments.
The Metropolitan Police has used live facial recognition extensively over the past year, deploying the technology 231 times and scanning around four million faces. According to court submissions made during a recent judicial review, 801 arrests were made “specifically as a result of LFR.”
Representing the Met, Anya Proops KC described the task of locating wanted individuals in London as “akin to looking for stray needles in an enormous, exceptionally dense haystack.” She argued that the privacy intrusion is minimal because data from non-matches is deleted “a fraction of a second” after capture.
However, campaign group Big Brother Watch is challenging the expanding use of the technology. London resident Shaun Thompson is also pursuing legal action after he was incorrectly identified by facial recognition cameras in 2024 and detained for around 30 minutes before being released.
Matthew Feeney of Big Brother Watch said: “We all want train passengers to travel safely, but subjecting law-abiding passengers to mass biometric surveillance is a disproportionate and disturbing response. Facial recognition technology remains unregulated in the UK and police forces are writing their own rules.”
A Legal Grey Area
The legal framework governing facial recognition in the UK remains fragmented. A 2020 Court of Appeal ruling found that South Wales Police had deployed live facial recognition unlawfully, citing insufficient safeguards and inadequate impact assessments.
Since then, police forces have relied on a combination of common law powers, data protection legislation and human rights frameworks when using the technology. However, there is currently no single statute that specifically regulates live facial recognition.
A Home Office consultation on how the technology should be governed has yet to conclude — even as trials continue and deployments expand.
Meanwhile, ministers have signalled that artificial intelligence will play a central role in plans to modernise policing, suggesting that the use of biometric tools is likely to grow rather than recede.
Balancing Security and Privacy
Supporters argue that live facial recognition provides a powerful tool for identifying serious offenders in busy public spaces, potentially improving safety and efficiency. Critics counter that widespread biometric scanning of law-abiding citizens represents a step change in state surveillance.
The London Bridge trial will now serve as a test case not only for the technology’s operational effectiveness in transport hubs, but also for public tolerance of AI-driven policing in everyday environments.