Twenty years on the run. Five seconds on camera
The arrest of a woman using a two-decade-old image taken when she was a teenager has raised fresh questions about just how powerful modern facial recognition systems have become.
When the Metropolitan Police Service (MPS) announced the results of its six-month live facial recognition (LFR) trial in Croydon earlier this week, the headline figures already appeared impressive.
Over the course of 24 operations there were 173 arrests – the equivalent of one every 35 minutes. The force also said crime in the area fell by 10.5 per cent during the pilot period, including a 21 per cent reduction in violence against women and girls offences.
Some of the arrests made during the operation were highlighted with additional detail, including one which involved “a 36-year-old woman who had been unlawfully at large for more than 20 years and was wanted for failing to appear at court for an assault in 2004”.
The MPS has since confirmed to Police Professional that the image the system used to identify the woman was one taken in 2004, when she would have been just 16 years old.
It is a result likely to intensify debate among both supporters and opponents of the ever-expanding use of LFR in the UK. It also cuts directly against a substantial body of research suggesting that facial recognition systems become significantly less accurate as the time gap between the reference image and the live subject grows.
The question now is whether this single result represents a genuine leap in the capabilities of the technology, or whether something else is at play.
The MPS said the ability to make such a match would depend on the quality of the original photograph, and that the kind of biometric data used in facial recognition, such as the distance between the eyes, the jawline and the shape of the nose, doesn’t radically alter with age. But studies suggest otherwise.
Decline over time
In 2022, New Scientist reported on the work of a team of researchers led by PhD candidate Marcel Grimmer at the Norwegian University of Science and Technology. To see how facial recognition systems coped with the passage of time, the team created 50,000 AI-generated human faces and subjected them to synthetic ageing.
Using open-source software (commercial systems guard their algorithms closely), the team concluded that after just five years of ageing, identifications began to fail. After 20 years, the chance of a positive identification was almost zero.
However, synthetic ageing studies do not always replicate the variability and image quality found in operational policing environments.
Other research suggests that, for commercial, state-of-the-art systems, face recognition accuracy remains stable for up to 10 years – but that figure relates to adult subjects over the age of 18.
Studies on the specific problem of matching a juvenile image to an adult face have found that even high-end machine-learning systems produce relatively poor accuracy when the time gap spans the transition from adolescence into adulthood – precisely the age range involved in the MPS case.
The key issue is one of biology. Adolescence often involves rapid and sometimes substantial changes in bone structure, soft-tissue distribution and skin texture, whereas adult faces follow a more stable trajectory. Live facial recognition works by analysing facial geometry rather than relying solely on surface appearance – and the transition to adulthood can shift precisely that geometry.
Not everyone ages in the same way, however, and some people naturally retain more of the structural geometry of their teenage years as time goes by. The woman caught in Croydon may simply have fallen into that category.
We also know that the algorithms used in current LFR deployments are considerably more sophisticated than those evaluated in research from even a few years ago.
Deep learning architectures trained on very large datasets have been shown to extract identity-diagnostic features that are more robust to ageing than earlier systems.
Research from the University of Galway has demonstrated that models fine-tuned with synthetic ageing data show meaningfully improved performance across large age intervals, including ten, twenty, and thirty-year gaps. The commercial systems now deployed by UK forces are likely to be operating at the leading edge of this capability.
It’s also important to bear in mind that LFR in a policing context doesn’t need to be perfect, because it doesn’t make decisions in isolation. The systems flag potential matches and prompt human officers to approach and verify. In the words of the National Police Chiefs’ Council, facial matches should be treated as intelligence, not fact.
Evidence of this emerged during the Croydon pilot when the system produced a false alert. The individual concerned was spoken to by officers and then released.
According to the MPS, this was the one and only false alert during the entire trial period, despite the faces of some 470,000 members of the public being captured by the cameras.
Such results are particularly notable given continuing concerns about demographic bias in facial recognition systems – concerns that led Essex Police to pause its use of the technology.
The force had commissioned academics at the University of Cambridge to conduct a study in which 188 actors walked past cameras deployed from marked police vans in Chelmsford. The results showed that about half of the people on a watchlist were correctly identified and that false positives were extremely rare.
However, the report noted that four of the six people misidentified in the study were Black, despite Black people representing only one in four of the sample. The researchers said this was “unlikely to be due to chance alone”.
Last month, Liberty Investigates and the Guardian reported that Thames Valley Police had arrested a man for a burglary in a city 100 miles (161km) away that he had never visited, after retrospective face-scanning software confused him with another person of south Asian heritage.
The man, Alvi Choudhury, was flagged despite the CCTV footage showing a suspect at least ten years younger. Choudhury was held in custody for nearly 10 hours and plans to take legal action against the police.
Regulation catches up
As the operational capabilities of LFR continue to expand, pressure is also growing for a clearer legal framework governing how the technology is deployed.
Facial recognition is currently in use by 13 police forces: London, South Wales and North Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.
More are set to follow. In January, the home secretary, Shabana Mahmood, announced the number of LFR vans would increase five-fold, with 50 available to every police force in England and Wales.
Civil liberties groups such as Big Brother Watch and Liberty have long expressed concern about the lack of oversight and regulation in the use of both LFR and retrospective scanning software.
The Croydon case may also reignite debate around proportionality, particularly concerning how long individuals remain on watchlists and what level of offending justifies inclusion in proactive facial recognition deployments.
These concerns are set to be addressed in part by new provisions introduced during the King’s Speech. These will: “Establish a new legal framework to underpin law enforcement use of facial recognition and similar technologies, making it clear when use of these technologies can be justified, including creation of a single, expert regulatory body to provide independent advice and oversight.
“This will be world-leading and is essential for boosting public and policing confidence in the use of these innovative technologies, which has the potential to transform crime outcomes while also generating major efficiencies.”
A statement by Liberty welcomed the announcement but warned: “The new framework will need to include clear and consistent rules around how the police use facial recognition to ensure the rights of the public are protected at all times. We will also need to see rigorous oversight from the new regulatory body.”