Building trust in technology
Emily Keaney, Deputy Commissioner for Regulatory Policy at the ICO, on why data protection lies at the heart of responsible police use of facial recognition technology.
Facial recognition technology (FRT) plays a significant role in modern-day policing. At the ICO, we are scrutinising how it is used in practice to ensure police forces comply with data protection law.
For FRT to continue to be a mainstay of modern policing, the general public in England and Wales must trust that it doesn’t put their civil liberties at risk. Our recent research reinforces this. People name accuracy (53%), proper officer training (35%) and safeguards against bias (33%) as the top factors in regulating police use of FRT.
Our latest audits
Our work includes a series of audits of police forces using FRT, most recently Essex Police and Leicestershire Police. We’ve identified a number of areas of good practice, as well as some areas for improvement, and are continuing to engage with the forces as they implement our recommendations.
We audited Leicestershire Police’s use of retrospective facial recognition (RFR), where images are compared against an existing database, as live facial recognition (LFR) wasn’t being used there at that time.
Before our audit, Essex Police had paused its live facial recognition deployments after identifying potential accuracy and bias risks. We continue to work with the force to ensure these risks are addressed.
These audits form part of our AI and biometrics strategy and reflect our role as an active regulator committed to ensuring police use of facial recognition technology is balanced, responsible and respectful of people’s rights.
What we’ve learnt
We’ll publish an outcomes report later this year, with lessons applicable across all forces.
What’s clear from this work so far is that robust data protection must sit at the heart of all police use of FRT. Forces may use the technology in different ways, but all must fully understand the systems they rely on and anchor their approach in strong governance: clear policies, defined roles and processes to ensure personal data is handled lawfully and securely.
If police forces can’t get their governance right, it’s unlikely they’ll get FRT right.
All forces should also conduct routine testing for bias and discriminatory outcomes – whether arising from technology design, training data or watchlist composition. Without this, there is a real risk of unfairness. Our audits also identified opportunities to strengthen staff training and keep FRT policies consistently updated.
Engaging with the Home Office
In December, the Home Office also identified historic bias within the algorithm used for RFR searches on the Police National Database. We are in ongoing dialogue with them about these issues and their wider implications.
Last month we also published our response to the Home Office’s consultation on a new legal framework for biometrics and facial recognition. We recognise that greater legal specificity, including clearly defined, objective requirements and safeguards set out in legislation, could support more consistent police use of these technologies. But any new framework must build on – not replace – existing data protection law.
Building on data protection foundations
Our core message is simple: data protection must remain central to the governance of biometric technologies. It is a useful tool for police, providing essential safeguards, ensuring proportionality, and protecting people’s rights while enabling effective law enforcement. Data protection laws are technology-neutral and designed to interact with other statutory frameworks. We have been clear that any new regime must build on these foundations – not replace them.
This blog post was originally published on the ICO’s website.

