‘Unethical’ not to use LFR as a force for good, says former Surveillance Camera Commissioner
The former Surveillance Camera Commissioner has questioned suggestions that the use of live facial recognition (LFR) in public should be suspended until a new statutory framework and code of practice is in place.
The recommendation was among ten made in the recently published Ryder Review on the governance of biometric data and LFR in England and Wales.
But while Tony Porter agreed it was “unethical to use LFR badly”, he argued: “At the same time, I believe it is unethical not to use it when it can save lives, reduce heavily pressed law enforcement resources, protect children from traffickers and act as a force for good.
“Legislation can take years to enact. A strong regulator that is already in existence and highly regarded can be introduced tomorrow.”
The independent legal review was commissioned by the Ada Lovelace Institute and led by Matthew Ryder QC after the Commons Science and Technology Select Committee had called for “an independent review of options for the use and retention of biometric data” in 2020.
In his foreword to the review, Mr Ryder said the increasing use of LFR was perhaps the “clearest example of why a better legal and regulatory framework for biometric data is needed urgently”.
“But LFR is merely the technology that has the most focus currently,” he said. “The concerns it raises apply in numerous other areas.
“As we have set out in the review, a new regulatory framework must be applicable to a range of biometric technologies, rather than simply react in a piecemeal way to each new development.
“We found such research to be significantly lacking, due to the particular focus thus far on biometric data use by public authorities, particularly LFR by law enforcement.”
Mr Ryder said a “key concern”, both before and after the review was commissioned, was police use of LFR technology.
It received considerable public attention following the Metropolitan Police Service’s deployment of LFR at Notting Hill Carnival in 2017 and South Wales Police’s piloting of the same technology in 2017/18.
In 2019, the Biometrics and Forensics Ethics Group noted the lack of independent oversight and governance of LFR. In 2019 and 2020, the Divisional Court and the Court of Appeal gave judgments on the lawfulness of the South Wales deployments, with the Court of Appeal finding that there was an “insufficient legal framework” around the deployment of LFR to ensure compliance with human rights.
The review noted that public and legal concerns around LFR have not diminished, but have increased substantially during the course of the inquiry.
“As recently as October 2021 the European Parliament voted overwhelmingly in favour of a resolution calling for a ban on the use of facial recognition technology in public places,” it added.
The need for an independent legal review of the governance of biometric data was prompted by the increasing use of technologies that capture, analyse and compare biometric data by the police, public authorities and private companies in a range of settings, from public spaces to workplaces.
Growth in machine learning, camera and sensor technologies has led to increasingly widespread use of biometric data (metrics related to human physiological or behavioural characteristics) to identify or authenticate individuals or classes of people, or to measure and categorise their behaviour, said the review.
It noted that the range of biometric data that may be recognised or processed includes facial characteristics, fingerprints, iris prints, hand or footprints, gait and DNA.
“Recent legislative changes around the use and processing of personal data – including the EU General Data Protection Regulation (GDPR), which is mirrored by the UK General Data Protection Regulation, and the forthcoming EU AI Draft Regulation (the ‘AI Act’) – have not brought sufficient clarity to the regulation of biometric technologies,” said the review.
“There remains uncertainty as to when, if at all, techniques such as LFR can be used in accordance with the law, and how the use of biometric data should be regulated.”
Mr Ryder said the review sought to “address that uncertainty” by assessing the existing legal and regulatory framework and by making ten recommendations to protect fundamental rights (particularly data and privacy rights), including:
- A new, technologically neutral, statutory framework which sets out the process that must be followed, and considerations that must be taken into account, by public and private bodies before biometric technology can be deployed against members of the public;
- Legislation that covers the use of biometrics for unique identification of individuals, and for categorisation (also known as classification);
- Sector and/or technology-specific codes of practice, including, as soon as possible, a legally binding code of practice governing the use of LFR. The use of LFR in public should be suspended until the statutory framework described above is in place. This framework should supplement, and not replace, existing duties under the Human Rights Act 1998, Equality Act 2010 and Data Protection Act 2018; and
- The establishment of a national Biometrics Ethics Board with a statutory advisory role in respect of public-sector biometrics use.
The review references a number of recommendations made by Mr Porter while he was Commissioner, including recommendations for law enforcement on authorising officers and a separate ethics panel.
Mr Porter, now chief privacy officer at Corsight AI, commented: “Matthew Ryder QC and his team have produced a well-balanced and thought-provoking report relating to the governance of biometric data in England and Wales. There are a number of recommendations included in this report that will be crucial to the better understanding and ethical use of LFR moving forward – particularly the call for a new legal framework.
“While Bridges v South Wales Police recognised that there exists a framework that supports lawful operation of LFR, the court found that its application was complex and requires simplification. Ryder’s report helpfully supports this proposal with a further recommendation that regulation be simplified to provide clarity.
“Currently the Information Commissioner’s Office (ICO), Biometrics and Surveillance Camera Commissioner, Investigatory Powers Commissioner’s Office and others have a finger in this pie. I believe that this role should be performed by a dedicated specialist.”
However, he added: “Where I take issue with Ryder is on recommendation 5: the use of LFR in any circumstance should be suspended until a new statutory framework and code of practice is in place. This recommendation references, amongst other arguments, the prohibition within several states in the US as evidence that we need to reconsider our position in the UK.
“We are seeing several of those states retract those prohibitions (Washington, Virginia and New York). The evidence of the use of this technology being seen as a force for good is compelling. Greater confidence across the pond will also be supported by the release of the draft Federal Bill (American Data and Privacy Act). This will support the US in emerging from a 53-state approach to regulation to a harmonised approach at Federal level.
“Even the EU AI draft Act (April 2021) recognises the many circumstances where LFR will save lives and ought to be used. We see the emergence of a database supporting Interpol across the 27 European States. We also see the UK loosening its GDPR shackles with proposals to amend UK data legislation.”
The Biometrics and Surveillance Camera Commissioner is currently gathering the latest information from all police forces under his jurisdiction on their use of overt surveillance camera systems.
Professor Fraser Sampson has written to the chief officers of all 43 geographical forces in England and Wales, the Ministry of Defence, British Transport Police and the Civil Nuclear Constabulary, asking for details of their use and governance of all overt surveillance camera systems deployed in public places.
The survey covers all facial recognition-enabled systems, drone-mounted camera systems, helicopter- or aeroplane-mounted systems, body-worn video cameras, ANPR (automatic number plate recognition) systems and any other surveillance camera systems in public places that fall within the definition of section 29(6) of the Protection of Freedoms Act 2012.
The survey asks about the capabilities of systems, whether they use equipment from non-UK suppliers about which there have been ethical or security concerns, what due diligence they have undertaken to ensure they are working with trusted partners, and how their systems comply with the Home Secretary’s Surveillance Camera Code, which they have a legal duty to observe.
About facial recognition in particular, Professor Sampson’s survey asks forces whether they currently use this technology and, if so, whether it is live (real-time) or retrospective, and whether it is initiated by officers using cameras on their mobile phones or some other kind of system. If none is currently in use, the survey asks whether the force intends to start using facial recognition technology in the future.
Professor Sampson said: “There is little doubt that the police use of surveillance camera systems in the public sphere has been increasing in recent years.
“This survey will provide an important snapshot of what kinds of overt surveillance camera systems police are using, what they are being used for, and the extent to which facial recognition technology is now being used. It should also tell us whether police forces are complying with the new Surveillance Camera Code as they should be.
“It will be very interesting to see how much things have changed since similar surveys were conducted in 2017 and 2019 by my predecessor in the role of Surveillance Camera Commissioner.”
The Government’s revised Surveillance Camera Code of Practice came into force in January this year and emphasises the importance of the legitimate use of technology “to a standard that maintains public trust and confidence”.