Police warned to go slow on use of facial recognition

Nov 1, 2019
By Tony Thompson

The Information Commissioner’s Office (ICO) has urged the police to hold back on their use of live facial recognition (LFR) in public places until the Government provides new legal guidance for the technology.

The intervention by Elizabeth Denham, the Information Commissioner, follows September’s High Court ruling that South Wales Police did not breach data protection laws by using facial recognition software, in the first legal case of its kind.

Ms Denham said her opinion differed from the High Court’s in “some areas” and that the outcome of the court hearing should not be seen as a blanket authorisation. She added that the ICO hoped to work with the Home Office and other agencies to create a new statutory code that ensures the technology’s use by law enforcement is legal.

In a blog post published yesterday (October 31), Ms Denham said that the ICO’s investigation into LFR use by the Metropolitan Police Service (MPS) and South Wales Police had raised “serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.”

“We found that the current combination of laws, codes and practices relating to LFR will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents,” she wrote.

The police should “seek to raise the standards [of data protection] beyond those set out in the judgement” in certain situations, she continued, “in order to ensure public confidence in this technology”.

Current ICO regulations stipulate that police must follow current data protection laws during trials and full deployment, and that the use of facial images constitutes “sensitive processing” under this legislation. This applies whether an image produces a match on a watchlist or if it is subsequently deleted.

Data controllers must identify a lawful basis for the use of LFR and data protection laws apply to the whole process – “from consideration about the necessity and proportionality for deployment, the compilation of watchlists, the processing of the biometric data through to the retention and deletion of that data”.

Ms Denham has repeatedly stressed her belief that facial recognition could be unnecessarily intrusive and cause unwarranted police intervention. The ICO believes a new legal code is required to ensure that use of the technology is proportionate and minimises the risk of privacy breaches.

One area of concern relates to the legal basis for retaining large databases of suspects’ photographs, which provide the watchlists against which live surveillance images are matched. Ms Denham hopes to build on the standards established in the Surveillance Camera Code to create a “statutory and binding code of practice” before facial recognition is more widely deployed.

There have been numerous complaints from lawmakers, human rights groups and the public about how police use the technology in public spaces, with many arguing that trials have been run covertly and that members of the public who cover their faces are assumed to be hiding something.

Big Brother Watch released a report last year claiming the LFR systems being used by the MPS were 98 to 100 per cent inaccurate.

However, much of the British public appears to be less concerned about possible invasions of privacy in catching criminals and terrorists. An ICO survey of 2,202 adults in the UK earlier this year found that 82 per cent said it was “acceptable” for the police to use live facial recognition, with 72 per cent agreeing to its use on a “permanent basis in areas of high crime”.

Copyright © 2024 Police Professional