Legal challenge to Scotland Yard's use of facial recognition technology

A campaign group has called for a judicial review to investigate the controversial use of automated facial recognition (AFR) technology.

Jul 25, 2018
By Serena Lander

The Metropolitan Police Service (MPS) confirmed today that it intends to use the technology again in East London this week, despite the legal challenge.

Big Brother Watch (BBW) and Baroness Jenny Jones have issued proceedings in the High Court against MPS Commissioner Cressida Dick and Home Secretary Sajid Javid, arguing that the use of AFR is contrary to Articles 8, 10 and 11 of the European Convention on Human Rights.

The crowdfunded legal challenge, which began yesterday (July 24), seeks a full judicial review of the facial recognition technology currently being used in London by the MPS.

The technology will be used on Thursday (July 26) for the second time in Stratford, East London. It was first deployed in this location last month in an effort to reduce rising violent crime in the area. It has also been used at the Notting Hill Carnival and on Remembrance Sunday, but those deployments have since been discontinued.

Anyone who declines to be scanned during the deployment will not be viewed as suspicious by police officers.

The MPS has said information leaflets were provided to the public, in keeping with the rest of its research into the technology's potential wider use, and that further checks are made to ensure any identification is a genuine match.

The same will happen on Thursday, says a spokesperson for the MPS.

However, a report by BBW on the technology quoted the Lord Bishop of St Albans, Alan Smith, who expressed concern during the first parliamentary debate on the topic: “I have taken the trouble to talk to a number of people over the last week to ask them of their awareness of this technology. I was very struck by the fact that hardly anybody I spoke to realised what was already going on. Some were horrified, some were puzzled and every one of them had questions and worries.”

Detective Superintendent Bernie Galopin said the MPS is trialling AFR at ten different events and locations ahead of a full evaluation at the end of this year.

South Wales Police has led the development of facial recognition in policing. It scanned more than 170,000 individuals who were in the Welsh capital for a Champions League football match in June 2017.

It became the first force to make a positive match using the technology, which led to the arrest of a 34-year-old man who was wanted on a recall to prison.

However, BBW claims it has obtained police figures revealing that 98 per cent of the MPS’s facial recognition “matches” wrongly identified innocent people.

In addition, its investigation is said to have revealed that, even when innocent people are wrongly “matched”, the police store biometric photos of those individuals for up to a year without their knowledge.

The campaign group raised more than £5,000 on the crowdfunding website CrowdJustice to begin the legal process.

Commander Ivan Balhatchet of the MPS said: “We believe it will be an extremely valuable tool to help keep London and its citizens safe, alongside other tactical methods we use. The public would rightly expect our use of this technology to be rigorously scrutinised and used lawfully.

“A comprehensive legal framework, which supports the Met’s use of facial recognition technology, is due to be published on the force’s website in the next couple of weeks. The new webpage will also provide information about why the Met is trialling the technology, where and when it has been used and how we will engage with Londoners during the deployments. Police officers stationed on the facial recognition deployments, which are all overt and advertised, are also available to answer any questions from members of the public about how the technology is used.

“The locations of the trials change to test the technology in a variety of different scenarios. Deployments do not focus on particular communities.

“Representatives from the Mayor’s Office for Policing and Crime (MOPAC) and the London Ethics Panel have been invited to observe the next deployment of the technology.”

Rosa Curling, solicitor at Leigh Day, representing both claimants, said: “Not only has the public raised concerns about the use of AFR but so too has the Information Commissioner and the Biometrics Commissioner. The latter has said that automated facial recognition is ‘very intrusive of personal freedom’ and that the ‘lack of governance is leaving a worrying vacuum’.

“The Home Secretary has failed to show that the use of AFR is either proportionate or necessary in our democratic society. Our clients hope the issuing of proceedings will result in an immediate halt of its use by the police and reconsideration by both the police and Home Office as to whether it is suitable to use in the future.”

