Flaws in recognition technology could ‘change face of policing’, claims group at campaign launch

Forces trialling the ‘new DNA’ of law enforcement have promised “full and overt” evaluations later in the year as campaigners demand that police immediately halt their use of automated facial recognition (AFR) technology.

May 16, 2018
By Nick Hudson

Civil liberty group Big Brother Watch launched a high-profile campaign on Tuesday (May 15), backed by senior Labour Home Office political figures, after publishing a report claiming its investigation found police data matches to be “almost entirely inaccurate”.

On average, some 95 per cent of ‘matches’ wrongly identified innocent people, with Britain’s biggest police force returning the worst record: false positives in more than 98 per cent of alerts generated.

Out of 104 alerts, the Metropolitan Police Service (MPS) correctly identified only two people using the technology – neither of whom was a wanted criminal.

The report claims AFR technology is being used by UK forces “without a clear legal basis, oversight or governmental strategy, despite its potential to infringe on civil liberties and fundamental rights”.

In calling for the Home Office to remove the thousands of images of unconvicted individuals from the Police National Database, Big Brother Watch director Silkie Carlo said: “Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK.

“Members of the public could be tracked, located and identified – or misidentified – everywhere they go.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.”

The pressure group head added: “It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”

Big Brother Watch submitted Freedom of Information requests to every police force in the UK. Two – the MPS and South Wales Police (SWP) – acknowledged they were currently using AFR cameras.

The MPS’s figures showed less than two per cent accuracy, with more than 98 per cent of matches wrongly identifying innocent members of the public.

The system used by SWP has returned more than 2,400 false positives in 15 deployments since June 2017. The vast majority of those came during that month’s UEFA Champions League final in Cardiff, and overall only 234 alerts – nine per cent – were correct matches.

Just 0.005 per cent of its matches led to arrests – 15 in total. The force has stored biometric photos of all 2,451 people wrongly identified by the system for 12 months, but said the images were held only as part of an academic evaluation and not for any policing purpose.
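
The quoted percentages follow directly from the raw alert counts above. The short Python sketch below recomputes them; the figures come from the report as described in this article, while the function and variable names are purely illustrative.

```python
# Recompute the per-force false-positive rates from the alert counts
# quoted above (all names here are illustrative, not from any real system).

def false_positive_rate(total_alerts: int, correct_matches: int) -> float:
    """Share of alerts that wrongly flagged an innocent person."""
    return (total_alerts - correct_matches) / total_alerts

forces = {
    # force: (total alerts, correct matches)
    "Metropolitan Police Service": (104, 2),
    "South Wales Police": (2451 + 234, 234),  # false positives plus correct matches
}

for name, (alerts, correct) in forces.items():
    print(f"{name}: {false_positive_rate(alerts, correct):.1%} false positives")

# Output:
# Metropolitan Police Service: 98.1% false positives
# South Wales Police: 91.3% false positives
```

The 95 per cent average quoted earlier is consistent with the simple mean of the two forces’ rates: (98.1 + 91.3) / 2 ≈ 94.7, which rounds to the figure reported.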

The report found that out of the 35 forces that responded to the FoI request, not one was able to say how many photos of potentially innocent people are held in its custody image database.

In response, the MPS, which is planning seven more AFR deployments this year, said it is only trialling facial recognition technology, which was used at the past two Notting Hill Carnivals and the 2017 Remembrance Sunday service “to assess if it could assist police in identifying known offenders in large events, in order to protect the wider public”.

The force told Police Professional it did not consider there was an issue over ‘false’ positive matches as “additional checks and balances are in place to confirm identification following system alerts”.

All alerts against the watchlist are deleted after 30 days and faces in the video stream that do not generate an alert are deleted immediately.
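
The policy described above reduces to two rules: faces that produce no watchlist alert are never stored, and alert records expire after 30 days. The sketch below is purely hypothetical – none of these names or structures come from the MPS system – and only restates that retention logic in code.

```python
from datetime import datetime, timedelta

# Retention window for watchlist alerts, as described in the MPS statement.
ALERT_RETENTION = timedelta(days=30)

def handle_face(face_record, matched_watchlist: bool, alert_store: list) -> None:
    """Store a record only if it generated a watchlist alert."""
    if not matched_watchlist:
        return  # faces that generate no alert are deleted immediately, i.e. never stored
    alert_store.append({"face": face_record, "created": datetime.utcnow()})

def purge_expired(alert_store: list) -> list:
    """Drop alert records older than the 30-day retention window."""
    cutoff = datetime.utcnow() - ALERT_RETENTION
    return [a for a in alert_store if a["created"] >= cutoff]
```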

The MPS statement added: “All our deployments during the trial have been and will be overt, with information disseminated to the public, and will be subject to full evaluation at the conclusion of the trial which is expected to be in around late 2018.

“Whilst we are trialling this technology we have engaged with the Mayor’s Office for Policing and Crime (MOPAC) Ethics Panel, the Home Office Biometrics and Forensics Ethics Panel, the Surveillance Camera Commissioner, the Information Commissioner, the Biometrics Commissioner, and Big Brother Watch. Liberty were invited to observe its use at the carnival last year.

“There have been no arrests resulting from the use of facial recognition technology.”

SWP Chief Constable Matt Jukes pointed to the “reality” of the technology assisting at major sporting events in crowded places that are potential terror targets, adding: “We don’t take the use of it lightly and we are being really serious about making sure it is accurate.”

On the workings of AFR, an SWP spokesperson added that it is straightforward for an officer to “quickly establish if the person has been correctly or incorrectly matched by traditional policing methods”.

AFR has become a battleground: to policing it looks like the next big leap, while privacy campaigners see it as the next big assault on civil liberties. The Big Brother Watch report suggests the promised benefits are failing to materialise for a simpler reason – the technology does not work.

The pressure group’s call for an immediate stop to the use of automated facial recognition software with surveillance cameras is backed by Tottenham MP David Lammy and 15 campaign groups including Article 19, the Football Supporters’ Federation, Index on Censorship, Liberty, Netpol, the Police Action Lawyers Group, the Race Equality Foundation and the Runnymede Trust.

UK Biometrics Commissioner Professor Paul Wiles said legislation to govern the technology was urgently needed.
