Sinister development?
The Biometrics and Surveillance Camera Commissioner Professor Fraser Sampson reflects on the publication of the College of Policing APP on live facial recognition.
Whether it is in our streets, supermarkets or (heaven forfend) our schools, how to deal with live facial recognition (LFR) is the surveillance question that won’t go away.
I was therefore pleased to see the publication of the College of Policing Authorised Professional Practice (APP) on Live Facial Recognition, which sets out a commitment to ‘lawful and ethical’ use of this technology.
Being guided by lawful and ethical considerations will be critical if we are to address, for example, the horrifying prospect of state-owned surveillance companies supplying our police and schools with the same facial recognition technology that they use to perpetuate genocide and human rights atrocities in other parts of the world.
I do however have some concerns and questions about the published APP. For example:
- The apparent intention to use LFR technology to find ‘potential witnesses’ is not the digital equivalent of placing a triangle board on the street asking anyone passing whether they saw anything at a given time and date that they would like to share with the police. Generally speaking, a police witness is someone who has indicated their willingness to take part in the criminal justice process – in which case you do not need a camera to identify them for you; you already know who they are (and, if you do not, why would you have a ‘library’ image of them to compare against a crowd when searching for them?). If this envisages tracking people, approaching them to confirm whether they were at a certain place on a certain date, and then ‘inviting’ them to disclose what they heard and saw solely because someone’s surveillance system thinks they were present, that is a new and somewhat sinister development, one which potentially treats everyone as walk-on extras on a police film set rather than as individual citizens free to travel, meet and talk. I think the speculative use of LFR in this way would call its legitimacy and proportionality into question. I can understand that there may be some exceptional, very high-harm events, such as terrorist attacks or natural disasters, where retrospective facial recognition might legitimately make a significant contribution to an understanding of what happened, but those events would be mercifully rare and wholly exceptional. Making effective provision for exceptional events calls for very careful drafting if the exception to the rule is not to become a catch-all boilerplate clause covering every unspecified eventuality.
- The terminology and definitions of the different types of biometric and forensic search methods raise further questions. For example, LFR and retrospective facial recognition invite questions about the relevant training, certification and accreditation standards. What is the fundamental difference between an LFR search, a mass screening and a forensic database search? Are these to be clarified with the new Forensic Science Regulator? This goes beyond a glossary and is important to public understanding of the APP and its wider implications.
- Representative testing methodologies, for example the ‘Blue Watchlist’, also raise concerns. A major and enduring challenge for British policing is that minority ethnic populations continue to be under-represented in policing. Using existing personnel to test the LFR system therefore already runs the risk of introducing imbalance and an increased risk of demographic differentials, not just in the software development but also in the human adjudication process.
- LFR and counter-terrorism – while not mentioned specifically, the alignment of LFR with the principles and standards set out in the UN Compendium needs to be clarified. Jean Charles de Menezes was tragically shot dead by counter-terrorism police in London because he had been facially misidentified by a surveillance officer. If we were to rely on LFR in such extreme circumstances in the future, what are the safeguards? Is there a case for deployments of LFR to require judicial approval, as is the case for other types of surveillance, rather than the authorisation of a senior police officer? What about the exchange of image templates from LFR across jurisdictions, for example where the technology is used for journeys via the Channel Tunnel? Perhaps the DCMS (Department for Digital, Culture, Media and Sport) consultation on the structure for biometric surveillance oversight and regulation should address this.
- The focus of the APP is data-rights driven, whereas the overall direction in police surveillance, coupled with acute public sensitivity to some of the technology, extends far beyond keeping data safe. Rather than treating this area as purely a matter of ‘data rights’ compliance, the framework for maintaining public trust and confidence in police surveillance should focus more on the much wider impact on society. For example, the ‘chilling effect’ of biometric surveillance by the police has been well documented, both in academic research and in the courts – if people decide not to travel, not to meet, not even to talk openly because of concerns that where they go and what they do and say are being monitored by the police, that is a fundamental constitutional consequence of intrusive policing activity; and it has nothing to do with data protection. Perhaps the DCMS consultation should address this too.
In summary – in moving from a standard police operating model of humans looking for other humans in a crowd to the automated industrialised process of LFR (as some have characterised it, a move from line fishing to deep ocean trawling), how commonplace will it become to be stopped in our cities, transport hubs, outside arenas or school grounds and required to prove our identity? The ramifications for our constitutional freedoms in that future are profound. Is the status of the UK citizen shifting from our jealously guarded presumption of innocence to that of ‘suspected until we have proved our identity to the satisfaction of the examining officer’? If so, that will require more than an APP from the College of Policing: it will require parliamentary debate.
I am keen to continue open, informed dialogue with stakeholders who have an interest in this area, from the avid supporters to the anti-surveillance campaigners and everyone in between. The proper role of technology in surveillance calls for balance: not only of what is possible against what is lawful, but increasingly of both against what we find acceptable or even tolerable. Societal acceptability is the ground on which the accountable, ethical and legitimate use of surveillance technology is being shaped. That again is surely a matter for Parliament.
To achieve a greater understanding of the societal acceptability of facial recognition technology used by the police, my office is planning to put ‘Facial Recognition on Trial’. Organised in conjunction with Professor William Webster (Centre for Research into Information, Surveillance and Privacy), the event will contribute to a key objective under the Civil Engagement strand of the National Surveillance Camera Strategy. It will take place before a live audience and will imitate a court trial, with evidence provided by expert witnesses and members of the public acting as a jury. The mock trial will be held on 14 June at the London School of Economics, with tickets available to book soon.
My website will continue to be updated as further details emerge.
This blog first appeared on the website of the Biometrics and Surveillance Camera Commissioner – https://www.gov.uk/government/news/the-biometrics-and-surveillance-camera-commissioners-response-to-the-college-of-policing-app-on-live-facial-recognition