Accountability in action
Advances in the technological capabilities of the police need to be matched by a corresponding increase in transparency, legitimacy and trust, says Professor Fraser Sampson.
In his report to Parliament on the Brixton Disorders of 1981, Lord Scarman said that “discretion is the art of suiting action to circumstance; it is the police [officer’s] daily task”. While many things have, thankfully, changed beyond recognition in policing since those days, one thing that has remained the same is that we continue to live with contradiction, and Lord Scarman’s judicial observation is as apt today, some 40 years on.
The layers and levers of accountability under which the police must operate have matured considerably since the 1980s. Whether in the context of surveillance cameras, the retention of DNA profiles or the use of powers to stop and search citizens, these developments mean the police are overseen by a whole range of regulatory, oversight and inspection bodies, and we arguably now have one of the most accountable police services in the world.
Developments in biometrics technology – a term I will take here to include facial recognition capabilities – have moved much faster than those accountability structures, and mean the police will increasingly have access to cheaper, faster and potentially more reliable tools and techniques. Those tools and techniques will probably use fewer resources and leave the police better equipped to act promptly, reliably and at previously impossible scale – not just in the ‘law enforcement’ element of their role but also in protecting the vulnerable, locating the missing and monitoring the dangerous. There are, however, as many concerns about ensuring that the use of these new capabilities is ethical and acceptable as there are calls for them to be exploited.
Globally there are some widely recorded misgivings about police use of technology in this area. Some algorithms used in policing for different purposes across various jurisdictions have been roundly discredited – both in their design and their deployment – and it is probably true that this has eroded public trust: perhaps because they were not sufficiently validated in advance, perhaps because they jumped the gun, and almost certainly because they were not properly explained, consulted upon and used under clearly and carefully defined policies.
The response in some countries has been to ban the use of certain technologies altogether, and others are considering denying the police access to new techniques. But singling out the technology is probably too simplistic and misses the point, while demonising new methodology is simply irrational. Here is why.
As citizens in the UK we enjoy a range of clearly described human rights and fundamental freedoms, most of which carry with them obligations and qualifications. It is beyond doubt that the police must respect and uphold the individual human rights of those they police and from whom, under our venerated and cherished policing model, they derive their legitimacy. Their duty to uphold the rights and legitimate expectations of the citizen, however, often creates contradictory choices and competing demands which must be balanced in light of all the circumstances of each particular case.
While the human rights arguments against some surveillance practices are well rehearsed and well founded, there is also a legitimate expectation that the police will use all reasonably available resources and measures to protect citizens from degrading and inhumane treatment or, ultimately, death. That legitimate expectation is not one of outcome but one of means. And those means are undergoing a metamorphosis in which new technological capabilities will make some ‘old school’ techniques of investigation and prevention not just indefensible but obsolete.
The technical and legal challenges can also pull against each other. Since taking up this role I have met those who want the law changed to ban the use of automated decision-making within police facial databases; at the same time there are ongoing concerns about custody images of unconvicted people being held on police databases, a practice condemned in the High Court as long ago as 2012. The fact that effective deletion of those custody images requires automated decision-making shows how technical complexity and public trust may pull in different directions simultaneously. It also illustrates how hard-and-fast rules can produce unforeseen consequences. How do you legislate for this? One way is by creating a governance and accountability framework that is clear enough to establish relevant standards and regulatory requirements, yet flexible enough to allow the suiting of action to circumstance.
A publicly accountable police service that must balance competing individual and public interests is a hallmark of democracy so, to that extent, this is nothing new. What is new is the scope of technological capability and its potential impact on areas of accountability, legitimacy and trust. And this is where the technical, legal and societal considerations meet in a way that has yet to be fully explored. Yes, we could lobby to ban certain types of technology as has happened in other places. However, professional discretion in policing is a corollary of operational independence, something that is jealously protected within international law. In terms of accountability, professional discretion means being as accountable for a decision not to use an available tactical option as you are for your decision to use one.
Denying the police access to technology not only encroaches on their operational primacy, it also dilutes police accountability: whoever imposes the denial assumes responsibility for the judgement that never using that technology is the right decision in every given operational setting. As technical capability increases – for the police and for criminal actors alike – anyone presuming that they are best placed to make such a call, and to accept the obvious risks of being wrong, will need to be very sure of their ground.
People must be able to have confidence that the relevant technology does what it is supposed to, but that means confidence in the whole ecosystem that uses surveillance cameras and biometrics, not simply in novel offshoots of it. More practically, it also means having equal confidence that the operators of those systems and tools are doing what they are supposed to do too; it means understanding the purposes for which the technology is being used, who authorised it and how they came to their decision that it was lawful and proportionate to do so in each case. And it means having clearly defined, published, accessible and intelligible policies publicly setting out the parameters – policies that will be regularly reviewed in light of experience.
True accountability – or answerability, as others have termed it – in policing means the citizen is able to hold decision-makers to account, either directly or through their democratically elected local policing bodies, for their actions, policies, practices and principles. And, returning to Lord Scarman, to hold decision-makers to account for the exercise of their professional discretion. This is not a soft or easy option, and accountability in this sense does not mean bureaucratic blaming; it should not mean waiting years for a tepid report or a vapid apology. It should mean, in the old parlance of policing, someone gripping if not the rail then at least the issue: taking responsibility for what they are going to do and for whatever meaningful consequences may follow once they have done it.
As the Committee on Standards in Public Life has recently emphasised, all our public services must meet high standards of ethical and lawful practice and those standards are as relevant to the use of artificial intelligence and technology as they are to other areas of complex public service delivery. Reinforcing and renewing the existing governance frameworks is key to balancing the competing issues and risks of the future.
We can argue about the law and we can race ahead with the science but societal acceptability is the ground where the ethical use of surveillance and biometric technology in policing is already being shaped. In contributing to that shaping we would do well to resist placing too much reliance on technology alone, to encourage meaningful public debate and thoughtful leadership, and to reinforce the role of independent oversight and what others have called layered co-governance.
In this way we may have a better chance of ensuring that the advances in technological capability are matched by a corresponding increase in transparency, legitimacy and trust.
I opened with Lord Scarman and will close with another venerated judge, this time from the US. Billings Learned Hand observed: “Life is made up of a series of judgments on insufficient data – if we waited to run down all our doubts it would flow past us.”
There are many areas in which the police cannot allow life to flow past them while they run down all their doubts – terrorism and national security, for example, or serious and organised crime and major civil contingencies such as pandemics. Every day the police and their partners face situations that call for judgments on insufficient data, and must choose from a range of available proportionate responses in “suiting action to circumstance”.
Technology is expanding that range of responses at dizzying speed and our collective role is surely to support critically, challenge constructively and hold each other to account – conspicuously and meaningfully – against the high standards we have come to expect and deserve.
Professor Fraser Sampson is the Commissioner for the Retention and Use of Biometric Material and Surveillance Camera Commissioner. He has more than 40 years’ experience working in the criminal justice sector, having served as a police officer for 19 years before becoming a solicitor specialising in policing law, conduct and governance. He has a PhD in digital accountability in law enforcement from Sheffield Hallam University.