Police machine learning may have ‘unintended consequences that are difficult to anticipate’

A new regulatory framework is needed to overcome the “ethical challenges” police machine learning technology faces, according to a Royal United Services Institute (RUSI) study

Sep 24, 2018
By Serena Lander
Alexander Babuta - RUSI

The defence and security think tank said the “lack of clear guidance and codes of practice” governing how police forces should trial predictive algorithmic tools must be addressed as a “matter of urgency”.

Machine Learning Algorithms and Police Decision-Making: Legal, Ethical and Regulatory Challenges, published on Friday (September 21) by RUSI in association with the University of Winchester, explains that, although such use remains limited, police forces are beginning to use machine learning algorithms to predict individuals’ likelihood of future criminal behaviour.

However, due to the lack of guidance on the subject, the report argues that police cannot trial the technology with confidence that it complies with data protection legislation and respects human rights.

This leaves room for “unintended or indirect consequences that are difficult to anticipate”, it says. 

The report is based mostly on proceedings of a half-day conference and a series of focus groups held at the University. 

Machine learning algorithms require continuous attention to guard against bias. This is made harder because officers do not yet have the skill sets to understand, deploy and interpret machine learning, so sufficient training is necessary to ensure the technology is used ethically.

Over the last ten years, predictive policing algorithms have been used to identify the geospatial locations most at risk of experiencing crime, helping forces deploy resources pre-emptively to where they are most needed.
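
As a toy illustration only (not any force’s actual system), the Python sketch below shows the simplest form of this idea: binning hypothetical historical incidents into grid cells and ranking the cells by count to prioritise patrols. Real tools use far richer spatio-temporal models.

```python
# Minimal hypothetical sketch: naive crime hotspot ranking.
from collections import Counter

# Hypothetical historical incidents as (x, y) coordinates in kilometres.
incidents = [(1.2, 3.4), (1.3, 3.5), (1.1, 3.6), (7.8, 2.1), (7.9, 2.0), (4.5, 4.5)]

CELL_SIZE_KM = 1.0  # side length of each grid cell

def cell_of(x: float, y: float) -> tuple[int, int]:
    """Map a coordinate to its grid cell index."""
    return (int(x // CELL_SIZE_KM), int(y // CELL_SIZE_KM))

# Count historical incidents per grid cell.
counts = Counter(cell_of(x, y) for x, y in incidents)

# Rank cells by incident count and flag the top two for patrol.
for cell, count in counts.most_common(2):
    print(f"cell {cell}: {count} incidents -> prioritise patrols")
```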

However, the report focuses on the more recent development of algorithmic decision-making relating to individuals.

Durham Constabulary is to become the first police force in the UK to deploy algorithms to assess the risk of individuals reoffending.

The technology has been an area of controversy for some time, particularly with respect to racial discrimination. While existing models do not factor in race directly, they do include postcodes, which can function as a proxy variable for race or community deprivation. This may have an indirect and undue influence on the prediction outcome, RUSI claims.
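
The proxy mechanism is easy to demonstrate. The following sketch uses entirely synthetic, hypothetical data (it is not drawn from the report or any deployed tool): a classifier trained without a protected attribute still produces sharply different risk scores across groups, because a correlated postcode feature lets it reconstruct that information.

```python
# Hypothetical sketch: a postcode feature acting as a proxy for a protected
# attribute that was deliberately excluded from training. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical protected attribute (e.g. membership of a deprived community).
protected = rng.integers(0, 2, size=n)

# Postcode area is strongly correlated with the protected attribute:
# group 1 mostly lives in areas 0-4, group 0 mostly in areas 5-9.
postcode_area = np.where(
    protected == 1,
    rng.integers(0, 5, size=n),
    rng.integers(5, 10, size=n),
)

# A legitimate predictive feature (e.g. number of prior offences).
priors = rng.poisson(1.5, size=n)

# Suppose historical outcomes were biased against the protected group.
p = 1 / (1 + np.exp(-(0.4 * priors + 1.2 * protected - 1.5)))
reoffended = rng.binomial(1, p)

# Train WITHOUT the protected attribute: only postcode area and priors.
X = np.column_stack([postcode_area, priors])
model = LogisticRegression().fit(X, reoffended)

# Predicted risk still differs sharply by group, because the postcode
# feature lets the model reconstruct the protected attribute.
risk = model.predict_proba(X)[:, 1]
print(f"mean predicted risk, group 1: {risk[protected == 1].mean():.3f}")
print(f"mean predicted risk, group 0: {risk[protected == 0].mean():.3f}")
```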

The Metropolitan Police Service’s ‘Gang Matrix’ came under fire from Amnesty International for being racially discriminatory and “counterproductive”.  

However, the report suggests there is an argument that authorities have a social obligation to trial new technological methods, given that their “duty is to prevent crime, arrest offenders and protect victims”.

Consequently, it “is essential that such experimental innovation is conducted within the bounds of a clear policy framework, and that there are sufficient regulatory and oversight mechanisms in place to ensure fair and legal use of technologies within a live policing environment”.

The report put forward a number of recommendations. For instance, it called for the Home Office to develop codes of practice outlining constraints which would govern how police forces should trial predictive policing tools. This must be done before any large-scale deployment occurs.  

Other recommendations include: 

  • The College of Policing should develop guidance within the Authorised Professional Practice with respect to the deployment of a machine learning algorithm within a decision-making process. This should include guidance on how police forces should present algorithmic predictions to those about whom the prediction is made. A clear process for resolving disagreements when professional judgement and the algorithm come to different conclusions should also be established within this guidance.  
  • The inspection role of Her Majesty’s Inspectorate of Constabulary and Fire and Rescue Services (HMICFRS) should be expanded to include assessment of forces’ compliance with the above-mentioned new guidance. This will provide an accountability mechanism to ensure police forces are developing new tools in accordance with relevant legislation and ethical principles.  
  • Further research is needed to determine how the introduction of algorithmic tools influences police officer behaviour and the decision-making process as a whole. 

Marion Oswald from the University of Winchester and Alexander Babuta from RUSI will be presenting their findings in detail at the Excellence in Policing Conference this week.
