Survey finds low public faith in police use of algorithms

Only around one in ten UK adults have faith in the computer algorithms used by police, a new survey has revealed.

Sep 9, 2020
By Tony Thompson

More than half (53 per cent) said they had no faith in any organisation using algorithms to make judgments about them, on issues ranging from education to welfare, according to the poll conducted by YouGov for BCS, The Chartered Institute for IT.

Just 11 per cent of adults had faith in how the police and technology companies such as Apple and Google used algorithms to make decisions about them personally.

The survey was conducted in the wake of the UK exams crisis, in which an algorithm used to assign grades was scrapped in favour of teachers’ predictions.

Just seven per cent of respondents trusted the education sector’s use of algorithms – joint lowest with social services and the Armed Forces. Confidence in the use of algorithms in education also differed dramatically between age groups: 16 per cent of 18 to 24-year-olds trusted their use, compared with only five per cent of over-55s.

Trust in social media companies’ algorithms to serve content and direct user experience was similarly low, at eight per cent. Automated decision-making had the highest trust when it came to the NHS (17 per cent), followed by financial services (16 per cent) and intelligence agencies (12 per cent), reflecting areas like medical diagnosis, credit scoring and national security.

Older people were also less trusting of the general use of algorithms in public life, with 63 per cent of over-55s saying they felt negative about this, compared with 42 per cent of 18 to 24-year-olds. Attitudes to computerised decisions in the NHS, private health care and local councils likewise differed strongly by age: 30 per cent of 18 to 24-year-olds said they trusted the use of algorithms in these sectors, compared with 14 per cent of over-55s.

More than 2,000 people responded to the survey. All were shown a description of algorithms before answering any questions.

Dr Bill Mitchell, Director of Policy at BCS, said: “People don’t trust algorithms to do the right thing by them – but there is little understanding of how deeply they are embedded in our everyday life.

“People get that Netflix and the like use algorithms to offer up film choices, but they might not realise that, more and more, algorithms decide whether we’ll be offered a job interview, are used by our employers to decide whether we’re working hard enough, or even flag us as a suspicious person needing to be monitored by security services.

“The problem government and business face is balancing people’s expectations of instant decisions, on something like credit for a sofa, with fairness and accounting for the individual when it comes to life-changing moments like receiving exam grades.

“That’s why we need a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgments about people’s lives, and a better understanding of artificial intelligence and algorithms by the policymakers who give them sign-off.”
