Artificial intelligence software spots new child sexual abuse media online

New artificial intelligence software designed to spot child sexual abuse media online could help police catch child abusers. It automatically detects new child sexual abuse photos and videos in online peer-to-peer networks.

Mar 1, 2017
By Paul Jacques

Researchers say spotting newly produced media online could give law enforcement agencies the fresh evidence they need to find and prosecute offenders – but the sheer volume of activity on peer-to-peer networks makes manual detection virtually impossible.

There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year. Research shows the people who produce child sexual abuse media are often abusers themselves – the US National Center for Missing and Exploited Children found that 16 per cent of those who possessed such media had directly and physically abused children.

A report issued last November by the National Police Chiefs’ Council – Online child sexual abuse images: tackling demand and supply – said that around half a million men in the UK today may have viewed child sexual abuse images online, a figure far larger than previous estimates.

Figures from the children’s charity the NSPCC show that the number of police-recorded offences for obscene publications in the UK has more than doubled in the past five years, reaching 8,745 in 2015. But chief executive Peter Wanless says the charity does not know just how big the problem is or how many children are affected, because the data only shows “how many images have been found, or how many offenders have been caught”.

Three-quarters (650) of contacts to the NSPCC helpline about online issues in 2015/16 related to online sexual abuse, and 41 per cent of these contacts were serious enough to result in a referral to an external agency.

The NSPCC says it has seen a 250 per cent increase in Childline counselling sessions about online sexual abuse in the past three years.

The new software will automatically identify new or previously unknown child sexual abuse media using artificial intelligence.

“Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse,” explained Claudia Peersman of Lancaster University’s School of Computing and Communications, lead author of the study published in Digital Investigation.

“And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse.”

The research behind the technology was conducted as part of the international research project iCOP – Identifying and Catching Originators in P2P Networks – funded by the European Commission’s Safer Internet Programme and carried out by researchers at Lancaster University, the German Research Center for Artificial Intelligence (DFKI) and University College Cork, Ireland.

There are already a number of tools available to help law enforcement agencies monitor peer-to-peer networks for child sexual abuse media, but they usually rely on identifying known media. As a result, these tools are unable to assess the thousands of results they retrieve and cannot spot new media that appear.

The iCOP toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media. The new approach combines automatic filename and media analysis techniques in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
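
The article does not detail the implementation, but the approach it describes – a filename signal and a media-content signal feeding a single filtering decision – can be sketched in outline. Everything in the Python sketch below (the class, the keyword list, the weights and the threshold) is a hypothetical stand-in for illustration, not iCOP’s actual design.

```python
# Minimal sketch of a two-signal filtering module, assuming a fusion step
# like the one the article describes at a high level. All names, weights
# and the keyword list are hypothetical illustrations.
from dataclasses import dataclass


@dataclass
class CandidateFile:
    filename: str
    media_score: float  # assumed output of a trained media classifier, in [0, 1]


# Placeholder vocabulary standing in for a learned filename model.
WATCHLIST_TERMS = {"exampleterm1", "exampleterm2"}


def filename_score(filename: str) -> float:
    """Crude stand-in for filename analysis: the fraction of filename
    tokens that appear on a watch-list vocabulary."""
    tokens = filename.lower().replace("-", " ").replace("_", " ").split()
    if not tokens:
        return 0.0
    return sum(token in WATCHLIST_TERMS for token in tokens) / len(tokens)


def flag_candidate(item: CandidateFile,
                   name_weight: float = 0.4,
                   media_weight: float = 0.6,
                   threshold: float = 0.5) -> bool:
    """Fuse the two signals into one filtering decision. The weights and
    threshold are assumptions chosen only for this example."""
    combined = (name_weight * filename_score(item.filename)
                + media_weight * item.media_score)
    return combined >= threshold
```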

The researchers tested iCOP on real-life cases and law enforcement officers trialled the toolkit. It was highly accurate, with a false positive rate of only 7.9 per cent for images and 4.3 per cent for videos. It was also complementary to the systems and workflows already being used. And since the system can reveal who is sharing known child sexual abuse media, and show other files shared by those people, it will be highly relevant and useful to police.
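
For context, a false positive rate is the share of innocuous files a system wrongly flags. The article gives the rates but not the underlying counts, so the numbers in this short Python illustration are invented purely to show how such a figure is calculated.

```python
def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    """FPR = FP / (FP + TN): the proportion of benign items wrongly flagged."""
    return false_positives / (false_positives + true_negatives)


# Invented counts that happen to reproduce the reported 7.9 per cent image
# rate: 79 benign images wrongly flagged out of 1,000 benign images scanned.
print(false_positive_rate(79, 921))  # 0.079
```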
