Online Safety Bill must ‘go further’ to stop child sexual abuse, says NSPCC
Ahead of today’s second reading of the ‘historic’ Online Safety Bill, the NSPCC is urging the Government to “go further” to better tackle grooming and child abuse.
Analysis by the children’s charity shows that reports of online child sexual abuse to police have risen by more than a quarter (27 per cent) since 2018 when ministers first promised to bring in new laws to protect children on social media.
While its new report, Time to Act, published on Friday (April 15) “commends the Government for bringing forward this legislation”, the NSPCC says “key changes” are needed to convert “ministers’ ambition to protect children into a world-leading response to the online child sexual abuse threat”.
In particular, it wants to see a clampdown on so-called ‘breadcrumbing’, where offenders use social media to form networks, advertise a sexual interest in children and signpost to child abuse content hosted on other sites.
The report also calls for greater scrutiny of private messaging, by giving regulator Ofcom the power to compel firms to use technology that can identify child abuse and grooming in private messages, such as PhotoDNA, which is already widely used by major sites.
And the NSPCC says more should be done to tackle cross-platform abuse and disrupt how offenders groom children across multiple social media sites and games. One way, it suggests, is to require companies by law to cooperate with each other to disrupt grooming.
The NSPCC says amending the Bill to fight the ways offenders facilitate abuse online could “prevent millions of interactions with accounts that contribute to child sexual abuse”.
Peter Wanless, chief executive officer of the NSPCC, said: “This historic Online Safety Bill can finally force tech companies to systemically protect children from avoidable harm.
“With online child abuse crimes at record levels and the public rightly demanding action, it’s crucial this piece of legislation stands as a key pillar of the child protection system for decades to come.
“The NSPCC report sets out simple but targeted solutions for the Bill to be improved to stop preventable child sexual abuse and to finally call time on years of industry failure.”
The Online Safety Bill will require social media platforms, search engines and other apps and websites allowing people to post content to improve the way they protect their users.
Among the new measures included in the Bill are tougher and quicker criminal sanctions for technology bosses who are deemed to have failed to protect users from harmful material.
Another new requirement will mean companies must report child sexual exploitation and abuse content they detect on their platforms to the National Crime Agency.
The regulator Ofcom will have the power to fine companies failing to comply with the laws up to ten per cent of their annual global turnover, force them to improve their practices and block non-compliant sites. Crucially, the laws have strong measures to safeguard children from harmful content such as pornography and child sexual abuse, says the Government.
The Internet Watch Foundation has described the Bill as a “once in a generation chance to make sure children are not collateral damage to the growing power of the internet”.
Digital Secretary Nadine Dorries said: “The time has come to properly protect people online and this week MPs will debate the most important legislation in the internet age.
“Our groundbreaking Online Safety Bill will make the UK the safest place to surf the web. It has been significantly strengthened following a lengthy period of engagement with people in politics, wider society and industry.”
The Football Association (FA) says the legislation is a “promising first step towards creating a new era of accountability online”.
In a statement ahead of Tuesday’s (April 19) second reading of the Bill by MPs, it said: “Online abuse is a serious issue in football, from the grassroots and throughout the professional game, and we commend the Government for bringing forward this world-first legislation to create a safer online environment and hold social media companies to account.
“We are pleased that the Government has listened to our views and strengthened the Bill in a number of important areas, including:
- The designation of hate crime as priority illegal content on the face of the Bill, which means that platforms have an active duty to minimise exposure to such content; and
- The new anonymity provisions, which mean that ID verification must be offered as an option and users will have greater control over who can contact them and what they see online.
“We are also pleased that the Government has accepted the Law Commission’s recommendations to reform the communications offences to include threatening and harassing behaviour online more clearly.”
The FA added: “We are pleased that the Government recognises the importance of transparency and would urge ministers to ensure that the Bill sets out minimum levels and categories of information that will need to be provided each year as part of the transparency reporting requirements; and that Ofcom has the ability to share reporting information.
“Although hate and discrimination are not currently the subject of a specific code of practice, we are keen to work closely with the Government and Ofcom on this very important issue. We want to ensure that experiences and voices of victims of online abuse provide critical insight and influence the creation of Ofcom’s guidelines.
“We would like to thank the Government for engaging with English football during the drafting of this important piece of legislation and we look forward to working with them closely as the Bill progresses through Parliament over the coming months.”
The Government has also launched the next phase of its Online Media Literacy Strategy. It aims to help vulnerable and ‘hard-to-reach’ people, such as those who are digitally excluded or from lower socio-economic backgrounds, navigate the internet safely and teach them to “spot falsities online”.
Funding of £2.5 million from the Department for Digital, Culture, Media and Sport will be used to provide training, research and expert advice. This includes the creation of a new Media Literacy Taskforce with 18 experts from a range of relevant organisations, including Meta, TikTok, Google, Twitter, Ofcom and the Telegraph, as well as universities and charities.
“We want to arm everyone with the skills to navigate the internet safely,” said Ms Dorries.