Government crackdown on sexually explicit ‘deepfakes’
Predators who create sexually explicit ‘deepfakes’ could face prosecution as part of government plans to protect women and girls.
While it is already illegal to create such images of a child under the age of 18, a new offence will now cover deepfakes of adults.
It comes as the Ministry of Justice says the “proliferation of these hyper-realistic images has grown at an alarming rate, causing devastating harm to victims, particularly women and girls who are often the target”.
A new offence means perpetrators could be charged for both creating and sharing these images.
The Government will also create new offences for the taking of intimate images without consent and the installation of equipment with intent to commit these offences, with perpetrators facing up to two years in jail.
While it is already an offence to share – or threaten to share – an intimate image without consent, it is only an offence to take an image without consent in certain circumstances, such as upskirting.
Victims Minister Alex Davies-Jones said: “It is unacceptable that one in three women have been victims of online abuse. This demeaning and disgusting form of chauvinism must not become normalised, and as part of our Plan for Change we are bearing down on violence against women – whatever form it takes.
“These new offences will help prevent people being victimised online. We are putting offenders on notice – they will face the full force of the law.”
The new offences will be included in the Government’s Crime and Policing Bill, which it says will be introduced “when parliamentary time allows”.
Baroness Jones, Technology Minister, said: “The rise of intimate image abuse is a horrifying trend that exploits victims and perpetuates a toxic online culture. These acts are not just cowardly, they are deeply damaging, particularly for women and girls who are disproportionately targeted.
“With these new measures, we’re sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal. Tech companies need to step up too – platforms hosting this content will face tougher scrutiny and significant penalties.”
Campaigner and presenter Jess Davies added: “Intimate-image abuse is a national emergency that is causing significant, long-lasting harm to women and girls who face a total loss of control over their digital footprint, at the hands of online misogyny.
“Women should not have to accept sexual harassment and abuse as a normal part of their online lives; we need urgent action and legislation to better protect women and girls from the mammoth scale of misogyny they are experiencing online.”
The Association of Police and Crime Commissioners’ joint leads for victims have welcomed the Government’s plans to make the creation of sexually explicit ‘deepfake’ images of adults a criminal offence, along with the introduction of new offences of taking an intimate image without consent and installing equipment with the intent to commit these offences.
Police and crime commissioners (PCCs) Lisa Townsend and Clare Moody said: “We have seen a proliferation of hyper-realistic sexually explicit ‘deepfake’ images created using AI and shared to cause humiliation, distress and alarm to victims. It is crucial that the law keeps up with technology, and these offences will help tackle new methods perpetrators are using to target their victims.
“The scale of sexual offending against women and girls is vast and it is vital we do all we can to confront and combat it, whilst supporting those who become victims.
“PCCs are committed to providing high-quality services that help victims and survivors who need care and support, but we are clear that society as a whole has to recognise the need to tackle a culture that enables violence against women and girls, to make it unacceptable, and to punish those who abuse women whether online or in real life.”
These new offences follow the Government’s action in September last year to add sharing intimate image offences as priority offences under the Online Safety Act. This put the onus on platforms to root out and remove this type of content – or face enforcement action from Ofcom.