Maria Miller MP has been granted an Adjournment debate in the House of Commons on Thursday 2 December and will be calling on the Government to criminalise the making and sharing of non-consensual ‘deepfake’ pornographic images and the use of ‘nudification’ software to create non-consensual nude images of women.
‘Deepfakes’ are pornographic images created by merging existing pornographic content with the image of an individual, usually a woman, who has not given their consent. The resulting pornographic material is often violent, including illegal depictions of rape. ‘Nudification’ software takes everyday images of women and, without their consent, uses an extensive database of pornographic images to create a new image that makes it appear as if the original subject of the photo is nude.
Maria Miller MP said:
“The decision to create and share a deepfake or a nudified image is a highly sexualised act, undertaken without the consent of the person involved, most often a woman. The impact on the victim can be devastating, with images difficult to remove once posted online.
“I want to see the law changed to make it a sex crime to create and distribute deepfake images and also to use nudification software to create nude images.
“Deepfake and nudification software are yet more ways in which women can suffer online sexual abuse. Women in this country have faced a growing problem of image-based sexual abuse over the past decade, and the scale of the problem is increasing, with reports of online image-based abuse rising by 87% in 2020.
“Deepfakes and nudified images are another vivid form of violence against women online. Let’s be clear: this technology is used non-consensually and almost exclusively against women. Offline, non-consensual sexual acts are recognised in law through the crimes of sexual assault, sexual abuse and rape. The Government has to put in place laws that recognise that technology and artificial intelligence are being used to inflict sexual attacks and violence on women and girls. We cannot allow the online world to become a continuation of the offline world, where women and girls experience yet further new forms of sexual abuse and violence. This is why we need a new law to criminalise the taking, making and sharing of nude and sexual images without consent, and this includes deepfakes and nudification.”
Maria Miller MP has long advocated for updating the law on image-based abuse, having successfully campaigned to outlaw ‘revenge pornography’ in 2015 after being contacted by a constituent who was a victim of it. Maria is also running a campaign with Grazia UK calling for cyberflashing (indecent exposure online) to be criminalised.
Since emerging in late 2017, the phenomenon of deepfakes has developed rapidly, both in terms of technological sophistication and societal impact. Research groups such as Sensity AI have observed that while early deepfakes clearly appeared artificial, technological advances mean that manipulated images are now increasingly realistic and difficult to distinguish from reality.
The AI research group Deeptrace has found that the total number of deepfake videos online is rising rapidly, almost doubling from 2018 to 2019. Its research also found that 96% of deepfakes are pornographic in nature and that these exclusively target women.
The technology has also become more easily available to the public in recent years through ‘nudification’ apps such as DeepSukebe, which received five million visits in June 2021 alone. The service allows users to ‘undress’ women in photos using artificial intelligence (AI) and has spread rapidly on social media.
Deepfakes are widely regarded by academics as the future of violence against women online, yet the existing law is largely redundant: the Criminal Justice and Courts Act 2015 (the so-called ‘revenge porn’ legislation) specifically excludes altered or photoshopped images and videos.
Devastating lives: real-life experiences of deepfake image abuse
The taking, making and sharing of intimate images without consent devastates lives. In 2020 alone, cases reported to the Revenge Porn Helpline increased by 87%. In 2021, cases rose by a further 24% on top of the increases seen in previous years.
In 2019, Helen discovered that non-sexual images of her had been uploaded to a porn website, where users of the site were invited to merge Helen's face with explicit, violent and illegal sexual images and videos. Helen was only alerted to the existence of the photos by an acquaintance, after years of their circulation online.
The original images were taken from her social media, including photos from her pregnancy. She said some images were clearly manipulated, but others were "chilling", and she still experiences nightmares about them. Speaking of the experience, Helen said:
"Obviously, the underlying feeling was shock and actually I initially felt quite ashamed, as if I'd done something wrong. That was quite a difficult thing to overcome. And then for a while I got incredibly anxious about even leaving the house."
Helen alerted the police to the images but was told that no action could be taken.
Alana [not her real name] also had faked images of her circulated widely without her consent:
“It has the power to ruin your life, and is an absolute nightmare, and it is such a level of violation that is exemplified because you are violated not only by the perpetrator but you also feel violated because society doesn't recognise your harm. It is a very isolating experience, it is a very degrading and demeaning experience, it is life-ruining.”