UK Law Targets AI Deepfake Creators
The UK government has announced a new law aimed at curbing the creation of AI-generated sexually explicit deepfake images. The legislation, still pending approval, would expose offenders to an unlimited fine even if they never share the images, so long as the images were created to cause alarm or distress to the victim. The move is part of a broader effort to strengthen legal protections for women amid growing concern over deepfake technology.
Deepfake technology, which emerged around 2017, allows individuals to create misleading pornographic content by swapping faces in videos without the subjects’ consent. This issue is not confined to the UK. In March, deepfake nudes of female middle school students in Florida led to charges against two boys. The proliferation of open-source image synthesis models since 2022 has prompted regulators in the US and the UK to take action against the creation and dissemination of non-consensual deepfakes.
The UK Ministry of Justice stated that the new offence would criminalise the creation of sexually explicit deepfake images without consent, even if the creator does not intend to share them but aims to cause alarm, humiliation, or distress to the victim. Offenders could face a criminal record and an unlimited fine, and if the images are shared more widely, they could be sentenced to jail. This law builds on last year’s Online Safety Act, which criminalised the sharing of non-consensual deepfake images. The proposed legislation marks the first time that creating such images of adults would be illegal in the UK; existing laws already cover the creation of sexual deepfakes involving children.
The government is also reinforcing existing laws so that both the creation and the distribution of deepfake content can be charged, potentially leading to harsher penalties. Minister for Victims and Safeguarding Laura Farris emphasised the government’s stance, describing the creation of deepfake sexual images as despicable and unacceptable, irrespective of whether the image is shared. She stated that the new offence sends a clear message that making such material is immoral, often misogynistic, and a crime.
The potential for misuse of deepfake technology is vast. A notable early example is a video that appeared to show Gal Gadot engaging in sexual acts. The video was created with a machine learning algorithm, using publicly available materials and open-source code. Although the result is not perfect, it is convincing enough to disturb viewers and demonstrates the potential for harm. The person behind it, a Redditor known as “deepfakes,” used open-source machine learning tools, highlighting how easily such content can be produced.
Fake celebrity porn, in which images are photoshopped to make it appear as if famous people are posing nude, has been around for years. Deepfake technology, however, represents a significant advance in this genre, making it trivially easy to fabricate believable videos of people doing and saying things they never did, including having sex. The creator of these deepfakes, who wishes to remain anonymous, used open-source libraries and a deep learning algorithm to collect images of celebrities’ faces and swap them onto performers in porn videos.
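At a high level, the technique reported at the time is an autoencoder face swap: a single encoder learns a shared latent representation of faces, each identity gets its own decoder, and swapping means encoding one person’s face and decoding it with the other person’s decoder. The sketch below is a minimal illustration of that idea in PyTorch; the layer sizes, the 64x64 crops, and the random tensors standing in for aligned face images are assumptions made for illustration, not the anonymous creator’s actual code.

```python
# Minimal sketch of the autoencoder face-swap idea: a shared encoder plus one
# decoder per identity. Swapping = encode person A's face, decode with B's decoder.
# All sizes and inputs here are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),   # assumes 64x64 input crops
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

# Training reconstructs each person's own faces through the shared encoder.
faces_a = torch.rand(8, 3, 64, 64)            # stand-in for aligned face crops
recon_a = decoder_a(encoder(faces_a))
loss = nn.functional.mse_loss(recon_a, faces_a)

# The "swap": encode person A's face, decode it with person B's decoder.
swapped = decoder_b(encoder(faces_a))
print(swapped.shape)  # torch.Size([8, 3, 64, 64])
```

Convincing results reportedly require long training runs on large collections of face images of both people, but the underlying recipe is no more complicated than this.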
The implications of this technology are concerning. It is easy to imagine an amateur programmer creating a sex tape of someone they want to harass, using publicly available images. The creator of these deepfakes acknowledged the ethical implications, stating that every technology can be used with bad motivations and likened it to the technology that recreated Paul Walker’s post-mortem performance in Furious 7. The main difference, he noted, is how easy it is for anyone to do this now.
Experts agree that the implications are significant. Artificial intelligence researcher Alex Champandard emphasised the need for a public debate on how easy it has become to fake images and video. He suggested that researchers develop technology to detect fake videos, helping moderators distinguish what is real from what is not, and argued that internet policy must also improve to regulate these kinds of forgeries and the harassment they enable.
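As a rough illustration of what the frame-level detection Champandard calls for can look like, the sketch below scores individual face crops with a small binary classifier and flags a clip when a large share of its frames looks synthetic. The architecture, the 0.5 threshold, and the random input are placeholder assumptions; deployed detectors are far more sophisticated.

```python
# Sketch of frame-level deepfake detection: classify each face crop as real or
# synthetic, then flag the video if many frames score as fake. The model here is
# an untrained placeholder; a real detector would be trained on labelled data.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),   # logit: higher means "more likely synthetic"
)

frames = torch.rand(16, 3, 64, 64)                     # stand-in face crops from one clip
scores = torch.sigmoid(detector(frames)).squeeze(1)    # per-frame fake probability
fake_ratio = (scores > 0.5).float().mean().item()      # fraction of frames flagged
print(f"Flagged {fake_ratio:.0%} of frames as possibly synthetic")
```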
The UK’s new legislation aims to address this by making it a criminal offence to create sexually explicit deepfake images of adults without consent; it applies to adults because existing laws already cover children. The government is also working on broader measures to protect women and girls from physical, emotional, and online abuse, including new criminal offences for taking or recording intimate images without consent and for installing equipment to do so.
Minister for Victims and Safeguarding Laura Farris highlighted the government’s commitment to protecting women from abuse. The new offence, introduced through an amendment to the Criminal Justice Bill, reinforces this commitment. The bill also includes measures to punish those who take or record intimate images without consent, whether to cause distress or for sexual gratification.
The new law builds on the existing “upskirting” offence, and the government has classified violence against women and girls as a national threat, requiring police to prioritise their response. The first person was recently sentenced under the new cyberflashing offence, which criminalises sending unsolicited explicit images.
The announcement of the new law has been met with support from campaigners and organisations. Cally Jane Beech, a former Love Island contestant, welcomed the new offence, stating that it strengthens laws around deepfakes to better protect women. She emphasised the need for accountability for those who compromise women’s privacy and dignity.
Deborah Joseph, European Editorial Director of GLAMOUR, also welcomed the plans to amend the Criminal Justice Bill. She noted that a recent survey showed 91% of readers believe deepfake technology poses a threat to women’s safety. She called for continued efforts to ensure women feel safe from this activity.
In 2022, the Sexual Offences Act was amended to extend voyeurism offences to cover non-consensual images of breastfeeding. The government is also introducing a statutory aggravating factor for offenders who cause death through abusive, degrading, or dangerous sexual behaviour. This follows recommendations from the Domestic Homicide Sentencing Review, which aims to better reflect the seriousness of domestic homicide in the sentencing framework.
The new law represents a significant step in addressing the misuse of deepfake technology and protecting victims from its harmful effects. By criminalising the creation of non-consensual deepfake images, the UK government is taking a strong stance against this form of abuse and ensuring that offenders are held accountable for their actions.
For all my daily news and tips on AI and emerging technologies, just sign up for my FREE newsletter at www.robotpigeon.be