
Victims Demand Stronger Protections as UK Criminalizes Non-Consensual Deepfakes

Victims of deepfake abuse in the UK demand stronger protections as a new law criminalizes non-consensual explicit images, following a petition with 73,000 signatures.

Victims of deepfake image abuse in the UK have intensified their calls for stronger protections against **AI-generated explicit images** following the entry into force of a law criminalizing the creation of **non-consensual intimate images**. Campaigners from **Stop Image-Based Abuse** delivered a petition to **Downing Street** containing more than **73,000 signatures**, urging the government to introduce civil justice routes, such as takedown orders requiring platforms to remove abusive imagery.

“Today’s a really momentous day,” said a victim of deepfake abuse who goes by the pseudonym Jodie. “We’re really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it,” she added.

Among the demands in the petition are calls for improved relationships and sex education, alongside adequate funding for specialist services such as the **Revenge Porn Helpline**, which assists victims of intimate image abuse. Jodie, now in her 20s, discovered deepfake pornography featuring her in **2021**. She and 15 other women testified against **Alex Woolf**, a **26-year-old** convicted of taking images of women from social media and posting them to pornographic websites; he was sentenced to **20 weeks in prison**.

“I had a really difficult route to getting justice because there simply wasn’t a law that really covered what I felt had been done to me,” Jodie remarked, reflecting on her experience. The new offence of creating explicit deepfake images was introduced as an amendment to the **Data (Use and Access) Act 2025**. Although the Act received royal assent in **June 2025**, the offence only came into force recently.

Many campaigners, including Jodie, expressed frustration over the delays in implementing the law. “We had these amendments ready to go with royal assent before Christmas,” she said. “They should have brought them in immediately. The delay has caused millions more women to become victims, and they won’t be able to get the justice they desperately want.”

In January, **Leicestershire Police** launched an investigation into a case involving sexually explicit deepfakes created by **Grok AI**. Madelaine Thomas, a sex worker and founder of the tech forensics company **Image Angel**, also weighed in on the implications of the new law. “It was a very emotional day for me and other victims,” she said. However, she noted that the law does not adequately protect sex workers from intimate image abuse.

“When commercial sexual images are misused, they’re only viewed as a copyright breach. I respect that,” Thomas explained. “However, the proportion of available responses doesn’t match the harm that occurs when you experience it. By discounting commercialized intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need.”

For the past seven years, Thomas has endured the non-consensual sharing of her intimate images almost daily. “When I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that,” she disclosed.

According to the domestic abuse charity **Refuge**, one in three women in the UK has experienced online abuse. The campaign group **Stop Image-Based Abuse** comprises the **End Violence Against Women Coalition**, the victim advocacy group **#NotYourPorn**, **Glamour UK**, and **Clare McGlynn**, a professor of law at **Durham University**.

A spokesperson for the **Ministry of Justice** stated, “Weaponising technology to target and exploit people is completely abhorrent. It’s already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too. But we’re not stopping there. We’re going after the companies behind these ‘nudification’ apps, banning them outright so we can stop this abuse at source.”

Furthermore, the technology secretary confirmed that creating non-consensual sexual deepfakes will be prioritized under the **Online Safety Act**, which will impose additional responsibilities on platforms to prevent such content from surfacing.

Written By AiPressa Staff

The AiPressa Staff team brings you comprehensive coverage of the artificial intelligence industry, including breaking news, research developments, business trends, and policy updates. Our mission is to keep you informed about the rapidly evolving world of AI technology.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.