Victims of deepfake image abuse in the UK have intensified their calls for stronger protections against **AI-generated explicit images** following the recent enforcement of a law criminalizing the creation of **non-consensual intimate images**. Campaigners from **Stop Image-Based Abuse** delivered a petition containing over **73,000 signatures** to **Downing Street**, urging the government to introduce civil justice routes, such as orders requiring platforms to take down abusive imagery.
“Today’s a really momentous day,” said a victim of deepfake abuse, who identified herself as Jodie, using a pseudonym. “We’re really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it,” she added.
Among the demands in the petition are calls for improved relationships and sex education, alongside adequate funding for specialist services such as the **Revenge Porn Helpline**, which assists victims of intimate image abuse. Jodie, who is in her 20s, discovered deepfake pornography featuring her in **2021**. She and 15 other women testified against **Alex Woolf**, a **26-year-old** who was sentenced to **20 weeks in prison** after being convicted of posting images of women taken from social media to pornographic websites.
“I had a really difficult route to getting justice because there simply wasn’t a law that really covered what I felt had been done to me,” Jodie remarked, reflecting on her experience. The new offence of creating explicit deepfake images was introduced as an amendment to the **Data (Use and Access) Act 2025**. Although the Act received royal assent in **June 2025**, enforcement of the new offence only began recently.
Many campaigners, including Jodie, expressed frustration over the delays in implementing the law. “We had these amendments ready to go with royal assent before Christmas,” she said. “They should have brought them in immediately. The delay has caused millions more women to become victims, and they won’t be able to get the justice they desperately want.”
In January, **Leicestershire Police** launched an investigation into a case involving sexually explicit deepfakes created using **Grok AI**. Madelaine Thomas, a sex worker and founder of the tech forensics company **Image Angel**, also weighed in on the implications of the new law. “It was a very emotional day for me and other victims,” she said. However, she noted that the law does not adequately protect sex workers from intimate image abuse.
“When commercial sexual images are misused, they’re only viewed as a copyright breach. I respect that,” Thomas explained. “However, the proportion of available responses doesn’t match the harm that occurs when you experience it. By discounting commercialized intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need.”
For the past seven years, Thomas has endured the non-consensual sharing of her intimate images almost daily. “When I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that,” she disclosed.
According to the domestic abuse charity **Refuge**, one in three women in the UK has experienced online abuse. The campaign group **Stop Image-Based Abuse** comprises the **End Violence Against Women Coalition**, the victim advocacy group **#NotYourPorn**, **Glamour UK**, and **Clare McGlynn**, a professor of law at **Durham University**.
A spokesperson for the **Ministry of Justice** stated, “Weaponising technology to target and exploit people is completely abhorrent. It’s already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too. But we’re not stopping there. We’re going after the companies behind these ‘nudification’ apps, banning them outright so we can stop this abuse at source.”
Furthermore, the technology secretary confirmed that non-consensual sexual deepfakes will be treated as a priority under the **Online Safety Act**, which places additional duties on platforms to prevent such content from appearing.