A recent study reveals that teenage girls are using nudification apps at rates comparable to their male counterparts. These **artificial intelligence-powered** tools, which allow users to create sexualized images by uploading photographs, have raised alarm among researchers studying technology’s impact on youth. The findings were published Wednesday in the journal **PLOS One** by Dr. Chad M.S. Steel, a digital forensics researcher at **George Mason University**, who specializes in technology-facilitated crimes against children.
Dr. Steel expressed surprise at the results, noting that male adolescents typically engage more in online sexual activities. “Males tend to be more involved in any type of online sexual endeavors, whether it’s sexting or viewing pornographic material or the like; there’s usually a much stronger signal for males than females,” he stated. In January 2025, Steel conducted an online survey involving 557 English-speaking adolescents aged 13 to 17. He found that **55%** of respondents had created a sexualized image, while **54%** reported receiving one. Alarmingly, over a third of teens said they had been victims of non-consensual imagery, with a similar percentage indicating that their own images had been shared without permission.
About **1 in 6** teens reported frequently using nudification tools to see how they would look, and a similar share of girls said they had shared such images “once or twice”; a slightly lower percentage of boys reported the same behavior. Steel did not ask why teens used these tools, but he suspects that the popularity of apps letting girls visualize clothing and makeup may have created a familiarity that carries over to nudification tools. This could be compounded by social pressure from male peers to produce sexually explicit imagery.
Dr. Linda Charmaraman, director of the **Youth, Media, & Wellbeing Research Lab** at **Wellesley College**, reviewed the findings and noted the vulnerability of adolescents in this developmental stage. “Teens are in a delicate developmental period as they form their identities and seek social connection and acceptance. When you combine that time of development with AI, it can bring further risks,” she remarked. According to the study, while boys reported higher usage of generative AI for creating and distributing sexual imagery, both genders are engaging with these tools at significant rates.
Steel expressed a desire for further research on this topic, aiming to replicate the study with a larger sample size. “In this case, I’d love to find out that I had an extremely unusual subset,” he remarked. Charmaraman acknowledged the survey’s nationally representative sample, but raised concerns about the potential bias in how the survey was advertised, which may have attracted more “technology-savvy” participants and skewed the results.
The findings have prompted calls for parents to discuss nudification imagery with their children. Steel emphasized that the use of such tools has become normalized among teens, while many adults remain unaware of the potential consequences. He encouraged parents to engage nonjudgmentally with their children about the risks of nudification tools, noting that a message focused solely on abstention is unlikely to be effective.
Charmaraman recommended ongoing discussions about the digital lives of teens, suggesting that fostering an open line of communication can help parents navigate sensitive issues like non-consensual sharing of AI-generated sexual imagery. Instead of immediately restricting access to apps, she advocates understanding the motives behind such behaviors, such as peer pressure.
Steel also highlighted teens’ misunderstanding of the legality of the imagery they create: many do not realize that the images they produce can legally constitute child sexual abuse material. He noted, however, that teens are unlikely to face legal repercussions if such images are shared consensually among peers. He proposed that policymakers consider a bystander approach, encouraging teens to report instances in which peers may be victimized by AI-generated sexual imagery.
Additionally, both Steel and Charmaraman stressed the importance of online safety measures, especially regarding the risk of sextortion. Predators may seek out child sexual abuse material and could use nudification apps to create sexualized images based on publicly available photos. Both experts believe that teens should be educated on safeguarding their digital presence, including keeping social media accounts private and limiting access to trusted followers.
As technology continues to evolve, so too do the challenges faced by young people. The widespread use of nudification tools among teens raises critical questions about the intersection of adolescent development, social pressures, and the implications of artificial intelligence. For those affected by non-consensual imagery, resources are available, including the **Cyber Civil Rights Initiative’s** 24/7 hotline at **844-878-2274**, which offers free, confidential support.