
Teen Girls and Boys Equally Use AI Nudification Apps, Study Reveals

A new study finds that 55% of surveyed teens, girls as well as boys, have created AI-generated sexualized images, prompting urgent calls for parental awareness and engagement.

A recent study reveals that teenage girls are using nudification apps at rates comparable to their male counterparts. These **artificial intelligence-powered** tools, which allow users to create sexualized images by uploading photographs, have raised alarm among researchers studying technology’s impact on youth. The findings were published Wednesday in the journal **PLOS One** by Dr. Chad M.S. Steel, a digital forensics researcher at **George Mason University**, who specializes in technology-facilitated crimes against children.

Dr. Steel expressed surprise at the results, noting that male adolescents typically engage more in online sexual activities. “Males tend to be more involved in any type of online sexual endeavors, whether it’s sexting or viewing pornographic material or the like; there’s usually a much stronger signal for males than females,” he stated. In January 2025, Steel conducted an online survey involving 557 English-speaking adolescents aged 13 to 17. He found that **55%** of respondents had created a sexualized image, while **54%** reported receiving one. Alarmingly, over a third of teens said they had been victims of non-consensual imagery, with a similar percentage indicating that their own images had been shared without permission.

Interestingly, about **1 in 6** teens reported frequently using nudification tools to see how they would look, and a similar share of girls said they had shared such images “once or twice,” with boys reporting that behavior at a slightly lower rate. Steel did not ask teens why they used these tools, but he suspects that the popularity of apps letting girls visualize clothing and makeup may have created a familiarity that carries over to nudification tools. This could be exacerbated by social pressure from male peers to produce sexually explicit imagery.

Dr. Linda Charmaraman, director of the **Youth, Media, & Wellbeing Research Lab** at **Wellesley College**, reviewed the findings and noted the vulnerability of adolescents in this developmental stage. “Teens are in a delicate developmental period as they form their identities and seek social connection and acceptance. When you combine that time of development with AI, it can bring further risks,” she remarked. According to the study, while boys reported higher usage of generative AI for creating and distributing sexual imagery, both genders are engaging with these tools at significant rates.

Steel expressed a desire for further research on this topic, aiming to replicate the study with a larger sample size. “In this case, I’d love to find out that I had an extremely unusual subset,” he remarked. Charmaraman acknowledged the survey’s nationally representative sample, but raised concerns about the potential bias in how the survey was advertised, which may have attracted more “technology-savvy” participants and skewed the results.

The findings have prompted calls for parents to address the issue of nudification imagery with their children. Steel emphasized that the use of such tools has become normalized among teens, leaving adults unaware of potential consequences. He encouraged parents to engage nonjudgmentally with their children about the risks associated with nudification tools, highlighting that merely focusing on abstention is unlikely to be effective.

Charmaraman recommended ongoing discussions about the digital lives of teens, suggesting that fostering an open line of communication can help parents navigate sensitive issues like non-consensual sharing of AI-generated sexual imagery. Instead of immediately restricting access to apps, she advocates understanding the motives behind such behaviors, such as peer pressure.

Steel also highlighted a widespread misunderstanding among teens about the legality of the imagery they create. Many do not realize that the images they produce can legally constitute child sexual abuse material. However, he noted that teens are unlikely to face legal repercussions if such images are shared consensually among peers. He proposed that policymakers consider a bystander approach, encouraging teens to report instances where peers might be victimized by AI-generated sexual imagery.

Additionally, both Steel and Charmaraman stressed the importance of online safety measures, especially regarding the risk of sextortion. Predators may seek out child sexual abuse material and could use nudification apps to create sexualized images based on publicly available photos. Both experts believe that teens should be educated on safeguarding their digital presence, including keeping social media accounts private and limiting access to trusted followers.

As technology continues to evolve, so too do the challenges faced by young people. The widespread use of nudification tools among teens raises critical questions about the intersection of adolescent development, social pressures, and the implications of artificial intelligence. For those affected by non-consensual imagery, resources are available, including the **Cyber Civil Rights Initiative’s** 24/7 hotline at **844-878-2274**, which offers free, confidential support.

Written by the AiPressa Staff

The AiPressa Staff team brings you comprehensive coverage of the artificial intelligence industry, including breaking news, research developments, business trends, and policy updates. Our mission is to keep you informed about the rapidly evolving world of AI technology.


© 2025 AIPressa · Part of Buzzora Media · All rights reserved.