Apple and Google have removed numerous "nudify" apps, which let users exploit artificial intelligence to create nude images of real people, from their app stores. The Tech Transparency Project (TTP) reported that its investigation uncovered 55 such apps on Google Play and 47 on the App Store. After TTP alerted both companies, Apple removed 28 of the apps and warned developers that further violations could lead to more removals.
Two of the applications were later reinstated after being brought into compliance with guidelines. A Google spokesperson confirmed that the company had suspended several apps and is reviewing the situation in light of the report. TTP researchers criticized both companies for hosting applications that can transform innocent photographs into sexually explicit images without consent.
“Both firms claim to care about user safety, yet they host apps that can turn a harmless photo of a woman into an offensive sexual image,” TTP stated in their findings.
TTP identified the apps by searching for terms such as "nudify" and "undress" and by running tests with AI-generated images. Its analysis found two types of services: those that render images of women without clothing and those that superimpose faces onto explicit photos. Katie Paul, the director of TTP, emphasized the malicious intent behind these apps.
“It is clear these are not just ‘clothing change’ apps. They are clearly designed to sexualize individuals without their consent,” Paul said.
Among the identified apps, 14 originated in China, raising additional concerns. Paul pointed out that China’s data storage laws grant the government access to any company’s information, meaning that if deepfakes were created using an individual’s image, that data could potentially be in the hands of authorities.
“China’s data storage laws mean the government has the right to access any company’s information anywhere in the country. So if someone created deepfakes with your image, they are now in the hands of the authorities,” Paul added.
The misuse of artificial intelligence has facilitated the creation of deepfake pornography, prompting legal actions. In January, the chatbot Grok faced backlash over a similar feature and subsequently disabled its capability to generate explicit images of real people. Furthermore, in August 2024, San Francisco City Attorney David Chiu’s office filed a lawsuit against the owners of 16 websites that allow users to undress women and girls in photos using AI without consent. This lawsuit cites violations of both state and federal laws pertaining to deepfake pornography and child sexual abuse material.
“Generative AI holds great promise, but as with all new technologies, there are unforeseen consequences and criminals looking to exploit new technology for their own ends. We must be clear that this is not innovation — it is sexual violence,” Chiu stated.
The websites implicated in the lawsuit provide user-friendly platforms for uploading photos and generating realistic pornographic versions of them. The resulting images are often nearly indistinguishable from real photographs and are reportedly used for extortion, intimidation, and humiliation. In September 2024, Microsoft announced a partnership with the organization StopNCII to combat deepfake pornography in its Bing search engine.
The actions taken by Apple and Google reflect growing concern over the implications of AI technology for personal privacy and consent. As AI capabilities advance rapidly, regulators and technology firms alike are grappling with how to enforce ethical standards and protect individuals from misuse. As the debate over AI's potential and its dangers continues, the need for comprehensive safeguards becomes increasingly urgent.