Meta has taken legal action against the developer of Crush AI, a popular AI “nudify” app, for running numerous ads on its platforms. The lawsuit, filed in Hong Kong, accuses Joy Timeline HK, the company behind Crush AI, of trying to bypass Meta’s ad review process to promote its AI nudify services, and of continuing to place new ads even after Meta repeatedly removed them for policy violations.
According to reports, Crush AI used generative AI to create fake sexually explicit images of people without their consent. In the first two weeks of 2025 alone, the app allegedly ran more than 8,000 ads for its “AI undresser” services on Meta’s platforms, and a significant share of its website traffic reportedly came from Facebook and Instagram. To evade Meta’s ad review processes, the developer created multiple advertiser accounts, rotated domain names, and used misleading account names.
The proliferation of AI undressing apps on social media has raised concerns about user safety, especially for minors. Researchers have observed a surge in links to such apps on platforms like X and Reddit, as well as widespread promotion through ads on YouTube. In response, Meta and TikTok have banned keyword searches for AI nudify apps, but completely removing these services from their platforms remains a challenge.
To combat the spread of AI nudify services, Meta has developed new technology to identify and remove ads for them, even when the ads themselves contain no explicit content. The company has also strengthened its ad monitoring systems to detect and take down copycat ads more quickly, and has expanded its list of flagged terms and phrases. Since the beginning of 2025, Meta has disrupted multiple networks promoting these services.
Beyond its own platforms, Meta is collaborating with other tech companies through the Tech Coalition’s Lantern program to share information about AI nudify apps and help prevent child sexual exploitation online; it has shared thousands of URLs related to these apps with the coalition. On the legislative front, Meta is advocating for laws that empower parents to oversee and approve their teens’ app downloads, and it supports initiatives such as the US Take It Down Act.
Meta’s measures against AI nudify apps reflect its stated commitment to safeguarding user privacy and safety on its platforms. By combining detection technology, industry collaboration, and policy advocacy, the company aims to create a safer online environment as AI-generated content poses new challenges.