The fake nudity crisis
AI “nudify” tools are violating the human body and distorting the meaning of nudity

In 2020, British deepfake researcher Henry Ajder stumbled across something alarming. A Telegram bot was circulating in teen circles across Europe, offering to strip the clothes off anyone in a photo. All it needed was an upload—often pulled from social media—and in seconds it would return a synthetic nude of the subject. Most of the images featured young women. Many were of girls under 18. No one had consented. No one even knew it was happening.
By the time researchers began tracking its spread, the bot had processed more than 100,000 images. Dozens of copycat bots soon followed, reaching millions of users.1 Telegram ultimately removed many of them, but by then, the model was clear: artificial nudity had become a viral tool of control, distributed for free and optimized for abuse.
Five years later, this “nudify” technology has matured into a dark, sprawling ecosystem. According to new reporting by WIRED, a network of 85 nudify websites is drawing over 18.5 million visitors per month and may be generating up to $36 million a year.2 Their content is entirely nonconsensual—AI-generated nude and pornographic images of people who never chose to be naked, never posed, never agreed.
Let’s be clear: this isn’t nudity. It’s digital sexual violence. And for those of us who believe in body freedom and nude expression, this moment calls for moral clarity. Because if the world can’t tell the difference between nudity and exploitation, the very idea of ethical, chosen nudity is at risk of being erased.
Inverting the naturist premise
Naturism at its core is built on a deceptively radical idea: that the human body, uncovered, need not be shameful or objectified. Nudity, in a naturist context, is an act of presence and trust, not provocation. Opting in is fundamental. It’s about creating space for the naked body to represent nothing more than a person being themselves, with no threat of being viewed as an invitation.
Nudify tools offer the inverse. They strip away autonomy. Their premise is not presence or acceptance, but power: the ability to control and digitally disrobe someone without their consent—to manufacture an illusion of exposure and call it truth. From inception, this is abusive. But in practice, it becomes even more dangerous, as these synthetic images are used not only for private gratification but also for blackmail, harassment, and extortion. Even in cases where the images are kept private, it’s still someone’s likeness—used without consent, often without their knowledge.
In Ohio, a 37-year-old man named James Strahler was recently arrested for using AI to generate fake pornography of at least 10 women, including ex-girlfriends and their family members. He sent the images to their coworkers and relatives. In some cases, he demanded real nude images in exchange for silence. In others, he simply used the images as threats. He even manipulated photos of a victim’s child.3
What’s especially troubling is that this is no longer a fringe tool used by bored teens or 4chan trolls. It’s a global business, backed by major tech infrastructure. Its growth has been amplified by mainstream social media platforms hosting its ads and boosting its reach.
The trouble with big tech
A recent investigation by Indicator found that many of the most popular nudify sites are built on services from Google, Amazon, and Cloudflare. Dozens use Google’s single sign-on tools and host their payment systems with major processors, while raking in thousands through credits and subscriptions.4
Meta’s role in the spread of nudify tools illustrates just how negligent mainstream tech platforms have been in enabling this abuse. Despite clear violations of its policies, Meta served thousands of ads promoting AI “nudify” apps across Facebook, Instagram, and Threads—ads that often targeted men with tools to digitally undress women without consent.5
In February, Senator Dick Durbin publicly condemned Meta’s inaction in a letter to CEO Mark Zuckerberg, calling it a “perverse abuse” of its platforms and demanding accountability for the company’s role in enabling digital sexual exploitation.6 Months later, a CBS News investigation exposed the scope of the problem, revealing that hundreds of nudify app ads had run on Meta’s platforms, many still active even after removal efforts began. Only after this wave of public and political scrutiny did Meta file a lawsuit in June against the Hong Kong-based developer of CrushAI, one of the apps that had repeatedly circumvented Meta’s ad review system.7 The case highlights a broader failure: tech companies haven’t merely failed to stop the spread of abusive deepfake tools—they’ve profited from hosting and promoting them.
This is not the future. This is now. And the victims are often everyday women—teachers, students, moms, minors—whose faces were scraped from public profiles and pasted onto bodies not their own. Even those who have never shared nude images are vulnerable. That’s the core horror of this technology: you don’t have to do anything for it to find you. When nudity becomes something that can be done to you, it stops being about freedom. It becomes a threat. And it chills the cultural space where naturism needs warmth to thrive.
The cultural cost
If the dominant narrative around nudity is one of violation—if the public comes to associate “nudity” with harm, manipulation, and synthetic pornography—then all forms of body freedom suffer.
Naturists have long argued that nudity can be ordinary. That the sight of a human body need not be sexualized or objectified. But these AI tools make that argument harder to win. They collapse the distinction between nude and naked, between revealing and stripping, between shared vulnerability and coerced exposure.

Even platforms that claim to protect “real” nudity—like artistic photography or naturist content—often fail to draw this line clearly. As any online naturist will confirm, it’s incredibly common for genuine naturist posts to be flagged or removed on social media, while exploitative deepfake nudity is monetized in the background. In trying to regulate “nude content,” platforms often penalize authenticity while enabling abuse.
So what can naturists say in this moment?
We can say that nudity, in its truest form, is not about seeing but being seen—with permission. That a world of synthetic bodies, generated for profit and humiliation, is not the future we want. That real nudity demands real consent.
More importantly, we can model the difference. We can continue to show what body autonomy looks like when it’s not stolen but shared. We can insist that the human form deserves respect, not because it’s hidden, but because it belongs to someone.
The naturist tradition is rooted in consent, equality, and human dignity. Those values have never mattered more. If we do not defend them now, we risk losing the cultural space to assert them later. 🪐
Caramela, S. (2024, October 16). ‘Nudify’ deepfake bots on Telegram are up to 4 million monthly users. VICE. https://www.vice.com/en/article/nudify-deepfake-bots-telegram/
Burgess, M. (2025, July 14). AI ‘nudify’ websites are raking in millions of dollars. WIRED. https://www.wired.com/story/ai-nudify-websites-are-raking-in-millions-of-dollars/
Cole, S. (2025, June 26). A deepfake nightmare: Stalker allegedly made sexual AI images of ex-girlfriends and their families. 404 Media. https://www.404media.co/deepfake-harassment-ohio-undress-clothoff-nudify-apps/
(Burgess, 2025).
Weatherbed, J. (2025, June 12). Meta cracks down on nudify apps after being exposed. The Verge. https://www.theverge.com/news/685985/meta-lawsuit-crushai-nudify-app-ads
Durbin, R. (2025, February 11). Durbin presses Zuckerberg on Meta's role directing traffic to problematic nudify app [Press release]. United States Senate Committee on the Judiciary. https://www.judiciary.senate.gov/press/dem/releases/durbin-presses-zuckerberg-on-metas-role-directing-traffic-to-problematic-nudify-app
Maiberg, E. (2025, June 12). Meta sues nudify app that keeps advertising on Instagram. 404 Media. https://www.404media.co/meta-sues-nudify-app-that-keeps-advertising-on-instagram/
Very interesting article, Evan. Just today, we were informed by a friend that their daughter’s football coach was arrested for using AI to create nude photos of underage girls and post them on the dark web. His daughter was one of them.
And then we see a sitting president post a fake AI video of a former president being arrested. There are evil people in the world.
These aren’t just isolated incidents. They represent two of the most dangerous faces of AI abuse: the sexual exploitation of images without consent, and the political manipulation of reality for personal gain. This is what AI can do now… and it’s only getting easier to misuse.
Today, any image or video you’ve ever shared—even one that’s fully clothed, family-friendly, and entirely innocent—can be run through AI that strips the clothing away. A beach photo. A summer hike. A birthday party video. If you’re in it, someone can now make a fake nude of you, and no one else may know the difference.
It doesn’t matter if you’re a naturist, a public figure, or just someone who posted a happy vacation memory on Instagram. AI doesn’t need your consent to undress you.
This is the very thing naturists have spent years fighting against. The idea that nudity can be stolen, sexualized, or used as a weapon. And now, thanks to AI, the tools to do that are sitting in people’s pockets.
But... we’ve also been thinking about this from a nudism or naturism perspective. And we can’t help but wonder: Could this scary technology accidentally free some people from the fear of being seen nude?
If anyone can have a fake nude made of them, whether they’re naturists or not, then what exactly is anyone hiding from anymore? The threat isn’t what you’ve actually done. The threat now comes from what someone else can fabricate.
And suddenly, simply being open about your real-life naturism... starts to seem a lot less risky.
People even use #naturist #nudist when they have swimsuits on.