I've been constantly warning of this for 2 years or more now... It's more urgent than ever that the naturist community take a HARD stand against this technology.
Why accept the idea that nude pictures, whether faked or not, demonstrate the moral depravity of the pictured person? Do we take the position that nudity is nothing to be ashamed of, or not?
I happen to believe that there is no shame in nudity. Therefore, it is foolish to pander to the so-called victims of digital undressing and justify claims of deep psychological harm.
Instead, shame the shamers who claim that nude pictures prove moral depravity. Such pictures prove only that the subjects have the same organs as one or the other half of the population, more or less. Who cares?
Rising up in outrage over these pics is ridiculous. AI fakes are serious when they are unlabelled and depict criminal acts, sure, but nude is not lewd, and not criminal. I hope everyone here comes around and agrees. Otherwise, you're asking for meddling where it is not needed, adding costs to taxpayers, and causing real harm in criminal litigation.
I don’t accept that nudity, real or fabricated, signals moral depravity. But I also don’t think this is really about nudity at all. Grok isn’t producing neutral naked bodies in a naturist sense. It’s placing people into sexualized poses and implied scenarios they didn’t choose. Body freedom isn’t freedom to sexualize others. As a naturist, I reject the conflation of nudity and sex, and these deepfakes reinforce exactly that confusion. The harm isn’t from the nudity. It’s from being forced into a sexual context without consent. And that harm becomes far more serious and legally consequential when the subject is a minor.
OK, thoughtful reply, but I still want to challenge the outrage. Sexual activity is also normal and natural, and nothing to be ashamed about. If somebody chooses to shame the pictured parties, then shame on them. If they feel offended, they can look away, but no, they choose to stick their nose deep into other people's business. And frankly, that is really a form of sexual harassment.
So no, shame the shamers. Do not justify them when they bully others with the formidable harms caused by the judicial system. That's institutionalized sexual harassment.
Since the outrage is misplaced, so is the sensitivity to fake images.
Thank you for your thoughtful engagement here, I appreciate the dialogue. I understand the libertarian impulse behind your argument, and I agree that consensual sex and sexual imagery are normal, human, and not shameful. I’m not interested in policing desire or defending moral panic. Where I diverge is in treating this as a matter of individual offense or people refusing to look away. What’s at issue here is a commercial platform selling and distributing a technology that generates nonconsensual sexual imagery and then amplifies it across a global network. That changes the nature of the harm.
The distinction that matters is consent. Consensual sexual expression and coerced sexual representation are not interchangeable, and collapsing them obscures what’s actually happening. When a system takes someone’s personal likeness, imposes sexual meaning onto it without permission, and circulates the result at scale, that constitutes abuse regardless of whether the image is nude, partially clothed, or fully synthetic. Nudity itself isn’t the issue. The platform and the technology that erase consent are.
Pictures taken in public are not consensual, but they are legal. Celebrities do not have legal recourse, nor does anyone else (except in some jurisdictions for children). My issue is with engaging the machinery of the criminal justice system. Insecure people tend to become bullies. Their insecurity becomes harassment when they get assistance in playing the victim card and use the criminal justice system to bully people. There are always some cops who enjoy that, especially young males eager to dominate other men or pander to attractive women. Nudists are targets. So no, we should never legitimize the whining. We do not care if a real picture of us nude is in circulation, and we need to take a stand when people claim psychological harm needing legal remedy, whether for seeing naked people or having their photos digitally stripped. If you happen to have a daughter upset that there’s a naked photo of her circulating at her school, the best response is to tell her to buck up and shame the shamers: “Everybody has those body parts. And by the way, you only need to be concerned with yours. Your interest in my sexual bits is not mutual. It is excessive and, by the way, you suck… at math.” https://www.google.com/search?q=prurient+meaning
You are still missing the point. This is not an either/or situation. The 'shamers' obviously must be dealt with, but the victims of these non-consensual and illegal activities are NOT 'whiners', and are CERTAINLY not abusers and harassers. There is a massive difference between having your photo taken in public and having that photo then turned into a sexual or nude depiction that didn't originally exist. One carries far different expectations than the other. We simply don't live in a society where telling people to ignore abusive fakes of them is in any way a proper response. Not only does this legitimize and embolden the actions of the people creating these images, but it tells the victims that their own autonomy and concerns and rights are not respected. You don't have the right to decide how anyone else should feel about THEIR body and the privacy of it. Period.
Grok is an AI; it will produce whatever you tell it to produce. It doesn't have a mind of its own. I've always said that the more you protect something, the easier it is to use it against the very thing you're protecting. I know there's a lot of this talk about consent now, but the thing is this: yes, if you go to a nude beach, of course everybody will be naked and who cares, but it's clothing optional. Is that really consent? I mean, it is as long as there are fully textile places as well; then you have all the options and you can pick whatever you like.
What I actually don't understand is why people are so obsessed with sex and sexuality. Not in the sense of the joy of it, but in the sense of being so butthurt about it. That is actually what causes the problem: as I already mentioned, the more you protect something, the more it will be used against you. Also, it's very funny that the article once again mentions women and children; who gives a shit about men, right? And no, I don't use AI. I actually hate AI, not because of this at all, but because you don't know anymore what's real and what's not. I have heard music, sadly very good music, made fully by AI, and I've seen videos, mostly these nostalgic '80s vibes that look totally real, so soon we will have movies made by AI. You won't need actors or bands anymore. And that's what I'm pissed about, not somebody making some stupid fake nude or xxx photos of whoever; there's already so much of it out there that it still shocks me. People are so butthurt about seeing whatever. Yet at the same time nobody is asking about the consents we give even to this app, to governments to track your every move, to have digital IDs, to have chat controls, to slowly but surely lose physical money and use only digital. How about we talk about real fucking problems???
I don’t think reporting on this is alarmist or “butthurt.” Multiple countries are actively investigating X right now. Regulators in Europe, the U.K., India, Brazil, and elsewhere are treating Grok as a serious issue involving nonconsensual sexual imagery and child protection. That alone makes it news. Reporting on active investigations isn’t a moral panic, it’s basic literacy about what’s happening in the world. An AI isn’t a wild animal or a force of nature. It’s a product, built and deployed by a company that controls how it works and how far its outputs spread. We regulate media, pharmaceuticals, vehicles, and other technologies precisely because they cause predictable harm at scale. Treating Grok as something that simply “does whatever it’s told” lets the platform off the hook for designing and amplifying abuse.
This also isn’t about being anti-sex. Consensual sex and sexual imagery aren’t the issue. The issue is nonconsensual sexual imagery generated from someone’s personal likeness and broadcast through a global network. Women and children come up repeatedly because they are, by far, the primary victims of sexualized abuse, including digital abuse, and most of that harm is perpetrated by men. The fact that people in body-positive communities disagree so strongly about whether this matters is exactly why it’s worth covering.
There is no shame in CONSENSUAL nudity. If we lived in a naturist utopia, sure, you might have a basic thread of a point... But it would still be erroneous. We live in a society where people lose careers, family and lives over perceived (or real) displays of nudity, sexualized or not. This technology is becoming indistinguishable from reality for most casual viewers, and the last dozen years or so have proven we are now a people who judge first and ask questions later, if at all. So no. This isn't about shame... It's about weaponization and stripping people of their right to autonomy. Even naturists who share other naturists' photos without express permission or credit fall into this category, but doing it to someone who has a reasonable expectation of not falling victim to harassment and hate crimes?
Oops. Hit send accidentally. To finish... That's an especially egregious crime and should be combatted to our fullest ability.
Fully agree
As long-time nudists, my late wife and I had two major rules: respect the lifestyles of others and cause them no embarrassment.
To electronically strip people of their clothing to present them in a sexualised context is vile and despicable. But, for the Musks of this world, the only factor to be considered is whether or not there is a quid in it for them.
Nothing wrong with showing the whole body, but only if the subject consents. The problem is controlling access to platforms. Australia is trying this with child restrictions, but the kids can manipulate platforms much better than adults. It sounds tedious, but is it possible to encourage subjects to sue for heavy damages if their images are manipulated and appear without consent? States could automatically provide legal aid for this irrespective of income, and those reposting would also be liable. I’d like to bankrupt the Musks, but don’t see how this could be done. At least it might get rid of a few of their paying customers.