In early January, there was a great post here on Planet Nude about AI’s anti-nude bias in art. Many, many, many good points were made, both in the post and in the comments. At the time, I was alone in expressing concern about deriding the AI safety features that censor nudity. I wrote, “When I think of where revenge acts are now, an unchained AI could ruin a person in minutes.”
On January 26, headlines broke all over the internet about fake sexually explicit AI images of Taylor Swift. The White House said it was alarmed. Microsoft CEO Satya Nadella responded that guardrails need to be placed around AI technology. SAG-AFTRA released a statement calling for these kinds of images to be made illegal. I haven’t seen these images of Taylor Swift; frankly, I don’t want to, because I can quite easily see in my head what they likely look like.
Addressing the problem
As we have conversations here about body freedom and acceptance, and nuanced conversations about how sexuality interweaves with nudism, the conversation about how nude bodies are treated in the world outside the safety of this community is necessary and lacking. When Time’s “Person of the Year,” fresh off the biggest concert tour of all time across the Americas, a tour that coined its own term, “Swift-onomics,” can be victimized in this way, what does it say for the rest of women? For those of us who don’t have an army of Swifties to defend us and take down the pictures, and for whom the law hasn’t caught up (and likely won’t be able to catch up) with the evolution of technology?
Is one person’s freedom worth the danger presented to another? This is the next step in the dystopian version of AI. These images are not someone’s head pasted onto a body cut from a magazine. AI is now so advanced that reality blurs to the point where the humanity of the other person is lost. When we talk about rape culture, this is what we mean, and it is deeply enmeshed with the naked body.
Taking back the power
Despite all of this, there is a strong antidote in taking back the power. Taylor Swift has certainly had first-hand experience with this. After Kanye West’s music video for “Famous” showed a naked likeness of her lying in bed next to him, Taylor hit back with a nude bodysuit in her “…Ready for It?” video, and it was a stunning visual display of strength.
I believe we all know, to some degree, how powerful it is to claim our own bodies and how harmful it is to have them degraded. The question I put out to all of you: What is your voice, and where is your voice, in all of this?
It doesn’t take a prophet to see that it’s only a matter of time before these issues become part of an agenda. Those behind it will manipulate these issues, cloaking themselves in a guise of righteousness. This will likely lead to more legislation against nudity. If you don’t speak out now to help others, how can you expect to change the narrative when the issue comes around again?
I know this won’t be a popular post. But it’s a necessary one. 🪐
It would be great to take the bull by the horns here and promote a universal law of body autonomy. Such a law would protect your right to your body (against rape, circumcision, anyone other than yourself determining your gender, and much more), your right to be as dressed or as nude as you want, and your right to control the image of your body, like a copyright. If someone else steals your image, it could then be criminalized like a copyright violation, or worse.
Thank you for calling attention to this. This feels like such an obvious perversion of consent and autonomy. Whether the images are real or not, whether digitally manipulated or created by AI, no longer matters if the technology is good enough to make us believe the images might be real, good enough that the people consuming the images don’t care that they’re fake because they look real enough, good enough to violate the privacy and safety of the affected real-world person.