In early January, there was a great post here on Planet Nude about AI’s anti-nude bias in art. Concerns were raised about AI censorship, though I cautioned about the dangers of AI misusing nudity. On January 26, that concern became real when fake explicit AI images of Taylor Swift surfaced online, alarming the White House and prompting Microsoft CEO Satya Nadella to advocate for AI guardrails. SAG-AFTRA called for a ban on such images. The incident underscores the need for a deeper conversation about body freedom and how nudity is perceived outside safe spaces. It also raises questions about the safety of women in an era when technology outpaces the law, especially for those without extensive support systems. 🪐
Deepfake dangers
In early January, there was a great post here on Planet Nude about AI’s anti-nude bias in art. Many, many, many good points were made, both in the post and in the comments. At the time, I was alone in expressing concern about the derision of AI safety features that censor nudity. I wrote, “When I think of where revenge acts are now, an unchained AI could ruin a person in minutes.”