Midjourney’s intolerance of nudity is anti-art
If A.I. art is here to stay, the censorship must be addressed
“The body always expresses the spirit whose envelope it is. And for him who can see, the nude offers the richest meaning.” —Auguste Rodin
“Nudity is a problem for Americans. It disrupts our social exchange.” —Eric Fischl
Have you heard of Midjourney? If not, don’t click away until you read the next sentence. Midjourney is one of several popular AI art generators: it uses machine learning to create images from descriptive text prompts. It is a popular tool among some artists (and wannabe artists), and hotly disavowed by others. There are valid arguments to be had about many facets of this technology, but let me state up front that this article is not seeking to weigh in on the ethical implications of AI art; like it or not, the technology is here to stay. Instead, this article examines and challenges the inherent biases of Midjourney's AI platform and the implications of its anti-nudity policies for artists and the broader public.
Midjourney's content policy, as stated on its website, bans any content deemed "inherently disrespectful, aggressive, or otherwise abusive." This includes violence, gore, adult content, and anything that could be viewed as racist, homophobic, or derogatory toward a community. To prevent non-PG-13 images, Midjourney has implemented a system that automatically filters banned words and words similar to them. Some of these banned words are to be expected: terms that might produce images resembling child pornography or explicit gore, for instance. But in an attempt to nip pornography in the bud, the platform also bans many benign terms loosely related to nudity. Actual banned words of this kind include: bare chest, bra, clear, cleavage, full frontal, invisible clothes, lingerie, naked, negligee, no clothes, no shirt, pleasure, nude, risque, scantily clad, stripped, unclothed, wearing nothing, with no shirt, without clothes on, zero clothes.
While this is by no means a comprehensive list (it changes frequently), it is easy to see how censoring words like these builds in biases that unfairly target depictions of the female body and nonsexual human nudity.
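To make the mechanism concrete, here is a minimal sketch of how a blunt keyword filter like this typically behaves. Midjourney has not published its implementation, so the code below is purely hypothetical; the term list is a sample of the banned words quoted above.

```python
# Hypothetical sketch of a keyword-based prompt filter; Midjourney's
# actual system is not public. The point is the bluntness of the
# approach: matching is purely lexical, with no sense of context.
import re

# A sample of terms the article lists as banned.
BANNED_TERMS = [
    "bare chest", "bra", "cleavage", "full frontal", "lingerie",
    "naked", "negligee", "no clothes", "nude", "risque",
    "scantily clad", "unclothed", "wearing nothing",
]

def is_blocked(prompt: str) -> bool:
    """Return True if the prompt contains any banned term as a whole word or phrase."""
    text = prompt.lower()
    return any(
        re.search(r"\b" + re.escape(term) + r"\b", text)
        for term in BANNED_TERMS
    )

print(is_blocked("a classical nude study in the style of Rodin"))  # True
print(is_blocked("a figure at a bra fitting, fully clothed"))      # True
print(is_blocked("a portrait of a woman in evening dress"))        # False
```

A filter like this cannot tell an art-historical request from a pornographic one; both contain the word "nude," so both are rejected.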
Midjourney's anti-nudity policies rest on a narrow and outdated view of what constitutes "appropriate" content. The company's rules state that users should "avoid making visually shocking or disturbing content," but it is not clear what qualifies as "shocking" or "disturbing." The blanket ban on all nudity is oppressive and heavy-handed, and the vagueness of the policy has led to numerous images being flagged or removed for reasons that are never entirely clear.
While Midjourney's policies may seem well-intentioned, they perpetuate societal biases and shame around nudity. This censorship not only limits the artistic freedom of creators but also reinforces harmful norms surrounding the human body, and can even amplify those biases in terrifying ways.
One frightening example of this effect appears in an article shared on slashdot.org on September 8, 2022, titled "Horrifying Woman Keeps Appearing In AI-Generated Images."
The article concerns an AI artist known as Supercomposite, who discovered the image of a woman, dubbed "Loab," recurring in AI-generated images produced by certain queries. The artist used a technique called negative prompt weights, which tells the AI system to generate the opposite of whatever is typed into the prompt. When negative weights were applied to certain words, the image of Loab appeared. The images went viral on social media, prompting speculation about what could be causing the phenomenon. Supercomposite claims that images derived from Loab's image almost universally veer into the realm of horror, graphic violence, and gore. Supercomposite did not say which model produced the original images, but confirmed that Loab exists in multiple image-generation AI models.
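For readers unfamiliar with the technique, here is a hedged sketch of what a negative prompt weight can mean inside a diffusion-style generator. Supercomposite did not identify the models involved, and real systems differ in their details; the arrays below are toy stand-ins for a denoising network's predictions at one sampling step.

```python
import numpy as np

def guided_noise(noise_uncond, noise_cond, weight):
    """Classifier-free guidance, the mechanism behind prompt weights
    in many diffusion models. weight > 0 pulls the image toward the
    prompt; weight < 0 (a negative prompt weight) pushes it toward
    whatever the model treats as the prompt's opposite."""
    return noise_uncond + weight * (noise_cond - noise_uncond)

# Toy 1-D stand-ins for the model's unconditional and prompt-conditioned
# noise predictions (real latents are image-shaped tensors).
noise_uncond = np.array([0.0, 0.0])
noise_cond = np.array([1.0, -1.0])

print(guided_noise(noise_uncond, noise_cond, 7.5))   # steer toward the prompt
print(guided_noise(noise_uncond, noise_cond, -1.0))  # steer away from it
```

Steering away from a prompt drives the sampler into poorly mapped regions of the model's learned distribution, which is exactly where an artifact like Loab can live.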
The appearance of "Loab" in AI-generated images raises a number of implications related to body biases, shame, and phobias. The AI model may be reflecting biases in the data it was trained on: if the training data consists mostly of images of people who conform to a certain standard of beauty or body type, the model may generate images that reflect those standards, perpetuating harmful stereotypes and reinforcing societal biases. The emergence of Loab could also reflect societal attitudes toward women and the female body. That the generated images of Loab are described as disturbing and grotesque, and feature graphic violence and gore, could be seen as a manifestation of societal shame and phobias surrounding women's bodies and the female form.
Another article, published on theverge.com in July 2022 and titled "An Experimental Horror ARG Is Testing the Boundaries of AI Art," highlights the issues of censorship and moralism that have arisen around AI art platforms such as Midjourney. Rob Sheridan, creator of the alternate reality game (ARG) "Year Zero" and the "VIIR" project on Twitter, describes his experience with the Midjourney bot, which initially sparked his imagination but eventually forced him to get creative, because certain key conceptual terms were banned by the platform. This censorship, along with the content restrictions on other private AI art platforms, shapes the way people can tell stories with these text-to-image generators. There is also an undercurrent of moralism in art discourse today that echoes the puritanical dystopia of "Year Zero," where art is a form of resistance and the authorities crack down on "disobedience" and "subversive" materials. This moralism is a concern because it limits the potential of these platforms and the creativity of the artists using them.
The moralism that pervades AI art technology reflects society's attitudes toward nudity, which are often shaped by cultural and historical perspectives. In Midjourney's case, the platform's strict rules around nudity were implemented to avoid potential legal issues and controversies. The result, however, is censorship: nudity is not allowed on the platform even when it is artistic expression, limiting artists' ability to express themselves fully and creating a barrier for anyone who wishes to use nudity in their work.
Of course, the companies that produce these AI models are well aware of the problems with algorithmic bias, and of the ways models can perpetuate and amplify societal biases and stereotypes when they are not trained on diverse and representative data. It is important for researchers and companies to address these biases during the development and training of AI models to minimize potential harm.
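What that looks like in practice varies, but one simple first step is auditing how often body- and identity-related descriptors occur in a training corpus's captions. The sketch below is illustrative only; the descriptor list and toy corpus are hypothetical.

```python
from collections import Counter

# Illustrative descriptor list; a real audit would use a vetted taxonomy.
DESCRIPTORS = ["young", "old", "thin", "fat", "pale", "dark"]

def descriptor_counts(captions):
    """Count occurrences of each descriptor across caption strings."""
    counts = Counter({d: 0 for d in DESCRIPTORS})
    for caption in captions:
        tokens = caption.lower().split()
        for d in DESCRIPTORS:
            counts[d] += tokens.count(d)
    return counts

# Toy corpus; real training sets run to billions of captions.
captions = [
    "a young thin woman on a beach",
    "portrait of a young model with pale skin",
    "a young dancer mid-leap",
]
print(descriptor_counts(captions))
# e.g. Counter({'young': 3, 'thin': 1, 'pale': 1, 'old': 0, 'fat': 0, 'dark': 0})
```

A skew like this, three "young" for every zero "old," is exactly the kind of imbalance a generator will reproduce and amplify.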
Art has a long history of using nudity to challenge the norms and restrictions set by the powers that be. Artists and activists can, and should, resist Midjourney's (and others') anti-nudity biases, challenge these restrictive policies vocally with the platforms themselves, and push for a more inclusive and diverse representation in AI art. 🪐
I'm not a big Ayn Rand fan, but she said a few good things. One was that the way artists portray the human body betrays what a culture thinks humans are. (I always imagine she was complaining about de Kooning.) Kenneth Clark pointed out something about the nude in art: "The nude does not simply represent the body, but relates it, by analogy, to all structures that have become part of our imaginative experience." Put those two ideas together with the alienation from nature represented by our alienation from our naked bodies, and you get a perverse and destructive mass "imaginative experience" that goes far beyond the images these platforms censor.
This is a perfect example of the old coding maxim: garbage in, garbage out.