April 8, 2024
Discourse on Consent: Artists Against AI
by Jessica Lu, SASC Volunteer
If you are reading this, I hope you know what consent means (if not, that’s okay! Learn about consent here). Whether it’s consenting to sex, asking to take someone’s photo, or accepting a follow request on your private Instagram, knowing how to navigate consent is an asset in everyday social interactions.
When I first volunteered at the SASC, we had a discussion on the violation of consent in the context of colonization (that would make a great blog post for another time). It got me thinking: what other areas of my life, and of contemporary history, are deeply tied to consent? As an artist, my mind went to the “Artists Against AI” movement online. In brief, this movement emerged in 2022–2023, when AI art programs surged in popularity among the general public. Many artists began recognizing aspects of their own work in these so-called “AI artworks” and protested the fact that AI programs sample from digital artworks without the original artists giving consent, receiving compensation, or being credited (the “three C’s”). I began pondering how exactly consent is violated in the generation of AI art and learned that common AI art programs operate on the basis of art theft.
To better understand how AI art is weaponized against human artists, let’s imagine a scenario:
At a shallow level of usage, say you want to see how Van Gogh would paint you in his style, solely for entertainment. You click “generate image” and have a good laugh at what the AI program pops out for you. Maybe, next, you post the image on your Instagram and caption it with “AI Van Gogh paints me!” Nothing wrong with crediting AI, right?
What if you leave out the fact that the image was created by AI? Some people may still be able to tell, but others might assume that you were the one who painted it. If you’re posting on an account with a larger following, it’s very likely that your audience will believe you commissioned someone, or that you drew yourself in Van Gogh’s style. Is this wrong?
People can generate many AI images, post them online, and easily claim them as their own. Of course, this isn’t limited to “self portraits in Van Gogh’s style”— when prompted with more generic keywords, AI draws on an even larger training dataset that includes legitimate artworks and designs by human artists. Not only can AI imitate particular styles of art, it can also reproduce near-exact elements of one drawing inside another. When AI users generate images, they’re not simply inputting words into a magic creative machine that produces original artworks— they are contributing to the theft and plagiarism of real, human artists who put time and effort into their work.
“AI art, at first glance, may look like a fun tool that makes art accessible to all. The problem that lies beneath this fun tool is the ethics behind it… Just because art is free to view on the Internet doesn’t mean it is free to use, but AI generators ignore this, which becomes even more of a problem when so-called ‘AI artists’ post their generated content on the Internet as if it were their own work, even going as far as taking commissions, selling the generated work, and so on. AI art would be fine if there were a way for artists to give their consent to their art being used to train the AI, and as long as we could trace the work back to the artist and give them credit… As a professional artist, knowing that someone may be using my art and selling it while I paid for my art education is very painful. Living as an artist is already hard these days, especially with inflation and the cost of living, so the rise of ‘AI artists’ is another obstacle to having a decent everyday life.”
—Maeve (@atelier_de_taffy), Independent Artist and WEBTOON France colourist
Some good news is that people have taken the “Artists Against AI” movement very seriously and have come up with ingenious methods to protect artists from having their work stolen in the future. One example is a tool called Nightshade, which lets you “shade” a copy of your digital art before posting it online: it makes subtle, pixel-level changes that are invisible to the human eye but act as poison for AI training. If an AI program trains on Nightshade-treated images, it learns distorted associations and ends up producing visual gibberish. Others have attempted to file lawsuits against AI companies (with mixed results).
As I explored this topic, I began to see how AI art theft parallels sexualized violence. There are certainly direct connections between AI and sexualized violence (e.g. AI-generated revenge porn, images made to fulfill sexual fantasies without the subject’s consent, etc.), but even deeper themes are present. For instance, how does AI replicate misogyny? How does the creation of AI art decontextualize art from its original meanings? If you put a lot of effort into creating something, how would you feel if someone reaped the benefits of it without your knowledge or consent?
In particular, I thought a lot about the question of the decontextualization of art, and it really creeped me out. There’s this idea of “separating the art from the artist”… but in fact, AI art strips the artist away from their art. Art without meaning—without human intention—is, to me, just a bunch of randomly generated pixels. When you take away an artist’s unique ability to create something and give it to a machine, what are you really creating? There is no context, aside from the few words you prompt the AI program with. Just as sexualized violence is often weaponized against victims by taking the situation out of context, the full autonomy of artists is taken away through AI art theft.
Another theme I’ve noticed emerging from discourse on both topics is victim blaming. Supporters of AI art often rationalize their standpoint by telling human artists that AI is the future of technology, and that if they can’t keep up with the competition, it’s their own fault. AI supporters tend not to view the situation from the artists’ perspective or take it seriously (e.g. “Art isn’t a good career anyways”). Similarly, unless they’ve experienced it themselves, it’s extremely difficult for bystanders to understand how drastically one’s life can be affected by sexualized violence and how trauma can emerge from a single instance. In an ideal world, no one would have their livelihood threatened—whether by AI or by perpetrators of violence—but we live in a reality where these things do happen, and victims are told to shoulder the responsibility of protecting themselves.
I wrote this blog post in an attempt to shed light on how drawing broad connections between two different topics can change the way we understand each one’s struggles and solutions. The way we engage with AI art, our attitudes towards it, and our attitudes towards the perpetrators and victims will ultimately shape the future of digital art and the related discourse on consent.