Greg Rutkowski, a digital artist known for his epic fantasy style, opposes AI art, but his name and style have been frequently used by AI art generators without his consent. In response, Stability AI removed his work from the training data for Stable Diffusion 2.0. However, the community has since created a tool to emulate Rutkowski’s style against his wishes using a LoRA model. While some argue this is unethical, others justify it since Rutkowski’s art was already widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.
Yes, copies were made. The files were downloaded, one way or another (even as a hash, or whatever digital asset they claim to translate them into), then fed to their machines.
If I go into a Ford plant, take pictures of their equipment, then use those to make my own machines, it’s still IP theft, even if I didn’t walk out with the machine.
Make all the excuses you want; you’re supporting the theft of other people’s life’s work and then trying to claim it’s ethical.
They were put on the Internet for that very purpose. When you visit a website and view an image there a copy of it is made in your computer’s memory. If that’s a copyright violation then everyone’s equally boned. When you click this link you’re doing exactly the same thing.
By that logic I can sell anything I download from the web while also claiming credit for it, right?
Downloading to view != downloading to fuel my business.
No, and that’s such a ridiculous leap of logic that I can’t come up with anything else to say except no. Just no. What gave you that idea?
Because this thread was about the companies taking art, feeding it into their machine, and claiming not to have stolen it.
Then you compared that to clicking a link.
Yes, because it’s comparable to clicking a link.
You said:
And that’s the logic I can’t follow. Who’s downloading and selling Rutkowski’s work? Who’s claiming credit for it? None of that is being done in the first place, let alone being claimed to be “ok.”
Because that is what they’re doing, just with extra steps.
The company pulled down his work, fed it to their AI, then sold the AI as their product.
Their AI wouldn’t work, at all, without the art they “clicked on”.
So there is a difference between me viewing an image in my browser and me turning their work into something for resale under my name. Adding extra steps doesn’t change that.
If you read the article, not even that is what’s going on here. Stability AI:
So none of what you’re objecting to is actually happening. All cool? Or will you just come up with some other thing to object to?
But they did.
(I’m on mobile so my formatting is meh)
They put his art in; only when called out did they remove it.
Once removed, they did nothing to prevent it being added back.
As for whether they’re selling the product or not at this point, they still used the output of his labor to build their product.
That’s the thing: everyone trying to justify why it’s okay for these companies to do it keeps leaning on semantics, legal definitions, or “well, back during the industrial revolution…” to try and get around the fact that what these companies are doing is unethical. They’re taking someone else’s labor, without compensation or consent.
Here is where a rhetorical sleight of hand is used by AI proponents.
It’s displayed for people’s appreciation. AI is not people, it is a tool. It’s not entitled to the same rights as people, and the model it creates based on artists’ works is itself a derivative work.
Even among AI proponents, few believe that the AI itself is an autonomous being that ought to have rights over its own artworks, least of all the AI’s creators.
I use tools such as web browsers to view art. AI is a tool too. There’s no sleight of hand, AI doesn’t have to be an “autonomous being.” Training is just a mechanism for analyzing art. If I wrote a program that analyzed pictures to determine what the predominant colour in them was that’d be much the same, there’d be no problem with me running it on every image I came across on a public gallery.
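For what it’s worth, that kind of analyzer is trivial to write. Here’s a minimal sketch of the “load an image, extract a statistic, discard it” pattern the comment describes, assuming Pillow is installed and using a hypothetical local file name:

```python
from collections import Counter
from PIL import Image  # Pillow

def predominant_colour(path: str) -> tuple:
    """Return the most common RGB value in an image, keeping nothing else."""
    with Image.open(path) as img:
        pixels = list(img.convert("RGB").getdata())
    # Count every pixel value and keep only the single most frequent one.
    colour, _count = Counter(pixels).most_common(1)[0]
    return colour

# Hypothetical usage: the image is opened, summarized, and nothing of it is
# retained beyond a three-number statistic.
print(predominant_colour("some_gallery_image.jpg"))
```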
You wouldn’t even be able to point a camera at works in public galleries without permission. Free for viewing doesn’t mean free to do whatever you want with them, and many artists have made clear they never gave permission for their works to be used to train AIs.
Once you display an idea in public, it belongs to anyone who sees it.
For disclosure I am a former member of the American Photographic Artists/Advertising Photographers of America, and I have works registered at the United States Copyright Office.
When we put works in our online portfolio, or send mailers or physical copies of our portfolios, we’re doing it as promotional work. There is no usage license attached to it. If it’s loaded into memory for personal viewing, that’s fine, since it’s not a commercial application and doesn’t violate the intent of that specific release: viewing for promotion.
Let’s break down your example to help you understand what is actually going on. When we upload our works to third party galleries there is often a clause in the terms of service which states the artist uploading to the site grants a usage license for distribution and displaying of the image. Let’s look at Section 17 of ArtStation’s Terms of Service:
This is in conjunction with Section 16’s opening line:
So when I click your link, I’m not engaging in a copyright violation. I’m making use of ArtStation’s/Epic’s license to distribute the original artist’s works. When I save images from ArtStation that license does not transfer to me. Meaning if I were to repurpose that work it could be a copyright violation depending on the usage the artist agrees to. Established law states that I hold onto the rights of my work and any usage depends on what I explicitly state and agree to; emphasis on explicitly because the law will respect my terms and compensation first, and your intentions second. For example, if a magazine uses my images for several months without a license, I can document the usage time frame, send them an invoice, and begin negotiating because their legal team will realize that without a license they have no footing.
I know this seems strange given how the internet freely transformed works for decades without repercussions. But as you know from sites like YouTube, copyright holders are not fans of people repurposing their works without mutually agreed-upon terms in the form of a license. If you remember the old show Mystery Science Theater 3000, they operated in the proper form: get a license, transform the work, commercialize. In the case of ArtStation, the site agrees to provide free hosting in exchange for the artist providing a license to distribute the work, without terms for monetization unless agreed upon through ArtStation’s marketplace. At every step, the artist’s rights to their work are respected and compensated when the law is applied.
If all this makes sense and we look back at AI art, well…
Training an AI doesn’t “repurpose” that work, though. The AI learns concepts from it and then the work is discarded. No copyrighted part of the work remains in the AI’s model. All that verbiage doesn’t really apply to what’s being done with the images when an AI trains on them; they are no longer being “used” for anything at all after training is done. Just like when a human artist looks at some reference images and then creates his own original work based on what he’s learned from them.
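Purely as an illustration of that “use it, then discard it” pattern — this is not Stability AI’s actual training code, just a toy NumPy sketch with a made-up model and objective — the general shape of a training step looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=16)  # the "model": the only thing that persists between steps

def training_step(image: np.ndarray, weights: np.ndarray) -> np.ndarray:
    # Crude stand-ins for real feature extraction and a real loss function.
    features = image.reshape(-1)[:16] / 255.0
    target = features.mean()                      # made-up objective for this sketch
    prediction = weights @ features
    gradient = (prediction - target) * features   # gradient of a squared-error loss
    return weights - 0.01 * gradient              # only the updated weights are kept

for _ in range(100):
    image = rng.integers(0, 256, size=(8, 8))     # stand-in for a downloaded image
    weights = training_step(image, weights)
    # `image` goes out of scope here; nothing of it remains except its tiny
    # contribution to the weight update.
```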
Copies that were freely shared for the purpose of letting anyone look at them.
Do you think it’s copyright infringement to go to a website?
Typically, ephemeral copies that aren’t kept for a substantial period of time aren’t considered copyright violations, otherwise viewing a website would be a copyright violation for every image appearing on that site.
Downloading a freely published image to run an algorithm on it and then deleting it without distribution is basically the canonical example of ephemeral.
It’s what you do with the copies that’s the problem, not the physical act of copying.