Do AI Art Tools Break Copyright Laws? Two New Lawsuits Will Find Out.

An illustration of a robot painting on a canvas.

This was presumably drawn by a human being.

To build an AI art generator, engineers train their algorithms on large databases of photos, drawings or graphics. A lot of the most popular AI art tools got their databases by scraping content from the web, often without explicit permission from the artists who created the images. Now the artists are asking: Did the algorithms violate copyright law?
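
That scraping step is worth spelling out. Below is a minimal, hypothetical sketch in Python of how an image-caption dataset gets assembled from a web crawl: it reads a list of image URLs and captions, downloads each file, and stores the pair as a training example. The CSV layout and file names are assumptions for illustration, not any specific company's pipeline, but notice that nothing in the loop checks who owns an image or whether the artist consented.

```python
# Hypothetical sketch: assemble an image-caption dataset from scraped links.
# The CSV file and folder names are invented for illustration.
import csv
import os

import requests

os.makedirs("images", exist_ok=True)

# Assumed input: rows of (image_url, caption) gathered by a web crawler.
with open("scraped_links.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

dataset = []
for url, caption in rows:
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        continue  # skip dead links; no ownership or consent check happens anywhere
    path = os.path.join("images", f"{len(dataset):08d}.jpg")
    with open(path, "wb") as out:
        out.write(resp.content)
    dataset.append({"file": path, "caption": caption})

# These (image, caption) pairs are the raw material a text-to-image model
# would later be trained on.
```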

The courts don’t have an answer, but they will soon. The AI art industry now faces two lawsuits, one from artists in the US and the other from Getty Images in the UK, arguing that AI art generators stole billions of images in violation of intellectual property rights. It’s the beginning of a wave of legal action that will shape the future of AI tech.

Three artists filed a class-action lawsuit in San Francisco against Stability AI, Midjourney, and DeviantArt over their AI art tools. Stability AI faces a second case filed by Getty in the UK.

“AI image products are not just an infringement of artists’ rights; whether they aim to or not, these products will eliminate ‘artist’ as a viable career path,” according to a statement from the Joseph Saveri Law Firm, which is representing the artists.

A statement from Getty Images isn’t quite as sour on AI, suggesting that “artificial intelligence has the potential to stimulate creative endeavors.” But Getty’s legal claim is similar. Stability AI “chose to ignore viable licensing options and long-standing legal protections in pursuit of their standalone commercial interests,” the company said.

“Anyone that believes that this isn’t fair use does not understand the technology and misunderstands the law,” said a Stability AI spokesperson, in reference to the artists’ class-action lawsuit. The company declined to comment on the Getty Images lawsuit, saying it hasn’t received any documents about the case. Midjourney and DeviantArt did not immediately respond to requests for comment.

It’s likely that this is just the beginning of a drawn-out legal battle, with a lot more cases on the horizon. Though you might count on US courts to lean in favor of corporate interests, there are giant corporations on both sides of this fight. As of now, the owners of the source material in the enormous databases that form the building blocks of AI tools aren’t compensated or even consulted.

AI content generators could replace large swaths of human labor, and a number of companies are already turning to AI tools as a cheap or free alternative to flesh-based content creators. The publication CNET, for example, has been publishing articles written by ChatGPT for months. (OpenAI, the maker of ChatGPT and the image generator DALL-E 2, has escaped lawsuits so far, in part because it’s less clear where the company got the content used to train its algorithms.)

Individual artists look at each other’s work and draw inspiration to create new pieces. Iterating and building on collective ideas is how art works, and even blatant ripoffs are often perfectly legal. The question is, are AI image generators just doing the same thing, or are they breaking the law?

There are good arguments in either direction. When you set your algorithm’s digital sights on a set of images, you could say you’re just using a computer program to recreate the processes of the human mind, creating new cultural content based on what it learned from old cultural content. On the other hand, these AI tools use artists’ work in a much more direct way. Through statistical analysis, other people’s art is quite literally built into the algorithm.

More to the point, a lot of artists have complained that AI art tools spit out images that appear to blatantly copy their styles. Getty Images can make a similar claim. Some AI art tools actually include the Getty Images watermark in the graphics they produce, making it plain how big of a role Getty’s intellectual property plays in the algorithms.

The issue gets at another question already working its way through the legal system. The Supreme Court is set to rule on a case against the Andy Warhol Foundation, alleging that some of the pop artist’s work, which often incorporated other people’s photographs, violates copyright law. The question there is essentially how, and to what extent, you can harness someone else’s intellectual property before you need permission. Judges may be forced to take on the role of art critics, adjudicating the similarity between two pieces of art.

Some of the players in the AI art lawsuits are already trying to address these ethical and legal concerns. Getty Images doesn’t allow AI-generated art on its platform, to avoid copyright issues. Contrast that with Getty’s competitor Shutterstock, which said it will license AI art but plans to compensate artists whose work contributed to the algorithms. DeviantArt lets its users choose whether their work is incorporated into its AI art generator, called DreamUp. (But because DreamUp harnesses Stability AI’s Stable Diffusion tool, which is built on images scraped without explicit permission from copyright holders, DeviantArt’s user consent tool hasn’t spared it from the complaint against the company.)

Update 1/17/2023, 4:35 p.m. ET: This story has been updated with a comment from Stability AI.
