Originality on Trial: AI’s Challenge to Creative Ownership
Do you feel outraged over AI stealing artists’ work? Or could this be a tragic misunderstanding?
How would you feel if someone stole something really valuable to you and then, even worse, sold it on, pretending it was theirs? Profiting from their crime.
I would be pretty outraged! I imagine you would be, too.
Some creatives and artists are feeling very angry about AI right now because they feel it’s stealing their work to generate the text and images it creates.
But what if these artists and creatives are making a terrible mistake?
Do you understand how AI works? Do they?
Have you ever considered what it really means to be original? To own something? Or to have creative ownership of something?
Having given this some thought, I’m now not so sure that being ‘original’ or having ‘creative ownership’ is quite so simple once you look at these ideas more carefully.
And I also believe that when you look at these claims a bit more closely…
You might find that some creatives are making an absolutely tragic mistake here in their understanding of these concepts and their own originality…
Sarah Silverman & Others Challenge AI
In July, comedian Sarah Silverman and other artists sued Meta and OpenAI over claims of copyright infringement by their AI models Llama 2 & ChatGPT.
Key Foundations of the case were:
That the AI models themselves are an infringement of copyrighted material
That every output of these models (e.g. text, images) therefore constitutes copyright infringement
Last week, this case hit a potentially fatal stumbling block when US federal judge Vince Chhabria dismissed most of this lawsuit.
On the artists’ claim that Meta’s Llama AI model is itself an infringement of copyrighted material, the judge said:
This is nonsensical… There is no way to understand the LLaMA models themselves as a recasting or adaptation of any of the plaintiffs’ books
On the artists’ claim that every output of these models was also copyright infringement, the judge said:
These claims are dismissed because the artists didn’t offer evidence that any of the outputs could be understood as recasting, transforming, or adapting the plaintiffs’ books or copyrighted materials
This ruling builds upon judgements in other similar cases, for example, other artists suing Stability AI, Midjourney, and other AI image-generation companies over similar claims of copyright infringement.
So what this developing case law is saying is that, to claim copyright infringement, you need to present evidence that the content produced by these AI models substantially reproduces or adapts your copyrighted material.
As an AI Consultant, I can tell you this is likely to prove very difficult, if not impossible, to demonstrate in court, and so is a huge blow for anyone seeking to make these claims.
I feel this is the perfect time for us to take a closer look at some of the beliefs and basic assumptions underlying all this…
So what are Originality and Ownership anyway?
Originality
So the Cambridge Dictionary defines originality as:
the quality of being special and interesting and not the same as anything or anyone else
So, are you done? Maybe not quite.
How do you decide if something is ‘special and interesting’ or ‘not the same as anything or anyone else’? Is there something you can use to measure this precisely?
And consider that anything you create is inevitably going to be influenced by the creativity of others you have been exposed to your whole life, through books, paintings, movies, TV shows, etc.
So the idea that you could be truly original in anything you create is, I hope you might agree, not as clear-cut as you might have first thought.
There has been ongoing debate throughout history about whether originality even exists at all in any meaningful way.
Mark Twain said about originality:
For substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources, and daily used by the garnerer with a pride and satisfaction born of the superstition that he originated them
Penelope Alfrey in her MIT article ‘Petrarch’s Apes: Originality, Plagiarism and Copyright Principles within Visual Culture’ said:
The problem is that plagiarism permeates everyday life. It is not only accepted, it is encouraged and integral to the creative life. You depend on it, for you learn through copying others and you use it to reinforce social bonds. Thus it is the basis of any art or design training. Yet, paradoxically, notions of originality are also entrenched in the creative process, as an ideal, no matter how unrealistic
This leads us to consider next, what ownership means…
Ownership
So, back to our Cambridge dictionary, which says about Ownership:
the fact that you own something
Seems simple, right? But again, when you think a bit more about this, doesn’t this seem a bit self-referential?
Penelope Alfrey again said this:
The myth of originality in art and design has considerable commercial value as a selling ploy, but the reality is that copying sustains the economy of commerce; without it, less would be produced, manufactured and consumed, and fewer works of art would be exhibited
So Alfrey highlights the close relationship between originality and ownership in the context of the modern capitalist system we live in.
She observes how the desire to make money is a powerful driver of our obsession with originality and, consequently, ownership.
And is ownership even a real thing? The historian Yuval Noah Harari, for one, might disagree.
In his book Sapiens, he highlights how many things humans believe are real are actually not real at all, but entirely human-created fictional stories. He said:
The truly unique trait of ‘Sapiens’ is our ability to create and believe fiction. All other animals use their communication system to describe reality, we use our communication system to create new realities. Of course, not all fiction is shared by all humans, but at least one has become universal in our world, and this is money.
I believe ownership, which is closely related to money, is also an entirely fictional human construct, with no objective reality beyond the humans who believe in it.
If you own a house, do you believe that ownership is an objectively real thing?
If that were true, then if all humans suddenly disappeared, would the rest of the universe know, or even care, that you owned that house?
My hope is that you might now be a bit less certain that originality and ownership are the obvious, objectively real, and simple things you might have initially thought they were…
Does AI steal? Reviewing the Science & the Law
Let’s first understand how AI works…
Checking the Science of AI
The main technology behind modern AI is called a Large Language Model or LLM. So what is this?
In a nutshell, a Language Model is something that can predict the next word in a sequence of words.
You can read more about Large Language Models in this Google introduction.
So, for example, if we have the words:
‘My cat is’
A language model would look at this, and add the next word, for example:
‘My cat is always’
You could then repeat this process, given this as input to the language model, generating an output with the next word.
You can keep going like this as long as you like. For example, to generate an entire sentence such as:
‘My cat is always curious, exploring every corner of the house with wide, bright eyes.’
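If you’re curious what that repeated predict-and-append loop looks like in practice, here is a minimal Python sketch. It assumes the small, openly available GPT-2 model and the Hugging Face transformers library purely as an illustration; GPT-2 predicts sub-word tokens rather than whole words, and real chatbots usually sample rather than always taking the single most likely token, but the loop is the same idea.

```python
# A rough sketch of repeated next-token prediction (greedy decoding),
# using the open GPT-2 model via Hugging Face transformers as an example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# Start from a prompt and repeatedly append the most likely next token.
input_ids = tokenizer("My cat is", return_tensors="pt").input_ids
with torch.no_grad():
    for _ in range(15):
        logits = model(input_ids).logits             # a score for every possible next token
        next_id = logits[:, -1, :].argmax(dim=-1)    # pick the single most likely one
        input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=-1)

print(tokenizer.decode(input_ids[0]))                # the prompt plus the generated continuation
```

Each pass through the loop is exactly the step described above: look at the words so far, score every candidate for the next piece, append one, and go again.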
How can it do this? Language models are trained on billions of texts: web pages, articles, books, and so on.
What they learn from these examples is how words are put together: which words are most likely to follow other words.
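To make ‘learning which words are most likely to follow other words’ concrete, here is a deliberately crude toy in Python that simply counts word pairs in a tiny made-up corpus. This is only an illustration of the kind of pattern being learned; as the next paragraph explains, a real language model does not keep a lookup table like this, it encodes such patterns in the weights of a neural network.

```python
# A toy illustration of learning "which words tend to follow which" by counting
# word pairs in a tiny corpus. Real LLMs learn far richer patterns, over long
# contexts, stored as neural-network weights rather than a table of counts.
from collections import Counter, defaultdict

corpus = [
    "my cat is always curious",
    "my cat is always hungry",
    "my dog is always sleepy",
]

follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current_word, next_word in zip(words, words[1:]):
        follows[current_word][next_word] += 1   # how often next_word appears after current_word

def predict_next(word: str) -> str:
    """Return the word seen most often after `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(predict_next("cat"))  # -> "is"
print(predict_next("is"))   # -> "always"
```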
A language model is not like a database or a Google Drive; a language model does not copy and paste text and memorize it.
That is not how they work at all.
AI and language models are built from artificial neural networks, which have more in common with your own brain than with a Google Drive or a database.
Fundamentally, ChatGPT and other AI models like it are just language models: systems that understand how language works and can generate plausible text.
So now that you have checked the science of AI, you are in a good position to go back and revisit the law and the claims of the artists.
Checking the Law on AI
On Sarah Silverman’s and the other artists’ claim that Meta’s Llama AI model is itself an infringement of copyrighted material, the judge said:
This is nonsensical… There is no way to understand the LLaMA models themselves as a recasting or adaptation of any of the plaintiffs’ books
Do you feel the Judge was right? Well, he was, both legally and in terms of the science.
As you now know, AI does not memorise or store the text it sees; it simply uses these texts to learn the relationships between words, to understand language so it can generate plausible text.
So this claim by Sarah Silverman and the other artists was completely nonsensical, both scientifically and legally.
On the claim that every output of these models was also copyright infringement, the judge said:
These claims are dismissed because the artists didn’t offer evidence that any of the outputs could be understood as recasting, transforming, or adapting the plaintiffs’ books or copyrighted materials
Do you believe that our legal system should be one based on evidence? I hope you do.
Can you imagine a legal system that wasn’t based on evidence?
Well, you don’t need to imagine; you just need to remember history.
Remember, for example, the burning of witches, or the Spanish Inquisition: justice systems based on what people simply felt or believed was right, which sentenced innocent people to the most horrible deaths.
So I would hope you would agree that the judge’s conclusion here, that evidence is required to support a claim, was also completely correct.
So what is really going on here?
Summing up: AI, Originality & Ownership
Some artists and workers in various fields are very worried about the impact of AI on their work and jobs.
I can completely understand the fear and even anger people might have about losing their jobs to AI, and I have written about this.
But when you stay very fearful or angry, do you make the best decisions?
Is that the best place for you to understand a complex situation or for you to think carefully & rationally?
I think you know the answer is no.
I believe this also explains why Sarah Silverman and other artists are making these legal claims with little foundation or evidence.
These claims seem motivated more by fear and anger than by careful thought, understanding, reason, or evidence.
And I feel this is absolutely tragic.
So is AI stealing the work of artists? I would say no, not any more than every artist has stolen and copied the work of other artists throughout history.
In a recent article, I discovered that many artists and creatives actually seem to understand AI, use it, and even feel quite optimistic about it.
I also feel that, unfortunately, some artists and creatives seem to have not thought very deeply about the complexity, nuances, and interplay between culture, creativity, individualism, capitalism, originality, AI, and ownership, especially in Western cultures.
I hope you might agree that Western individualistic and capitalist values shape the perception of creativity as a commodity in our culture, which is different to the way other cultures see these things.
Tristan Wolff recently wrote an insightful article about this topic called ‘Artificial Intelligence & The Misconception Of Creativity’, in which he said:
AI threatens creative egos. People who identify so strongly with their supposed creativity that they feel personally threatened by a machine intelligence that appears to deliver creative results. In this sense, the threat is an illusion. It is an ego trip.
I feel he makes an interesting observation. Do you?
I hope you might see, as I do, that with AI we are at an inflexion point in human history.
Generative AI is bringing new focus to longstanding issues about identity, purpose, originality, and ownership.
The worst-case scenario involves fear, ignorance and resistance to AI’s impact, continued obsession with ourselves, ownership, profit, and illusions about our own separateness and unique originality.
The best-case scenario offers a chance to value our own creativity alongside the creativity of others, human and AI alike, and to learn more about AI: to understand how it works and how it can help us.
This could be an opportunity for us to be more focussed on actually being creative, and acknowledging more honestly the influence the creativity of others has on the creativity of all.
I believe whatever happens with AI, as long as there are humans, there will always be a need for humans to create for other humans, and for humans to speak to humans through art and creativity.
As one creative professional I came across recently said about AI:
I think that AI is an unstoppable tech force that needs to be embraced by the creative industry in the same way previous tech advances have been. I also believe that the value of genuine human innovation and creativity will be enhanced by the rise of AI as the world gets overloaded with AI content. Be more human.
My hope is that more artists and creatives, and you, will understand this too.
So I encourage you to be more human, create more, understand AI better, and have more faith in the value of the contributions you can make as a human in the age of AI.
But what’s your perspective? Do you agree? Do you feel artists are right to be angry? Or do you have a very different perspective?
I’d love to know what you think, whatever that is. Let me know in the comments, and let’s continue this important discussion about AI, originality, and creative ownership.