If I see the false and popular statement that it's cutting up many parts of other people's work and pasting them together collage-like, combined as it always is with some sort of argument in favor of intellectual property, I ignore the rest, because I know they're clueless.
Number one, intellectual property is bogus in any context. There's nothing wrong with sampling on principle. Number two, generative AI does not sample, much less plagiarize, except when trained badly, and that is not something that happens. Generative AI creates novel work by capturing real meaning.
Fair enough. I think "harvest collate and absorb before spitting out variations" isn't such a clueless precis, though I agree "copy paste, collage" would have been a poor analogy. Where do you see the ref to intellectual property?
Thanks for trying with the alt. All it really is is describing the image as if to your friend on the phone who can't see it. So the basic action going on in the drawing, plus the actual text in full. Don't get in the "I'm trying a new thing and messing up" mode, just use your pre existing skills 🫡
The complicated part here is that people openly publish their art on the web, yet somehow an AI duplicating their style is considered worse than a person imitating their style. And feel free to pour the hate on me in your inimitable style, but maybe what we really need is the #norobots tag for web art?
Ugh... And my fear is when ChatGPT stuff can emulate passable, mediocre, soulless copies of writers... which struggling writers will end up being paid a pittance to clean up and make sound human.
Humans are derivative of their own reshuffled DNA. Babies become writers and painters, evolving to transcend what they are influenced by, to insert that creative spark, or eye of newt. By the process of natural selection, only the most creative works of the most unique artists propagate. Evolution.
Most of it is not great for now, but with rapid advancements in tech, I doubt it will stay that way for long. I suspect that future authors could very well be writing holodeck-style adventures. Cleverly writing instructions for the AI to give voice to characters and adventures that take many possible paths.
Without the holodeck tech, they may even do the same in written format, allowing people to request a story or novel with their favorite author's world and characters.
I'm not a scientist, but there's got to be a mathematically predictable point at which the AI trained on X amount of human work quickly auto-generates/publishes X-squared amount, & progressively scrapes & trains increasingly on only other AI-generated work, until it's only feedback, & decoheres, right?
Actually, the best way to mess up an AI is by feeding it training content that was made by AI as well. :D This will specifically amplify the issues and weaknesses of AI-generation. It's a feedback loop.
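A toy illustration of that feedback loop (my own hedged sketch, not how any real system is trained): fit a simple frequency model to a collection of "styles", generate a new collection from it, then train the next round only on what was generated. Rare styles drop out, and once gone they can never be sampled again, so each generation is a little narrower than the last.

```python
# Minimal sketch of the AI-trains-on-AI feedback loop described above.
# All names and numbers here are illustrative assumptions, not a real pipeline.
import numpy as np

rng = np.random.default_rng(0)

n_styles = 500  # distinct "styles" in the original human-made work
weights = 1.0 / np.arange(1, n_styles + 1)   # long-tailed popularity
true_probs = weights / weights.sum()

# Generation 0: a corpus drawn from the human-made distribution.
corpus = rng.choice(n_styles, size=2000, p=true_probs)

for generation in range(1, 11):
    # "Train": estimate style frequencies from the current corpus.
    counts = np.bincount(corpus, minlength=n_styles)
    probs = counts / counts.sum()
    # "Publish": sample new work from the fitted model, then use only
    # that generated work as the next round's training data.
    corpus = rng.choice(n_styles, size=2000, p=probs)
    survivors = np.count_nonzero(np.bincount(corpus, minlength=n_styles))
    print(f"generation {generation}: {survivors} of {n_styles} styles survive")
```

Any style whose estimated probability hits zero can never come back, so the survivor count only falls; that irreversibility is the amplified weakness the comment is pointing at.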
I feel like the fictional thing that captured the seeming omniscience AND idiot-blindness/non-persistence/seeming-but-not-consciousness-exactly of immense data-trained AI was the oracle from BATTLESTAR GALACTICA, but it's been a minute since I've seen it.
It's what is leading to what many call the "junkification" of the internet, where the majority of original content is buried under an endless sea of useless AI garbage.
Well, the idea is that given enough monkeys and enough time it will eventually happen simply by pure chance - but it will never be reliably reproducible.
AI "artists" do the same, they keep generating dozens and hundreds of images until one of them, by pure luck, comes out looking alright.
Real artists, on the other hand, may also do hundreds and thousands of drawings as part of the learning, but their skills will continuously and reliably improve and they can reproducibly create art of increasing quality.
Meanwhile, the AI will always be hit or miss, no matter how much you train it.
what I mean is that the whole scheme is blindly constructive, w/no gestalt awareness. It relies on the inherent gestalt of the source trained-on "fossil fuel" of human-made art, to (no pun intended) ape it by process. But that gestalt gets watered down iteratively as it feeds on AI output.
like, it poisons the well as it copies the well water to create *almost water.* then it blindly copies the *almost water* as "kinda-almost water." Multiply until the well is full of magma and liquid methane.
Somebody just "liked" this older post, bringing it to my attention and I feel like Arnold in TOTAL RECALL getting a message from his past self. I'm like, "Great point, previous-me! I wish I had said that! Oh, wait, I did."
Fun fact: without all the pen-boys (and other pen-people), AI's got nothing to work with. Putting artists out of work puts AI image generators out of work.
gotta say i tilted my head a bit at the creative choice to put the words of every crypto enthusiast on twitter into the mouth of a fat woman in particular but i agree with the message overall
We're approaching a crossroads where we need to figure out how people are given attribution and paid. However, virtually all art, going back thousands of years, has built upon previous artists' work. The difference now seems to be that it's "too easy", but then some dude sold a banana taped to a wall.
Who is the artist, or author when AI generates a facsimile of another creator's work? Is it the coder(s) who wrote the AI? The corporation that owns it?
Artists being inspired by and referencing the works of artists before them is absolutely, completely different than what AI training does. Please, stop making this comparison, because it's just not right.
A small but critical distinction: we aren't building upon previous work, we are in a conversation with other artists, living & dead. Sometimes the reference is a nod & a wink, sometimes affectionate, sometimes oppositional, tearing down the house. But we know what we're doing & why.
"Virtually all art going back...has built upon previous artists work..." clearly you don't go to enough art shows. Artists break new ground constantly. That's just your own short sightedness. We are inspired by intangibles, forms... even visual art inspired by music is different by each artist.
The banana duct-taped to the wall was audacious enough to get people around the world talking, and it only worked because it came from an established artist. Saying it’s comparable to AI because it’s “easy” misses both the context it was created in and what the actual criticisms of AI are.
Who inspired Neil? Who did he study, actively or passively to develop his own skills and unique style? Why have we asked nearly every musician in an interview "who inspired your style of music?" without demanding they pay royalties?
AI is spurring new scrutiny, but to me AI is just another tool.
Getting inspired by someone to do your own thing with your own hard-earned skills, and taking hundreds of artists' work and typing in a prompt or two to create a Frankenstein's monster to parade around as your own creation, could not be two more different concepts. A pencil is a tool, pick one up.
AI is just an advanced form of an instruction set. It doesn't take inspiration - an emotional response - nor does it apply any critical thought or emotional content to what it compiles.
Instead it just copies/steals from what it's been given as input, and approximates based on that instruction set.
So, arguably, those authors were paid for their work when their books were purchased by the readers they inspired. They are often given credit by those authors, thus directing readers to their work.
To my knowledge, this hasn't happened with AI training... the opposite, in fact.
But Neil has the talent to make a unique style. AI and its users are not talented, so they simply steal what other talented people create. As a result it's just a cheap, nasty copy with no merits whatsoever. You are like the NFT scammers trying to get rich off others.
I see AI as a means for people to put their inspiration into a medium where previously it was impossible for them to do so. GarageBand for example lets me make music without the knowledge of instruments.
This is separate to ensuring artists are paid for their materials that are used to train AI.
If you want to "make music" without knowing instruments, you can do so without theft of others materials by using tools that allow you to use what IS intrinsically yours (you voice) to sing a line and say "apply viola, or cello, ir electric guitar... use layers - create music! Those tools exist!
The problem here is that we don't know what machine learning is capable of yet. I don't see inspiration as something that can't be replicated by non-organic processes. Lots of people are dumping on AI because they think it can never be as good as us and will do bad work. Time will tell.
People are concerned about AI for many reasons. But don't conflate them, as their root fears are different, though equally valid. Stealing content. Losing jobs. Inaccuracy creating MORE false information. Skimming. There are many good reasons we should be regulating some uses of AI.
It's crazy to me that the tumbler-machine-generated novels for the proles from 1984 are the technology things Orwell most accurately in-the-yellow-of-the-bullseye-predicted.
When I was 16 & read it, that seemed the most "never could be in the real world" thing.
AI uses other people's art to generate poorly put together creations.
If real people hadn't created the art or the writing in the first place, AI wouldn't have anything to eat so that it could spew out its plagiarised regurgitations. It's the end of culture. The end of new art styles. It's the end!!
The important thing to understand is that there is no stopping AI. You can no more take AI away than you could take all the guns away from Americans. What we need to do is protect ourselves morally, ethically, and legally from the misuse of AI, which is inevitable as it is for any technology.
Another perspective about how AI harms artists: AI is cheap. AI does not have bills to pay, rent to cover, mouths to feed. Anyone can twaddle with AI for 10 minutes, make an image and sell 1000 copies for $5 each. This is a clear harm to artists who work for a year to make one image that may not sell.
The stock photography site Can Stock just shut down because it cannot afford to compete with the rise of AI imagery - and so thousands of photographers who sold image licenses on that site have one less income stream. AI kills creativity.
Virtually all industries are at risk of losing jobs to AI (mine included). That's a problem of capitalism, not AI. Same thing happened during the industrial revolution and will happen with AI or whatever the next thing is as long as we as a society value profit over people.
Yes. "Training" is one of those words that changes meaning depending on context. "Training a machine" is not the same as "training an artist." Or writer.
The former can learn, the latter write what they know or want to know. Kind of like how the talking mongoose should have been an entirely different movie due to lack of research.
Branding is the real success story of Open AI, Stability, et al. The use of anthropomorphic terms like "training" confuses the hell out of people, and we've even had to change our definition of "AI" (now referred to as gAI, or "general AI").
"AI is a tool" is one of those statements that is so often misused. Yes, this true, but it obscures just what that tool is being used for a lot of the time. The team of Across the Spiderverse using AI to quick scan for rough spots in their own work - legitimate.
Also... if we ever get to the point where a machine is truly capable of art, we should consider whether that machine should now be considered a person with its own rights... heck, by that time machines would likely have been past that line for a while.
The nuance of your writing comes from everything going back to your parents teaching you to speak, teachers providing basic grammar, higher education, editor's feedback, books you've read, etc. My difficulty with this AI/ML conversation is do you credit every one of these individuals in every book?
Hi, I have a linguistics and software background with some computational linguistics experience. Over the last 30 years natural language processing techniques have moved away from a structural/humanlike/semantic-syntactic approach and towards purely stochastic techniques. 1/?
So you should credit/pay your parents for having sex and creating you? Even before AI no one credited everyone they'd ever met. The core of this discussion is about theft. It's about tech bros stealing artists' unique work for ML and trying to profit from it without consent, credit or payment.
Because it references, it doesn't create. It only reflects back to us what we've already and always seen. What happens when the Ouroboros finishes eating itself? When all we receive are references to the self, there will no longer be a self to reference.
LLMs sample human art to then simulate it, but they will make increasingly inhuman art if they sample only LLM art.
See, the thing they can't simulate, and in fact have no capacity for, is judgment: ie, does this look like the thing I'm drawing? More to the point, does this look like shit?
To be fair, most humans rely on other humans to determine what looks good and even then it is in a limited scope. Millions of people love the Mona Lisa, millions more don't get the appeal. Some artists will tape a banana to a wall and sell it for hundreds of thousands of dollars, other people balk.
Humans can intuitively tell when a human face looks like a human face. An LLM has no idea how many legs a human has bc an LLM has no consciousness. They don't produce art by the same process that we do. They can create fairly convincing simulacra of what we create, but they do not possess judgment.
You misunderstand what inspired means. It's not absorbing images and copying their style. It involves more than that. It's about an artist's psyche feeling the images/words/music, etc. and having a visceral, emotional reaction to them. It's about the subconscious response and creating from that.
"AI" doesn't exist yet. the process you're describing is humans very intentionally putting other humans' work into a model to extrapolate similar output for personal/financial gain. this isn't artistic inspiration, and claiming it is is intentional obfuscation or ignorance of the process
Most artists are inspired in such a way to find their own voice, their own style. AI retains data that it's fed in order to regurgitate a cannibalised bastardisation of that data. It's not the same thing.
Many works are inspired by a feeling, event, & simply pour out of us without thought or contrived style. They are images, music & words that force their way out in their own style uncontrolled even by the hands/minds from which they escape. We search for patterns after the fact & name them "style".
Artists who came before us are inspiring. They honed their skills with dedication & passion, over years. We learn how to be better, more observant artists by understanding their processes. Art is a list of tiny choices to help create unique works, not copies. AI isn't a tool, it's a plagiarism app.
As someone who writes and who works with AI, this line of thinking is a profound misunderstanding of what large language models do. For example, it imagines they understand things.
AI is a gift to grifters and scammers - used by people who possess zero talent, can't be bothered to learn how to draw or write, yet want kudos/income for 'making art'. It's being used by people who want to make a quick buck by uploading Shite/GPT ebooks to the kindle store - so yes, lots of evil!
OK, I can't believe I'm having to say this but the difference is that a human is involved.
If I read something and I'm touched by it emotionally or impressed by the craft that went into it, it's possible that some of that will feed into later things I create because I've learned how to do it.
But that's still me, a human, doing it. Art exists in a context but it's a context of humans addressing humans, being inspired by humans, having rivalries with humans. People are communicating to people and understanding each other.
There is no understanding in AI. There is replication, variation, and approximation. There's a string of numbers that indicates that output with parameters xyz will be considered to be "in the style of abc". Nobody learned anything, nobody connected to anyone.
You're making the point against yourself here. What you're talking about is the difference between influence and sampling. We don't make people give credit for inspiration, but we do if a work is being sampled. To keep with your music analogy.
Copyright as a concept was invented by humans for the sake of humans. Humans decided that human creativity deserved certain rewards and protections. Plagiarism is also a human invention; can't just let anyone blindly copy an actual creator. Who's the human deserving protection in an AI piece of art?
The one who is more famous. One can be called Walt Disney and have his name immortalized thanks to Grimm, Andersen and others. One can remake The Shop Around the Corner as You've Got Mail and never credit where it came from.
I believe we're arguing the same point. It's @curiouslycory.com who is suggesting AI should be treated no different than human authors; I'm suggesting the humans should call bulls*** on that idea.
You don’t get to call it fair use if you use it to compete with the original creator. That’s a central plank of fair use.
And also, there are non-economic parts of copyright, too, moral rights that cannot be assigned, often not even waived. Attribution of authorship is a key part of that.
Because musicians that play other people’s music already pay royalties. Learning how to play in the style of another and then modifying it to build up yours is not what an ML algorithm does. Stop equating the two.
I’m curious what your thoughts are as to a workable solution for AI. They’re profoundly useful tools in some respects, but the ethical concerns are being addressed much more slowly than the technical obstacles.
The first stuff to go will be department store art like in IKEA. That will all be machine made within a decade. Artists will need to become more akin to conductors of an orchestra, guiding the development of expression. BTW this isn’t even AI, it’s advanced statistical modelling ultimately.
This reminds me of a book fair where I had a table. A teen boy came up to me, picked up one of my books and asked why I bothered to write it myself if I could rather just get AI to write it for me...
Hi Neil, I don't know how to say this without it sounding accusatory, when it's meant to be a gentle request, but it would be helpful to include the text in the alt text if you want people to read it.
Wish AI could duplicate the quiet contemplative feeling I get while drawing. Then maybe its users could shut up just long enough for some much needed self-reflection.
Comments
Photoshop also didn't replace artists. There will always be new and good ideas and this resource is endless.
Well, it sucks. But the term captures the process in a lovely way.
😆
AI "artists" do the same, they keep generating dozens and hundreds of images until one of them, by pure luck, comes out looking alright.
Meanwhile, the AI will always be hit or miss, no matter how much you train it.
.
AI is spurring new scrutiny, for to me AI is just another tool.
Instead in just copies/steals from what it's been given as input, and approximates based on that instruction set.
To my knowledge, this hasn't happened with AI training... the opposite, in fact.
This is separate to ensuring artists are paid for their materials that are used to train AI.
When I was 16 & read it, that seemed the most "never could be in the real world" thing.
If real people hadn't created the art or the writing in the first place, AI wouldn't have anything to eat so that it could spew out its plagiarised regurgitations. It's the end of culture. The end of new art styles. It's the end!!
Ripping off copyrighted work - not legitimate.
AI just takes, it isn’t in conversation with, commenting on or critiquing what it steals. Maybe in the future it will but I haven’t seen it yet.
I don't know the answer, but maybe all AI-generated content should have a signature that embeds credits for everyone involved, like a movie has?
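One very naive sketch of that embedded-credits idea, assuming the Pillow library and a PNG text chunk (every name below is a placeholder I've made up). A plain text chunk is trivially stripped or forged, and real proposals such as signed content credentials go much further; this only shows the shape of the suggestion.

```python
# Naive sketch of "embedded credits" for a generated image, assuming Pillow.
# The model name and artist names are hypothetical placeholders.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

image = Image.new("RGB", (64, 64), "white")   # stand-in for a generated image

credits = PngInfo()
credits.add_text("credits", "trained on works by: Artist A; Artist B; Artist C")
credits.add_text("generator", "hypothetical-model-v1")

image.save("generated.png", pnginfo=credits)

# Anyone (or any marketplace) could then read the roll of credits back out:
with Image.open("generated.png") as reloaded:
    print(reloaded.text["credits"])
```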
It does not really create, it just grabs and predicts what people *expect* thru statistics
Therefore it also has no concept of truth - expected is not always 'true'
A neural network, fed with enough data, will just make a decision tree for what usually is made, which is why there often is *blandness* to it
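To make the "predicts what people expect through statistics" point concrete, here is a deliberately tiny stand-in: a hand-counted word-frequency table, nothing like a real neural network, that only ever returns the most common continuation it has seen. The corpus is my own made-up example; the blandness and the absence of any notion of "true" fall straight out of always picking the expected next word.

```python
# Toy stand-in for "predict the expected continuation from statistics".
# The corpus and behavior are illustrative only; real models are vastly
# larger, but the predict-what-usually-follows core is the same idea.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat sat on the rug . "
    "the dog lay on the mat ."
).split()

# Count which word usually follows which word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def most_expected(start, length=6):
    """Always pick the single most common continuation seen in training."""
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:
            break
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

# Whatever you start with, you get the same most-statistically-likely path,
# never anything surprising: there is no notion of "true", only "frequent".
print(most_expected("the"))
print(most_expected("a"))   # a word never seen in training: nothing to predict
```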
Maybe it’s jumping at shadows, but apparently some publishers refuse to add any clauses about not submitting someone’s work to AI, so…
Thanks