Yes, AI art is ugly, but that isn't the problem with AI art. AI art will get better, and it will still be unacceptable.
AI art is built on stolen art. AI music is built on stolen music. AI videos are built on stolen videos. AI text is built on stolen text.
Across all mediums, fuck AI.
Comments
For everything we make, we have to ask ourselves, 'If I bring this into the world, how can the rich exploit it against us?'
Because pointing out the theft at its heart isn't even about its uses; it's about its source.
AI 'genning' is by nature a weapon against the working class.
i'm not a lawyer, shit's complicated, i hate it, ¯\_(ツ)_/¯
They also tried, and failed, to get permission to ignore all copyrights on texts because "they need it for their AI to work" in the UK, iirc.
AFAIK, what AI does not have (will it ever?) is the personal component that allows a feeling from one sense to be expressed and communicated through a different sense.
And even then, if your ML program is trained on stolen data, fuck it all the same.
AI does not exist without stolen data. The technology itself is unethical.
https://bsky.app/profile/drisaac.bsky.social/post/3lazb3d4xlc2j
Or at least in my own circles, it just fills the role that a lazy five minutes of Google image search would have filled in the past.
We need *both* 1) regulation against stealing data and 2) reliable methods to procure evidence that stolen data has been used.
Sometimes humans add nothing. "Sometimes humans steal manually, so automated theft at mass scale is okay" is wild.
I've been a pro photographer, eaten and paid rent off money from sold pictures, and I don't really see how things like Magic Eraser stole my work, even though I know it's in there via the Getty license.
Until LLMs are demonstrated to be categorically equivalent to the human creative process, there is no reason to believe that to be the case.
(but "AI" is a loaded term these days, so it's hard to know if what you're talking about was also built upon stolen data)
Even if people consent to the use of their creations, it's a complicated topic. It's just unfortunate that most AI is used and trained to steal from creatives, mostly art, voice actors, and music, from what I've seen.
Investor confidence in "AI" is tapering off
https://www.google.com/amp/s/www.marketwatch.com/amp/story/adobes-stock-drop-highlights-this-fundamental-disconnect-over-ai-b94128be
AI is awful as are the people who push it for everything.
But shit that needs a detailed eye, like cleaning jobs, creative work, or graphic design? No... absolutely not.
Why pay an artist to make something that could take weeks when you could get it now for free?
Pre-vis? Proof Of Concept? Those are okay. "Here's The Finished Product, Copyright Me?" OPPOSITE of okay.
It's a fun toy and a useful tool, but don't buy the "Wave Of The Future" BS.
AI art generators are based on publicly available art assets posted on the Internet. Am I supposed to cut an artist a royalty check now, every time I so much...
Public Diffusion is the first large-scale attempt that is currently trained only on Public Domain images.
This addresses the "built on stolen art" issue and is a step towards more ethical AI across all types of media, showing that it's possible.
https://x.com/JordanCMeyer/status/1866222906402886121
Took my kids to see FLOW yesterday. AI will never be able to do something like that.
So far, most of what makes code AI helpful is that it just runs the code and sees where things go wrong?
With all the other stuff, it just steals and gets away with it. They don't care.
Still, AI SUCKS so much.