The other is that I think some people are underestimating the problem of the uncanny valley. I think we bounce off things that don't feel real, even if we can't identify exactly *why* they don't feel real.
Comments
I’ve noticed this recently with adverts using synthetic voiceover. It does sound very natural and realistic now, but there’s just *something* missing in the delivery.
We don't like it, but another 2020s problem is that it's *really bloody hard* now to find the not-Uncanny version as written by a better journalist, so... we put up with the crap version. To some extent "tolerating Reach" and "tolerating Dead Internet Theory, LLM Edition" are *the same problem*
I agree! I’m currently drafting a blog post on the potential for AI in Transport. I think it will end up generating a premium on human interaction and human creativity, because lots of boring routine (and therefore unpleasant) stuff will be done by machines.
Interestingly, while writing the blog post, I said something along the lines of “If I told you this article was written by AI, you would be less keen to read it, wouldn’t you?” Is that rational? Dunno, but I think it is true.
That is not to say that I don’t think it will be important. I can create ideas and assemble my thoughts far more effectively with someone who is always willing to discuss with me and react.
“Your job won’t be taken by AI, it will be taken by someone using AI” is the mantra for many roles, I think.
I think it's entirely rational! If it's a way of quickly communicating facts, okay, sure, AI will probably be able to do that. But if it's also meant to have the slightest element of opinion... well, who cares about the opinion of a robot? It doesn't have one, it's just some plausible words
Sorry for the long contribution. I think that the value of AI is largely being judged by humanities graduates on the strength of LLMs’ ability to mimic what humanities graduates do. That is not all these tools do. I use Copilot in particular in my day-to-day job. What do I do?
I'm a data analyst who writes a lot of SQL, DAX and Python. Is it perfect? No. Does it save me hours every month? Yes. If you have the expertise to evaluate the output, it puts your productivity on steroids. Should you let it drive? Fuck no.
Have I manually formatted meeting notes and project specifications in recent memory? No, I'm not a masochist. Do I check the output? Of course, I'm not stupid. How often do I need to correct the output? Rarely. Writers will be fine. Musicians will be fine.
I think part of the hysteria right now is a function of the loss of trust in the tech industry. If this development had occurred right after the Arab Spring, when techno-optimism was the prevailing view, there would be less hand-wringing.
AI is neither the answer to all our prayers nor the great Satan. It's a tool which is very useful in some contexts and really bad in others. As a musician, I'm not worried. The hardest job an artist has is getting people to pay attention. Humans will always win that contest.
I don't mean to dismiss such a long and thoughtful contribution, but I think you've missed my point - I'm not saying AI can't be useful or won't change anything, I don't believe that. I just don't think it's going to undermine creative work.
Oh I completely agree. As a musician the hardest job I've ever had is to convince anyone to put a CD on and press play. So what if a computer is churning out endless pieces of music? I can't see anyone ever pressing play.
So I kind of think an AI generated version of my work would be both offputting and pointless*. And at the point someone thinks there's still money to be made by doing that, I'm ... probably still doing alright? I dunno.
*which would, to be fair, make it not unlike myself
I can see a kind of positive scenario where the internet gets so flooded with shitty ad-revenue-raising copy that you can’t find anything useful any more (already happening), so you’re less likely to use search engines to discover things and more likely to actively curate, which should be good for creators. Maybe?
Had a really, really interesting conversation with a very experienced graphic designer friend last month, partly because he has done graphic design for many decades and has therefore been through similar upheavals before with, say, computers coming in.
(Also because he's both good at his job and experienced in working out how to use new tools, he was very good at explaining how and why it did make a difference for him to use it and *crucially* how much actual work he was still doing to make things actually look good)
A really interesting example was a project where he needed a diverse range of Australian First Nations people, in correct dress, for a brochure, and it is extremely hard to find an adequate range of stock photos.
(And you are possibly violating a cultural taboo with your target audience by using stock photos because they do not know if the subject has recently died or not)
My friend was able to use AI to do this but only because he's worked in graphic design for years, knows a lot of indigenous people and knew exactly what he wanted and how to use the correct prompt. It also meant he could focus on other aspects of a tricky (and important) design project
This is not to say that AI isn't going to cause problems; I think it might really fuck stuff up. I just don't think the sort of stuff I do is one of them. If it continues poisoning the information environment, it might even lead to a resurgence in paid media by creating a trust premium?
It'll create an information divide: those who can afford to pay for quality publications/websites, and those who can't and have to rely on the unreliability of everything else. But there will be demand for people like you from those who can afford it.
I think it absolutely will do this. People will pay more, and more often, to avoid it entirely. There will be a "free" ocean of nightmarish nonsense online and then folks will have to pay to get away from that to some sense of information sanity.
I think it's an environmental and supply chain issue. Writers working currently (like you and me) have places to go to sell stuff, and there will still probably be places that want our stuff. But the low-paid, learn-your-craft jobs that bring up the next generation are going.
Yeah, I think most people want to read things written by other humans. We seek human connection via writing and other creative mediums. If that’s not there the experience just feels empty.
I think your take is a good one.
I actually think coders are more at risk from AI because code has a ‘right’ answer that super-autocomplete can get, whereas the creative professions don’t. Audiences for creativity take joy in insight, surprise, wit - stuff AI is terrible at.
Just for Steve (Jonn can ignore this as he's a bit tetchy today): it's a common pattern that people who are good at something know AI is crap in their field, but assume it's better at something they know less about. You're a writer & know it's crap at writing. I'm a programmer & know it's crap at programming too. 🫤
Ha! Jokes getting crossed in the wires, don’t worry.
I think your point is good though.
And a bit like the news, ironically. People think the news is really solid and accurate until it reports on their field of specialism and then they’re shouting at the telly!
I'm a software developer & I'm not convinced LLMs are good at coding either. Problem is that managers don't care: they're under pressure from CEOs to embrace AI & cut staff now. Nobody's thinking ahead to how they fix broken code nobody understands with no devs. Short-term "win" but long-term loss.
The bigger threat is combining AI with a world that has been persuaded that excellent stuff is usually cheap or free. So the moment we all start charging for our services, saying “this is what journalism, writing, music, film, art actually costs” people won’t pay.
It’s been a drum I’ve been banging for a while that we are in fact approaching that, if we’re not there already. People piss the bed at being expected to *have ads visible* in exchange for reading something.
I never minded adverts on news articles until they moved from sidebars to taking up 75% of my phone screen when I'm trying to read the article, plus adverts that pop up dialogue menus trying to install malware, because adverts aren't vetted.
I get a genuine kick out of the moments when my ad blocker updates and asks if I'd like to support its free product, and I get to click "no", because if it's going to undermine revenues for MY industry, then why should it get any money EITHER?
I think the bigger risk is that mass use of automation for the low grade version of a thing reduces demand for the manual version of that thing such that the training pool of people learning that skill dries up.
AI in a world where it is properly costed, according to data usage, copyright scraping, environmental damage, so that an AI film costs the same as human-made art? That might be a start…
I do think that's a problem; I just don't think the problem there is AI, it's that we've all been trained to expect media to be cheap or free. "Well, you can have cheap or free, but it will suck!" is not necessarily a bad thing for some creative professions.
The issue isn't necessarily for those with established careers. AI will take up the entry-level jobs with higher levels of admin involved. 'Paying your dues' will be less of a career entry point and social mobility will ossify further. The impact will fall asymmetrically on those with less capital.
I think this is very unevenly distributed. It's not a problem for people creating high quality work whose consumers/bosses recognise and want the quality. It's a big problem for people who create quality but where the consumer/boss is happy with dross.
I think that is true; the death of ad-supported, text-based media means that I'm paying for Substacks/Patreons that 10 years ago I'd never have considered spending money on.
I was just reading a substack where I began to get the impression that I was reading an AI generated article under a human byline. Needless to say, I quickly stopped.
Overall you’re correct, but a percentage of people who read can’t tell the difference between what’s good and what’s not, AND a percentage of writing livelihoods, especially entry level, are in producing filler/bad writing. So it’ll have the effect of reducing the number of writers even further & making the field even more aristocratic.
But yes, I now read so little (despite reading more than most) that I make a point of checking who has written what before I bother: I don’t read most of the New Statesman anyway, & replacing the staff writers with chatbots would make me EVEN less likely to read. End
PS the real impact of AI beyond specialist workplaces is going to be on entertainment/culture/media, & it’s going to be the rise of new forms of entertainment, mostly voice-based rather than text-based.
Like, I've been very conscious of this for quite a while now.
It helps that I've always specialised in the long-form interview, and long ago worked out that the reason I like this format is that it gets to the grit of life, the texture, the underlying rhythms of conversation, the implications that only become apparent through repetition, the pace, etc etc etc - all things that can't be divined by summation, but only emerge from the conversation as a whole.
I think that, at some point, it will be much harder to prove facts and truths, because so much of the text will be AI generated.
And they're very good ones!
*which would, to be fair, make it not unlike myself
OR
you’re not pointless.
(Not both, obvs…)