I recently ran into an error and wasn’t sure how to configure the server to fix it. When I googled it, the top result was an AI summary of my problem that presented two options. It had flipped the solution and the mistake, and it “helped” me solve my error only because I did the opposite of what it said.
It’s a problem in the working world. Lots of “self-taught” coders who just pull AI code are interviewing, and unless an ACTUAL coder interviews/tests them, they get in and then put things at risk, because they’re ignorant of how their code works or even of the fundamental solution it represents.
The major problem everybody's missing is that AI is a terrible coder. The proper use is as help when you're stuck. Anything it codes needs to be proven.
This is just the next extension of how "kids these days" just use mobile apps and have no idea what a file system is or "how to use a mouse" or whatever it was a couple years ago.
The ones who have a genuine interest in how things work under the hood are the ones we want doing the jobs anyway.
I actually think there is some good in letting AI (or ML, as is probably more the case) do some of the work of programming... but they really should know how the programs work or they're going to get into deep doodoo eventually.
This is a valid use case but I truly worry about the future of the profession. Every senior engineer was a junior at one point, and it’s the hands-on problem solving that got us here. But it feels like corporations are trying to phase out the early career folks.
I'm an old coder (17 years professionally) and this is exactly why I've been reluctant to embrace AI for coding. It has been a boon for rapid prototyping or going through a bunch of code and producing documentation but I still have to manually check stuff because AI is still imperfect!
Don't forget the classic Silicon Valley rug pull. They'll sell the LLMs cheap to get you hooked. Once you're hopelessly dependent, they'll jack up the price until you're paying half your salary to them to maintain access.
“It just floats, I don’t know how though. But it works, doesn’t it? Now, we need to get going faster, so I’ve got to go take this lit match into the hull, according to the AI Engineer.”
When I worked in IT (15 years ago!) we called these kinds of coders "code monkeys". They could do the few tricks they had learned, but had no clue how to do the THINKING part where you figure out how to translate 'what it should do' into IF/THEN/ELSE structures (etc.). New grads were the worst.
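To make that concrete with a made-up example: a rule like "free shipping on orders over $50, unless the item is oversized" has to become branches before it is code at all. A sketch in Rust, every name invented:

    // Hypothetical business rule translated into IF/THEN/ELSE:
    // oversized items always pay a surcharge; otherwise orders
    // over $50 ship free, and everything else pays a flat rate.
    fn shipping_cost(order_total: f64, oversized: bool) -> f64 {
        if oversized {
            25.0
        } else if order_total > 50.0 {
            0.0
        } else {
            5.0
        }
    }

The code itself is trivial; working out which branches the requirement implies is exactly the thinking part that couldn't be faked.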
Old enough to remember when a generation of COBOL programmers were coaxed out of retirement to work as contractors on Y2K solutions. They all made a small fortune.
I sense a rehash of this when it comes time to fix some of this shit in a decade or 2.
I’m figuring sooner or later Elon’s gonna get around to firing &/or pissing off the one guy who knows how to work with that one piece of the govt’s systems that literally everything rests on and that’s so old the manuals have been out of print since the last time Elon could pass a surprise piss test
Also, that's basically my retirement plan. Because it's the only way I'll be able to afford to retire, when they ask me to come and fix all the JavaScript.
For writing tasks, there's a very real danger that younger folks rely on AI for their research. The problem with that is AI cannot discern truth from fiction; it just goes by a preponderance of training data. If most of its training data says the world is flat, it will tell you the world is flat.
Anecdotally, a decade ago a guy I knew in college told us all that they lost internet in the office and nobody could finish anything without Stack Overflow.
How much of this is HR complaining people can't do their weird take-home coding puzzles?
I'm old enough to remember 1,000 page "Windows SDK Bible" type books, as well as the several-foot-long mini/mainframe documents on the racks by the line printers.... if only we'd had Stack Overflow back then!
An MS or Ph.D. in CS will be more meaningful soon. Those with a BS (hah!) will be code monkeys that do the grunt work while those with higher degrees will understand coding. This is a lot like some other disciplines, particularly in the sciences. I offer no value judgment here, just a prediction.
Twenty years from now we'll be creating things we don't understand and cannot fix or recreate on our own. It'll essentially be magic to those using it.
Generative AI definitely appeals most to the get-rich-quick cohort. What they fail to realize is that if they can do it... so can anyone else, thus rendering the work valueless.
Exactly. Why would I ever hire anyone to code, write, or create art using AI? If I want that type of product, I can type my own prompts into the engine. I could use the time it would take to review work submitted to me to instead just generate the work myself.
And that is reflected in the final product. Everything written by AI seems like it was written by a committee of a thousand people - because it was. It's the blind averaging of millions of works. So of course the results have no voice or perspective to them. It's all bland corporate-style crap.
It's not just kids. I'm moonlighting clearing up after a "senior architect" who was asking ChatGPT how to do stuff in the Amazon cloud and just plugging the results into AWS, apparently oblivious to the fact that some of this stuff costs money, in some cases a great deal of it.
As someone who has to deal with this every day:
IT IS NOT FINE.
AND IT HAS NOT BEEN FINE FOR MANY YEARS.
Seriously. The average 'leetcode' programmer has absolutely no concept at all of how computers and networks actually work. They just know what they got told to memorize.
IT and software companies are in for a rude awakening in a few years, when nobody will know what a certain program (from their own labs) does or how it does it.
A few more years and software engineers will be completely replaced by AI.
We’ll have to learn fast how to live in new “analyst” roles.
Any good sci-fi novels, short stories, etc. written about this kind of, er, convergence over the past couple years? I'd love to read what a clever extrapolation of this trend into the future would look like.
To be fair... go back 5+ years and most programmers googled a hell of a lot of stuff. Stack Exchange was their lifeline.
AI has undoubtedly amped this up to new levels, of course.
Wonder if universities will drop handwriting code in exams now?
Yeah, it’s getting really, really bad. We often HAVE to use AI because some evangelist up the chain demands it, but talk of hallucinations, biases, and the complexity of human vs. artificial analysis is pooh-poohed.
I guess you haven't asked a teenager ANYTHING in the last 20 years. Since the video game brain-rot... They've been pretty clueless about real world situations.
No, it isn't.
While we use AI (not ChatGPT) as a productivity enhancer, we treat anything AI suggests as if it comes from an intern who hasn't got a clue about our problem domain. You'd be amazed by the weirdness it comes up with. This will only lead to tears.
At least for basic programming courses, this is where requiring them to diagram out programs, on paper, as a major part of an exam really helps. At least for simple programs, if you can't draw a simple diagram illustrating the logic of your program, you really have no idea what you're doing.
So many programming tools are designed to enable crappy programmers to write code they do not understand. AI is just the most recent in a long list. (This is not new; e.g., the computerization of cars enables new features, but it also covers up mistakes by crappy mechanics who do not understand cars.)
Why learn math when we have calculators? Why learn to read and write when AI can do those for me? My mind is never allowed to be bored, to imagine, to create, to innovate, nor to regulate my emotions. Adults have stolen/thrown away whole generations of childhood development in favor of convenience.
Nah, we'll all be dead in what I would call a "Stupid Terminator" scenario. Skynet doesn't gain sentience and decide we all need to die. It just hallucinates and decides that launching all the nukes is what it's being prompted to do.
I love your optimism that things will actually be *fixable*.
I've seen the shit it puts out. The only 'fixing' will be 'rip everything out and start over from zero.'
I learned not only to drive but the actual mechanics of all my machines. I drove a ’72 VW bus in the late 80s because I could fix it. Cheaply, regularly… it was a relationship. Shouldn’t “electrical engineer” be sexy by now?
Zen and the Art of Motorcycle Maintenance should get more love.
I do love that one of the arguments in the article is, how dare they use AI to learn how to program instead of using Stack Overflow! 10 years ago Stack Overflow was the devil.
Instead, we get programmers who have absolutely no clue what they're doing, just that "it works." And I mean absolutely no fucking clue whatsoever.
"JUST USE A MEMORY SAFE LANGUAGE." Okay and how does this interact with DMA?
"You don't need to!" Uh, yes. That's literally how it works.
i spend time (as a nearly 40-year veteran) in a C++ discussion group where we get a lot of new learners and a lot of them are much more clueless and _less willing to accept clue_ than i remember being the case "back in my day"
but maybe this is me being another old woman yelling at clouds
Not just developers. Last year, I got a contract to replace a sysadmin who proudly explained to me how he solved a network problem by replacing a switch with a different brand of switch, as instructed to by ChatGPT.
The actual problem wasn't actually resolved, mind you. But he was proud of himself.
Has AI even been out for a year yet? And what is the sampling data? Did they actually check real programmers, or were they just kids online who claim to want to program?
Former college CSE instructor here. Current undergrad students majoring in computer science have no idea how computers work either. They cannot explain the difference between primary and secondary memory. They can't tell you the difference between machine language and assembly language.
Honestly, as a software engineer I think this is an overblown lie. The fact of the matter is the statistics are not actually smart enough to code. So those engineers do know what the code is actually doing if they are shipping anything.
Someone is probably just asking them irrelevant questions.
It was sobering enough to discover that my talented Zoomer coworker could barely type. She - who graduated early - typed all her papers on an iPad. She was wicked fast with her thumbs, though....
Older programmers just had to use stack overflow and still had no idea how things worked.
There’s a small lesbian collective somewhere in Omaha who actually codes and has done for 50+ years, everyone else just steals their work via obscure questions or wrong answers on stack overflow.
On a more serious note though… shit is all gonna collapse in 10-15 years because big corps are replacing junior engineers with AI, which means that when the current batch of senior engineers quits there are no more.
Or people who understand the databases, because SQL bots were used for queries.
I use Copilot at work and I just don't understand it when people write about this stuff as if it would produce good code immediately.
Either I am too stupid to use AI correctly or these articles are just full of shit
Man this makes me sad. I'm not a coder at all, but I've tried my hand at reverse engineering a few code-adjacent things in the past and I got a huge amount of satisfaction when I actually made sense of how things worked so I could make more adjustments.... kids these days, man.
When I read the Kurzweil book back in the day, I didn't expect the singularity to be an incompetent incel, since that's the dataset it was trained on... 😭
I’m skeptical. He reminds me of some of my old professors from the 90s telling me how computers were “ruining” foundational skills needed in the work world. So far, most employers don’t pay you to know a lot of those skills, and don’t know how to adequately compensate you for the foundational knowledge you do have.
I recently gave up on my Java course because I decided I'm too stupid to learn to program now. I should have considered just asking AI to do my work for me. Absolutely nothing negative could possibly come from that.
Always ask the AI: “why did you solve it that way?”. Consider asking for alternative implementations. Consider asking for clarification. Challenge the AI with counterexamples. STAY CURIOUS. You’ll know when you commit if you’ve offloaded your brain, or if you’ve learned something new.
Sort of reminds me of the old Asimov short story, "The Feeling of Power" about a future time when humans no longer understand how to do basic arithmetic.
It's slowly making its way through schools right now. I teach high school physics (11th grade), and the number of times I see incorrect math on simple calculations (meaning order-of-operations issues, e.g., evaluating 3 + 4 × 2 as 14 instead of 11) is starting to drive me crazy.
Anybody teaching AI about requirements analysis or software testing? No worries, the AI will likely fake the results anyway. This is not how we create a productive society!
I spent the past few years worrying that I was gonna be rendered irrelevant, but I'm starting to think my mid-40s basic HTML/PHP/CSS ass is still gonna win in the end.
We are the dumbest society. We could have utopia, with computers doing our menial tasks, and instead I ended up on the timeline where computers are being trained to do the things people should be doing for enjoyment.
Perfectly phrased. This is exactly it. AI is great for polishing up work email responses, resume keywords, assisting with awkward situation email/text replies, keywords for social media marketing campaigns & pattern recognition related tasks. NOT the work of human creative & analytic brains.
I asked a coder recently why they were hard-coding individuals' email addresses in the alerting program send line rather than using a distribution group address whose content could be changed without requiring code change approval each time a recipient was added/removed...
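The fix they were avoiding is one level of indirection. A sketch, with hypothetical addresses:

    fn main() {
        // Brittle: every personnel change means a code change plus re-approval.
        let hard_coded = vec!["alice@example.com", "bob@example.com"];

        // Better: one stable distribution-group address; membership changes
        // happen in the directory, with no code change at all.
        let alias = vec!["oncall-alerts@example.com"];

        println!("{:?} vs {:?}", hard_coded, alias);
    }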
People always berate the hiring process at software companies, especially devs themselves. This sort of thing is exactly why I test people. A lot of the skill of interviewing is now spotting when they are googling/using AI tools.
I honestly don't really understand using AI for this stuff. I'm not a particularly good coder, but writing code that works is so satisfying; why would I outsource it to AI?
it's really painful to see the testing setups in place at my current company, with no dedicated QA roles/expertise, compared to my previous one that had a full-fledged team --'
My daughter is a college junior, studying programming and data science. She had a really difficult homework assignment in a seminar last week, where pretty much every student just used AI to solve the problem. She refused, even though she was up all night trying to figure it out.
It's insane. Why would you learn to just be an agent of a machine? If all you do is parrot an LLM's output, why should any company hire you as a programmer? If they just need someone to type into ChatGPT, "write a program to do X," they can hire some high schooler at minimum wage. 1/
And even if humans are needed, why do people so easily forget the Silicon Valley business model - the rug pull? Their whole model is to first offer a service at a ridiculously low price. Then get everyone dependent. Then jack the price up. 2/
If AI actually does work, coders reliant on AI will eventually find themselves paying half their salaries to OpenAI to keep access to their tools. If you're dependent on these big companies, they have you over a barrel and can charge whatever they want. 3/
This work will pay off in the future. If you use AI for college problems, you will never solve new problems. And surprise surprise, real-world problems' solutions aren't on Stack Overflow or GitHub already, so they aren't in any AI.
I'm curious to hear how the other students did. She told me she saw how some were using AI, and she thought they were posing the question incorrectly. It will be very sweet if she got a better grade than they did!
She was afraid she'd gotten a bad grade on the assignment, but found out today she got a 90. I am so proud of her for resisting. And I'd be just as proud if she got the 60 she expected to get. But it scares me how these students are not learning how to think and solve problems on their own
I think people are dividing themselves into different classes — those who can think for themselves, and those who cannot. Thinking for yourself is a skill that must be learned and practiced. Students who bypass learning this skill will be at a disadvantage.
I feel bad for the students who haven’t been told this, or who haven’t taken the lesson to heart. They are working towards careers in which they are unprepared to succeed.
As a teacher in the IT faculty at my university, I've noticed the average marks drop by about a grade since ChatGPT became widely accessible. Students just shove the tasks into it and uncritically accept what it spits out, even when it's incomplete or wrong.
I'm gonna take a slightly contrarian take, here, as an old nerd who's been writing code for 25+ years: we're tool using animals. Most of us can't explain how our cars, dishwashers, or vacuum cleaners work, but we use them to good effect. Heinlein thought everyone should be able to reason ... (cont.)
... everything out from first principles, but the further up the tech tree we climb, progress will depend on not loading our limited brains with solved problems. (that said, yes, the danger in current-gen LLM coding is that the person overseeing it won't be able to spot or fix significant problems.)
I don't know how to build a car. But someone, somewhere does. That is the crucial difference here. AI as currently implemented is a long-term existential threat to civilization. LLMs can't generate anything new; they can only predict outputs based on averaging vast amount of training data. 1/
If you train an AI to design a car, it will just look at a whole bunch of cars, and try to create a design that resembles existing cars as closely as possible. However, the next generation of AI will also be trained on the data set of existing car models, many created by AI. 2/
In time you get AI decay. AIs copying and copying off each other as less and less things are actually designed by human beings. I don't personally need to know how to design a car. But if no one on Earth knows how to design a car, and we only have some AIs guessing at it, we are royally screwed. 3/3
Thanks for mentioning writing. I offered to give a painting group tips on writing about their work—surely something that requires the most personal, hands-on treatment I can imagine—and the reply was “Why? AI can do it for us”
Cautious use of LLMs can definitely help in writing though
I find it quite useful when I need to get myself unstuck
Say, a wording is off and needs fixing -- generating some options helps reset the mind and write something better -- often quite different from what AI suggested
All you need to start with a thesaurus is the word you have that isn’t quite right. Look up that word and the thesaurus will tell you words that mean the same, words that mean the opposite, and words that aren’t quite the same or the opposite but are related.
Every server I'm in, there's always someone in a coding class who complains that their professor REQUIRES them to use ChatGPT to write something as part of an assignment. This is truly one of the worst timelines.
It's worse than that. Young people tend to trust, and as a result they believe everything/anything the AI says. And they have no experience to even know when they're being lied to. Worse than worse, even the AI rarely knows when it's lying to you. :|
An LLM can be useful for coding, but you have to treat them like an eager, but half-bright newbie, like an intern straight out of a for-profit coding boot camp
This is sad and disappointing but not surprising. Have seen young fandom filled with people who don't know how to navigate websites without an "algorithm" to feed them content they like
My brother is already seeing this with his students. He tells them that they are only cheating themselves because they won’t learn what they need to know.
One of the first things I tell my dev students is to not use AI to learn how to code. You don’t even know enough yet to understand why what it’s giving you is wrong.
The ST:TNG episode Wesley And The Kidnapped Kids touched on this, too. The desperate aliens gave the artist kid a magic wand he could wave over wood and automatically sculpt his desired image, but the kid didn't know how to cut/carve/sand/polish his medium.
Amusingly enough, I was experiencing this in the 2010s with random libraries from the internet being incorporated into critical applications. One alteration by the Free Software Enthusiast author could Break Important Stuff. Even real s/w engineers chose convenience over knowing how it worked.
And I do technical dev interviews. Last year some candidates were using AIs to answer questions even on live webcams. Eye movements showing them reading, inability to answer follow up questions, delays waiting for the AI to display results they would verbally skim & read back. Depressing.
There are high school and entry-level college courses on how mice and keyboards work. People laugh about kids not knowing how to sign their names in cursive but some can barely even physically write it in block letters. The younger generations are just never taught these skills.
Coding, hell, I've been in classrooms to teach coding and the students couldn't write - they'd just use voice recognition the whole time, and struggled painfully to write basic sentences by typing or writing by hand.
And as someone who doesn’t code but works in tech and asks ChatGPT to “write me an example API,” the one thing it is good at is explaining in line what it did.
This was the primary use case for ChatGPT in the first place.
I hear what the article is saying, but does having AI generate a block of code necessarily preclude understanding how the code works, or what the edge cases are? Maybe it does.
I, for one, want to ensure that anyone and everyone can access the arcane virtual inscriptions that fuse reality together, and that they are spread far beyond the reach of the Necromancers' withering influence.
The ones who have a genuine interest in how things work under the hood are the ones we want doing the jobs anyway.
—Chief Hindenburg Engineer Dale
I sense a rehash of this when it comes time to fix some of this shit in a decade or 2.
https://en.wikipedia.org/wiki/Year_2038_problem
This is not new at all, lol.
How much of this is HR complaining people can't do their weird take-home coding puzzles?
GUESS WHO WAS SUPPOSED TO TEACH THEM?
True, but that understates it. "Later" can mean years from now, but I'm thinking more like "later this year" or "later today."
Honestly, think about them DOGE boys mucking around in our systems right NOW. We're paying.
Elon the scary father: https://bsky.app/profile/out5p0ken.bsky.social/post/3lixekuw7xk27
While we use AI (not ChatGPT) as a productivity enhancer, we treat anything AI suggests as if it comes from an intern who hasn't got a clue about our problem domain. You'd be amazed by the weirdness it comes up with. This will only lead to tears.
Seems more like a misguided pointy haired boss problem than a young coders problem.
2025: you should unlearn to code.
On the other hand, I have been watching my peers say since the late 90s "we'll never have more great programmers because..."
"... because nothing you could make yourself could possibly compare to the quality of modern games like Quake" (no)
"... because they'll just learn to copy&paste from StackOverflow and no one will struggle and learn anything." (no)
Which is to say, all these things did happen, but also none of them caused us to run out of programmers.
I remain unconvinced that the kids are all doomed.
"JUST USE A MEMORY SAFE LANGUAGE." Okay and how does this interact with DMA?
"You don't need to!" Uh, yes. That's literally how it works.
but maybe this is me being another old woman yelling at clouds
OTOH, it sounds a bit like my elders in my day complaining that the C compiler was ruining us because kids my age didn't know assembly.
It's somewhere in between. Code still needs to work to make money (usually).
I don't use AI intentionally at all, though.
Either I am too stupid to use AI correctly or these articles are just full of shit
Signed,
An unemployed tester
"Devs can test their own code"
Signed, my former boss.
Most of them never got a decent job after college and were mystified as to why they could not get hired with their shiny degree alone...
Grades/answers are not the same as reason and understanding.
This is the same problem with people (including some universities) encouraging the use of A.I. in writing.
We learn to write *by writing*.
I find it quite useful when I need to get myself unstuck
Say, a wording is off and needs fixing -- generating some options helps reset the mind and write something better -- often quite different from what AI suggested
But it seems like what you need is a thesaurus, which you can find both in book and website form.
Because this is how we get to become the Eloi.
Probably.
https://en.wikipedia.org/wiki/The_Feeling_of_Power
Does no one read science fiction anymore?
Smash phones with hammers.
Until I see some evidence I am going to treat this like three anecdotes in a trench coat.
Some of what he talks about sounds like low-code construction too, not really AI.
This was the primary use case for ChatGPT in the first place.
I hear what the article is saying, but does having AI generate a block of code necessarily preclude understanding how the code works, or what the edge cases are? Maybe it does.
I fear the near future will bring technology worship without any understanding or access.
"No one was ever fired for following BigAI." And there was never another innovation.