Or they ended up with more responsibility than they could handle alone, driven by ego, demanding bosses, and/or other commitments, without much recognition for the extra output either, and now they're stuck.
Yeah, that sounds like faking it and holding on until something exposed them. The house of cards is always going to come down eventually. Especially when what you actually need is a second person to help you out.
Well yes, but it's not necessarily that everyone was faking skills, more that some may have taken on more than they could cope with without the crutch. Capitalism encourages that of workers, and workers are supposed to feel good about being "good". Bosses then take the increased output for granted.
Some people (usually dudes) routinely do half the work they know they could do, and get away with it, and know that if they ever did more, they'd always be expected to match it. Job roles can sometimes be surprisingly fuzzy about what level of output is considered standard and reasonable.
I guess this is one of those advertisements that PR firms get published on news websites. It's probably aimed at creating FOMO in managers, who will then bestow the glory of AI on all of us. Personally, I found using ChatGPT for any useful coding extremely exhausting.
There's a certain smugness (which I will only slightly contain) that grows from having refused to rely on these tools while retaining the (apparently rare?) ability to do my actual job.
I don't think people understand just how much these things are already used by office workers across the globe. No moral argument in the world will change this; it's all up to regulation etc.
SO's HR department is literally sending out emails written by ChatGPT now. It's fucking wild, especially because it writes the emails in the most transparently AI way possible.
IDK if it's morals of convenience, but I do think writing code with an LLM seems OK if the inputs, outputs, and code style are in line with what you need to do. But I will never see people using them for writing essays as anything other than cheating. Essays are supposed to evidence fluency of understanding.
So what you're maybe saying is "okay for work, but not for education"? I used to have to mark students' code and try to spot the cheaters who copied off others, and ChatGPT will not aid the requisite basic understanding either.
I’m sorry but this is wrong. LLMs are terrible at writing code because they cannot think or use logic beyond “what is the next most probable set of characters”. I write code for a living and I would never use an LLM for it and it alarms me that some of my colleagues do.
I also don't use LLMs for writing prose because they don't do so well at that either. I just don't use them, because they aren't useful for anything in my day-to-day life.
if you're qualified to assess the functionality of AI-generated code, you're qualified to write it yourself.
I've actually seen a lot of coders drop AI-assist tools after realizing they made them lazier, slower, and worse coders whenever they had to write something themselves.
Code from an LLM is never going to be holistic, and unless it's just a template, it may - in a practical sense - be even worse than using it for an essay, because instead of *just* eschewing your own written knowledge, you're using it to create a practical tool that you don't/won't know how to use.
LLMs are OKish at dealing with boilerplate, and at "solving" solved problems for when you don't want to download an entire library for one small function. They are terrible at anything that professional programmers are paid to do.
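To make that concrete, the kind of "solved problem" snippet meant here is a small, self-contained utility, the sort of thing you might otherwise pull in a whole dependency for. A minimal sketch (the function and its behaviour are purely illustrative, not from any particular library):

    import re
    import unicodedata

    def slugify(text: str) -> str:
        """Turn arbitrary text into a lowercase, hyphen-separated URL slug."""
        # Fold accented characters down to their closest ASCII equivalents.
        text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
        # Collapse every run of non-alphanumeric characters into a single hyphen.
        text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
        return text.lower()

    print(slugify("Déjà Vu, Again!"))  # deja-vu-again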
This. I turn on a tool occasionally when I'm writing docstrings in my functions, the most boilerplate stuff I do, but most of my code is scientific work: figuring out how best to solve complicated problems dealing with remote sensing data in which I have expertise, and it's all but worthless for that.
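For what it's worth, that docstring use case looks roughly like the sketch below: the function and its logic stay the author's, and the assistant only drafts the structured comment block. The function here is a hypothetical remote-sensing example, not taken from the commenter's actual code:

    import numpy as np

    def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
        """Compute the Normalized Difference Vegetation Index (NDVI).

        Parameters
        ----------
        nir : np.ndarray
            Near-infrared band reflectance values.
        red : np.ndarray
            Red band reflectance values, same shape as ``nir``.

        Returns
        -------
        np.ndarray
            NDVI values in the range [-1, 1]; pixels where both bands
            are zero come back as NaN.
        """
        denom = nir + red
        # Guard against division by zero where both bands are empty.
        return np.where(denom == 0, np.nan, (nir - red) / np.where(denom == 0, 1, denom))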
Bingo. Using it to template out an empty class or page? Fine - that's boilerplate. But using it to actually write functions? You're heading for a bloated app with no reusable functions/objects and no holistic vision.
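A minimal sketch of what "templating out an empty class" means in practice; the class name and fields are placeholders, and all the actual behaviour is still left to write by hand:

    from dataclasses import dataclass, field

    @dataclass
    class ReportPage:
        """Skeleton for a report page; the methods are stubs to be filled in."""
        title: str
        sections: list[str] = field(default_factory=list)

        def add_section(self, heading: str) -> None:
            # TODO: real implementation goes here.
            raise NotImplementedError

        def render(self) -> str:
            # TODO: real implementation goes here.
            raise NotImplementedError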
Fuck it, at some point the bots are going to be better company than the humans.
https://en.m.wikipedia.org/wiki/Icarus
Just a big rack-mounted server refusing to get out of bed