As someone who spends a decent amount of my professional life working with AI tools, using an LLM to decide if someone's work is "mission critical" is such a dumb fucking idea that whoever thought of it should be shot into the sun to preserve the human gene pool.
Reposted from Max Kennerly
Even accepting that LLMs are capable of this task (lol, no), this is set up to fail, designed to create excuses for arbitrarily firing people.

Nobody was told that was the question being asked; you can't tell whether work is "mission-critical" from five bullet points; a snapshot of a single week won't tell you that either; the list goes on and on.
