dear machine learners,
“hallucinations” are not “incorrect facts”. try “incorrect statements”, “lies”, or “made up bs” instead. the very idea of an “incorrect fact” is an oxymoron and intellectually deceptive
thank you!
Comments
Fast Facts: https://journals.sagepub.com/doi/full/10.1177/20563051231195546
I got translations on top of translations on top of translations homie
what the fk could go wrong?
<3
The manufacturers have put it out there like it's a finished product... but in software terms, the AI/LLM systems are barely alphas.
They're going to be responsible for a lot of wrongness.
Were it me, I'd call them Cliff Clavins. It just made up some shit that seemed reasonable enough if you were drinking at a bar.
Still, I'm uneasy about these terms implying intent & choice: a person tells you truth or lies / bs (or hallucinates... at least that's lower awareness).
An LLM tells you... an auto-remixed new thing from its human-written training data.
But that phrasing isn't sticky.
does autocomplete hallucinate? no.
and neither do LLMs. they get the completion wrong. that’s it.
We have an article coming out soon on the term "hallucination" in LLM documentation. One of its claims is that the term is chosen to relieve the creators of the moral burden and responsibility of spreading lies, because "hallucination" is not morally loaded; it "happens unwillingly".
A closer term, if we want to stick to psychiatric language, would be confabulation. Dementia patients confabulate: they tell stories, and even believe them. But they're just saying things. It's not a lie, nor a deliberate falsehood.
Hallucinations lack that morsel of awareness.
Ideas which are alternative to facts are deceptions.
Bullshit is not exactly a lie or a deception. It requires no knowledge of the truth. It serves some purpose other than truth. "Bullshit is a greater enemy of the truth than lies are."
I blame Kellyanne Conway's "alternative facts."
"Translations" were offered & "corrected" by everyone & anyone with no qualification & the most "popular" became the default response by GT, irrespective of the TRUE translation.
There is nothing whatsoever that is "intelligent" about "#ArtificialIntelligence".