Yes, "glitch" suggests that the system is designed to work a certain way and something is stopping it, but making up with predictions that synthesize patterns from the training data is just what LLMs do.
Yes, at least a feature in the sense of a core characteristic! There was a viral tweet from Andrej Karpathy to that effect, something like "it's hallucination all the way down."
It's inescapable, core to the product/process. It's not a feature, because I can't really think of a scenario where it's desirable, but it's not a fixable bug, either.
Comments
This now leads me to wonder about the 'rhetorical' potential of such 'hallucinations'/'glitches'.