I actually think calling LLM output a “lie” or “hallucination” is less accurate than calling it a “malfunction” or “glitch.”