I guess I’m not comfortable with the term because it feels like whitewashing (“it’s not wrong, it hallucinated!”) and it contributes to personification. It’s not a quirky robot friend that gets mixed up sometimes and shouldn’t be treated as such.
Ah, yeah, I think I understand where you're coming from. If it helps any, the term is from the engineering and development teams and I don't think it was generated through PR.
A hallucination is when an AI, without being directly prompted to fabricate, generates erroneous information with certainty.
By contrast, there's another fail state identified as "Fabrication," where the AI makes something up in response to a request.