If I ask you "How many bones are in your hand?" and you answer "27", that's because at some point you read that fact, not because you can introspect and sense how many bones you have.
Asking an LLM about its training data, or the probability it assigns to words, is just like that.
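The contrast is easy to demonstrate: the probabilities a model assigns are directly readable from its logits, no self-report needed. Here's a minimal sketch, assuming the Hugging Face transformers library and gpt2 as a stand-in model (both my choices for illustration, not anything from the post):

```python
# Minimal sketch: read a model's actual next-token probabilities
# straight from its logits, rather than asking the model to report them.
# Assumes: Hugging Face transformers, and "gpt2" as a stand-in model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "How many bones are in your hand? Answer:"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# The distribution the model actually assigns to the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top_probs, top_ids = next_token_probs.topk(5)
for p, tok_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode(tok_id.item())!r}: {p.item():.4f}")
```

The printout is the model's real next-token distribution. If you instead ask the model in natural language what probability it assigns to a word, you get text sampled *from* that distribution, not a readout *of* it.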
Comments
I’m trying to sense how many bones there are in my hand, and getting a nonzero answer, because the finger joints are pretty clear, as are some of the hand bones. I’m not sure what that analogizes to for LLMs, though.