Predictive text models do not ever "answer your question"

They predict what an answer to your question would probably look like.

Which is very, very, very different.
Reposted from Katie Mack
I don’t think it can be emphasized enough that large language models were never intended to do math or know facts; literally all they do is attempt to sound like the text they’re given, which may or may not include math or facts. They don’t do logic or fact checking — they’re just not built for that
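To make the point concrete, here is a minimal sketch of what "predicting what an answer would look like" means in practice. It assumes the Hugging Face transformers library and uses GPT-2 purely as a small stand-in model: given a question, the model only assigns probabilities to possible next tokens; nothing in it computes the arithmetic.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Small stand-in model for illustration; any causal LM behaves the same way.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Q: What is 7 times 8?\nA:"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Probabilities over the *next token* only -- the model ranks plausible
# continuations of the text; it never multiplies 7 by 8.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx)!r}: {p:.3f}")
```

Whatever comes out is whichever continuation scored highest, which may or may not be "56".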
