This becomes obvious once you read any system prompt.
Reposted from Isilanka
Can't shake the feeling that a lot of people working *on* LLMs (and not just *with* them) either don't actually understand how they work, or are deliberately pretending they're building a fundamentally different product. Either way, it's in equal parts fascinating and deeply worrying.