I know a lot of the misconceptions about LLMs are just political conflicts playing out, but I wonder if we could have avoided some of the bullshit if we had understood this technology as “guessing” rather than “answering.”