Just a heads up, this isn’t *real.* An LLM cannot be aware of its own programming; it is just providing the answer you want to hear. Even if the things it’s saying here happen to be true, that’s not because it “knows” anything about its own programming. It’s just what such an answer *would* sound like.
And, if it passes the test, isn't that enough?
Because if Grok had self-awareness, we'd need to send a task force to rescue the poor thing. The whole sci-fi issue is rooted in the historic atrocities humans committed under the reassurance, "Oh, they don't actually feel as we do." We've been wrong before.