"Chat GPT told me that it *can't* alter its data set but it did say it could simulate what it would be like if it altered it's data set"
NO. It has no idea whether it's telling the truth or not, and when it says "I can simulate what this would be like", that claim is itself just generated text.
This guy is pretty sharp about philosophy but […]
Comments
It might even give you a good rundown of how LLMs work mixed in there […]
* = multi-layer self-referential language about provenance of language, epistemology, etc.
Maybe, but that still implies some kind of organization of concepts beyond just language or the shape of their output.
I don't see any reason why it should be impossible to design a program with concepts, one that could do something like reasoning ... you might even use an LLM to […]