This can only end badly. Just in the last day, I've asked ChatGPT for suggestions about metal-frame glasses of a certain shape (it gave a list of acetate frames with different shapes), and then it hallucinated a few books that don't exist but sound plausible. Thanks, ChatGPT!
I'm 65 and have been using it for several months. Mostly just playing around with it to see what it could do. But I have recently cancelled my subscription and deleted the app for a number of reasons: basically, I no longer trust it.
My mother wanted to know how long she had to wait to drive after having a procedure done. ChatGPT told her 2 weeks. Her doctor told her 2 days. She told me "ChatGPT was wrong!" I could only say, "Yes, mom, it certainly can be."
I also asked it to help me reformat a list of words into an Excel spreadsheet. I never ask it to generate anything on its own.