same. and there’s a clear difference between gpt-3 and gpt-4 for these kinds of tasks when i want to follow up with a dialogue about the output if i still don’t understand
(i also successfully make use of q’s like “explain like i’m [age]”, “explain in 3 diff ways”, “give me a couple of analogies”, etc)
yes! learning to use it for me was a process of remembering that i can just ask it to do stuff like that. in other words to create a new intuitive category in my head beyond “mode of talking with non-chatgpt software” and “mode of talking with people”
gpt-4 is like life-changing good, I need a more efficient way to access it throughout the day than typing on a shitty phone keyboard in a browser window tho
the webui on mobile is shit, people shittalk midjourney for using discord (100% that is not the best solution either) but it's a godsend to have a mobile-first interface
yeah i rarely ever have issues with hallucinations (with gpt-4). after working with it for awhile it's extremely easy to tell when it's bullshitting and one gets much better at prompting it to avoid those instances
I haven't put much effort into prompting at all. mix of getting a feel for what kinds of things it's good/bad at and maintaining a kind of back and forth dialogue where if it's hallucinating it tends to become incoherent
I feel like it wouldn't be that hard to use whisper and some text-to-speech library to let you talk to it back and forth: record a voice memo that triggers a pipeline.
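the pipeline idea above is basically three steps glued together: transcribe the memo, send the text to the chat model, speak the reply. a minimal sketch of that glue, with the actual Whisper/GPT-4/TTS calls stubbed behind plain callables (names like `run_voice_turn`, `transcribe`, `chat`, `speak` are mine, not any library's API):

```python
# Sketch of the voice-memo loop: record -> transcribe (e.g. Whisper)
# -> chat completion (e.g. GPT-4) -> text-to-speech. The three external
# steps are injected as callables so the glue logic runs without audio
# hardware or an API key; real implementations would wrap the actual
# Whisper, chat, and TTS calls.

def run_voice_turn(audio_path, transcribe, chat, speak, history):
    """One back-and-forth turn: voice memo in, spoken reply out."""
    user_text = transcribe(audio_path)    # hypothetical Whisper wrapper
    history.append({"role": "user", "content": user_text})
    reply = chat(history)                 # hypothetical chat-model wrapper
    history.append({"role": "assistant", "content": reply})
    speak(reply)                          # hypothetical TTS wrapper
    return reply

# Stubbed demo of a single turn:
if __name__ == "__main__":
    spoken = []
    history = []
    reply = run_voice_turn(
        "memo.wav",
        transcribe=lambda path: "explain this paper like i'm 12",
        chat=lambda msgs: f"echo: {msgs[-1]['content']}",
        speak=spoken.append,
        history=history,
    )
    print(reply)  # → echo: explain this paper like i'm 12
```

keeping the history list and appending both sides of each turn is what makes it a dialogue rather than one-shot questions, which matches the back-and-forth usage people describe above.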
this really is the most amazing part of it. been using that feature to understand gut bacteria papers. so much bio jargon when parsed by chatgpt is actually understandable
yea, you need to have enough familiarity to be able to discern what makes sense and what doesn't. that said asking it to explain given material is much much less error-prone than asking it anything zero-shot
it's always been right and helpful for me as long as I'm asking for general knowledge and not for it to solve some kind of quantitative problem or tell me about a particular text
i am autistic but yes i think you’re probably right that people who are good at being explicit and precise with communication probably have an easier time getting what they want out of chat gpt!
a less obtuse way to state what I mean: it's hard to get people to talk about specifics, either because they don't have them available in their mind, or because they interpret challenging claims rather than vibing with them as a social attack. there is variance, but other concerns get in the way