I wonder if I'm an outlier in this? Are most ChatGPT users leaning on it more for writing on their behalf?
Aside from writing code and being a thesaurus I see it mainly as a curiosity engine - a tool for helping explore a weird, blurry, occasionally misleading but absolutely vast pool of knowledge.
Comments
Until we hook LLMs up to telepathy machines, it’s not much use.
But it’s a good rubber duck to talk to about those ideas as a preliminary step.
Once it comes to writing the final words, I find LLMs can’t help that much. Other than as an expensive thesaurus.
Aren’t we then generating content that isn’t worth reading either?
ChatGPT is unbelievably good at this.
I've recently designed a programming language for a project, created a very in-depth analysis of two approaches, and much more.
Using it to "write" is the wrong use case in my view.
Still haven't quite come up with a decent prompt to stop it congratulating itself the whole time.
I imagine a lot of people rarely do that, aside from text messages to friends and family
I'm not talking about spam here; I'm talking about people who have something to say but previously lacked the confidence to share it.
Writing is probably the first use that comes to mind, and thus what gets explored first.
I don’t think they are as intuitive to everyone as the AI labs thought they would be.
https://www.careful.industries/blog/2025-4-responsible-ish-genai-dos-and-donts-text-edition
There is no learning without struggle, and LLMs currently act like a mother that takes care of everything you ask her for - and never teaches you how to do it yourself, making you dependent on her.
No good solutions yet.
"Learning mode: A new Claude experience that guides students' reasoning process rather than providing answers, helping develop critical thinking skills"
https://www.anthropic.com/news/introducing-claude-for-education
Half of them raised their hands; I imagine a few more would have, too.
So it's legit that people welcome LLMs to assist them in writing when they need to.
Maybe that's not what they wanted to hear. But I stand by it.
I’m with you, though: I use LLMs frequently for code but never for writing.
So his testimony that "cheating is effective and everyone is doing it" is basically promotional material for his company.
There's no question that, if you're still relying on take-home 5-page essays, you will get a lot of LLM prose!
But then I'm not required to write a lot of boilerplate. Students are still inhabiting a world where "ability to generate 5 pages on this book" is being used as proof that they read the book.
Everywhere else there are huge economic incentives to automate this, and school is collateral damage.
https://bsky.app/profile/aedwardslevy.bsky.social/post/3lolmc2v2hk2c
Most non-tech people I’ve watched use them for answering questions and writing boring emails - they find my “thinking” use case somewhat neurotic (which it may be).
When I freelanced for a marketing agency, they used AI as the primary "author" of marketing blog posts. The human content creators sometimes added extra material, but not usually more than 25% of the total length.
Me: write notes
LLM: first draft
Me+LLM: update until well-written, well-structured, and in my voice.
And similarly, if I'm writing about a new topic I might show it an explanation I've written to see if it thinks I'm wrong.
But I can't imagine shipping LLM-written text!