Lack of any in-text citations is a big one. It also just has a telltale sound to it. I teach high school and use Google classroom, so I’m able to see the document history on students’ submitted work. The biggest clue is that they’ve pasted in huge chunks of text in addition to all the other stuff.
Inability to replicate their writing in a lockdown browser exam. The price they pay is that exam score which is weighted heavier than the daily work they did with AI.
Human brain, most of the time.
Tortured phrases, especially outside of scientific papers (those do employ them due to formal requirements of translation or style standards).
It likes numbered lists. Also, most of the time it doesn't have enough memory to generate the whole work, so there is an obvious style shift.
Quite a few times I've caught ChatGPT copying a source verbatim but with irrelevant alterations. In one case it was my own text published on StackOverflow.
There are a number of tools online that can estimate the probability of a text being generated, but they're less reliable than their image-detection counterparts.
The biggest problem is that NN-powered editors, style checkers, and spell checkers are now built into various editing tools. Text that has passed through them gets flagged as a false positive by these detectors.
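For anyone curious how these probability-of-generated-text tools work under the hood: many are built on perplexity, i.e. how predictable the text is under a language model (LLM output tends to score as unusually predictable). Real detectors use large neural models; purely as a toy illustration of the idea, here's a unigram version with add-one smoothing (all names here are made up, this is not any actual tool's method):

```python
import math
from collections import Counter

def unigram_perplexity(text, reference_counts, vocab_size):
    """Crude perplexity of `text` under a unigram model built from
    reference_counts (word -> count), with add-one smoothing.
    Lower perplexity = text is more 'predictable' under the model."""
    total = sum(reference_counts.values())
    words = text.lower().split()
    if not words:
        return float("inf")
    log_prob = 0.0
    for w in words:
        # add-one smoothing so unseen words don't zero out the product
        p = (reference_counts.get(w, 0) + 1) / (total + vocab_size)
        log_prob += math.log(p)
    # perplexity = exp of the negative average log-probability
    return math.exp(-log_prob / len(words))

# Build a tiny "reference corpus" model
ref = Counter("the cat sat on the mat".split())
print(unigram_perplexity("the cat sat", ref, vocab_size=10))
print(unigram_perplexity("quantum flux capacitor", ref, vocab_size=10))
```

Text made of familiar words scores a lower perplexity than out-of-vocabulary text. Real detectors do the same comparison with a neural LM over tokens, which is exactly why NN-assisted spell/style checkers push human text toward the "predictable" side and cause false positives.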
I'm trying to redesign assignments such that ChatGPT isn't (as) useful or helpful.
Trying to spot the users through submitted work seems like a mug's game, as well as pedagogically counterproductive (in part because it adds new layers of trust-corroding yet ineffective classroom surveillance).
It has a distinctive style and it will usually make substantial mistakes with citation (or fail to do it at all). But there's no point focusing on spotting it because it's impossible to prove to the satisfaction of a disciplinary committee, so I mostly just give mediocre grades and move on.
Not an educator, but as a recent college graduate I could tell when others were blatantly copy-pasting a ChatGPT response, because the default ChatGPT voice is pretty distinct: it loves to end paragraphs with "Additionally," and to call things "diverse" and "complex."
My brother teaches high school and he checks typed work against a handwritten rough draft and the other work a student has produced. He has found that the existing tools that claim to flag LLM text tend to flag ESL and autistic students' work while missing LLM text he generated himself.
Nothing specific, ChatGPT merely accentuates existing bad patterns in my experience. Grading criteria that penalize vagueness, overly long answers, extreme hedging, etc. will also penalize ChatGPT use.
Requiring citations, especially citations from specific works that I am familiar with. "Your essay on WWI has to quote from the 'Place in the Sun' speech by Kaiser Wilhelm II."
Look for fake sources. It tends to produce stories with basic, bad titles that are fake, with either hallucinated authors or authors that don't write in that field. This admittedly works best on current events papers.
It’s easy to fool one person grading a paper. It’s harder to fool a classroom full of your peers asking questions.
Learning means absorbing a concept well enough to teach it, not just regurgitate it.
Make them teach what they learned.
“Captivating story telling”
“Engaging premise”
If it sounds like a movie trailer, it’s ChatGPT
In earnest, one way is to look for overly complex sentences.
Another is to run a plagiarism check; ChatGPT is bad at citations.
Another is to quiz the students on their essay.
Honestly, at this point it's just gut feeling. It feels sterile and samey. Also, it uses a lot of bullet points.