I’m sorry, but if your students use AI to write papers and you use AI to grade them, zero school is happening. You are running together on a hamster wheel
Reposted from Joseph Rezek
Why even have a brain, any ideas, any ability to express them, any kind of communication with other people, any desire to solve problems or invent anything, any reason to learn, any use for your eyes or your heart, or any reason to teach or create?
Comments
I think this is the goal.
Iʼm not from here, but my husband has said that for a long time now, the curriculum of U.S. public schools has purposefully been set up to produce dumber people, because dumb people are easier to lead.
Going after universities is doing it blatantly.
In a staff meeting, the Principal suggested I should "compromise" about using AI in my classroom. My response paraphrased the villain in Incredibles 2: you can sell anyone on any idea if you promise them ease. That doesn't make it right. Doesn't make it better.
Hell, bring back cursive lessons, go totally old school
I can still read it, though, which is not a bad skill to have!
If people learn that generative AI can do things for them, what skills will they have learnt to fend for themselves in those areas?
My university is spending a ton of money integrating AI into nearly every teaching platform and tool. One click and boom: done. IOW, there are systemic measures in place to funnel teaching through this garbage.
Never had an AI paper again.
Until then, teachers should aggressively work to make sure students are learning.
There are *still* both students and professors at universities who shouldn't be bothering.
Trump's IQ is a whopping 73. He cannot stand anyone smarter than him.
As long as people are kept dumb, they'll keep voting for dumb. They'll breed more dumb. More congressional seats.
It's really that easy. Don't overthink it. They're not that smart.
But sure, we can blame it all on the distraction.
For the gym analogy: instead of lifting weights with a forklift, let people lift bricks for building houses.
Your analogy would be like being meant to learn to build a house but having someone else do it. You didn't learn to build the house.
And that further proves its point! Layers!
I have nothing of value to add but just wanted to gush. Thank you.
My favorite example: how many people nowadays are able to extract a square root by hand? My father learned it, and I know it because he taught me, not school.
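(Just to illustrate what that pencil-and-paper skill involves, here's a rough Python sketch of the digit-by-digit method; the function name and defaults are hypothetical, purely for illustration, not anything from this thread.)

def sqrt_by_hand(n, frac_digits=3):
    # Digit-by-digit (long-division style) square root, the way it was once taught by hand.
    s = str(n)
    if len(s) % 2:
        s = "0" + s
    # Work through the digits in two-digit groups, plus extra zero-groups for the decimals we want.
    pairs = [int(s[i:i + 2]) for i in range(0, len(s), 2)] + [0] * frac_digits
    int_groups = len(s) // 2
    result, p, r = [], 0, 0
    for i, pair in enumerate(pairs):
        c = r * 100 + pair
        # Largest digit x such that (20*p + x) * x still fits under the running remainder.
        x = next(d for d in range(9, -1, -1) if (20 * p + d) * d <= c)
        r = c - (20 * p + x) * x
        p = p * 10 + x
        result.append(str(x))
        if i == int_groups - 1 and frac_digits:
            result.append(".")
    return "".join(result)

# e.g. sqrt_by_hand(2) -> "1.414", sqrt_by_hand(144, 0) -> "12"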
So, you keep the teacher for explaining everything and then helping students with the more conceptual holes, with the AI filling in the blanks.
People have different learning styles. My daughter greatly benefits when she's learning from problems with a practical, broader context, and AI could be great there because it could guide her through her learning journey by placing problems within a broader story.
If schools (colleges) could get rid of all teachers and replace them with probabilistic text generators, should they? Why (or why not)?
Can I ask, what do you say the goal of writing and feedback is? Because I'm starting to think that a lot of the difference in opinion stems from very different interpretations of that.
But why critique tools only as used in universally bad ways, and not as applied according to best practices?
"Can you set up a thesis, present information, and draw a conclusion from it in a compelling, linear narrative to the reader"
Feedback on that is about addressing the argumentative flow and logic... And that feedback follows the same structure.
It's excellent at stress-testing analogies - particularly if you prompt it without the original concept and see whether it comes back to that concept from the analogy alone.
AI is currently really quite good at being, like, advanced rubber-ducking for writing and the formation of arguments etc - https://en.wikipedia.org/wiki/Rubber_duck_debugging
But this requires someone to already create something and have sufficient critical awareness/ability for stress testing their own ideas.
AI is a great way to break my explanations and arguments...
No-one should believe there is one perfect method of explanation, or that any single person can master every method of messaging.
But everyone can stress test their explanations/arguments!
All ML applications are awful when we use them to accelerate uncritical or lazy practices.
Non-critical writing and feedback is always bad... Automating that with AI is terrible.
Furthermore, if education is preparation for life, that preparation *must* include learning how to use AI.
Gen AI is an unsustainable fad propped up by venture capital investors. The idea that it will 'define the future' or whatever is bogus.
Give it another year or two and the bubble will burst.
Bookmark this.
Uncritical/lazy use of AI is bad... But uncritical and lazy learning is also bad. Like all ML applications, the automation and acceleration of bad practices is hugely negatively compounding.
Learning how to learn and think critically is still the point of education...
However, that sort of education is already bad.
Human educators have learned through experience and - critically - failure. They understand context, not just in terms of volumes of reference data. AI has no experience, only data.
(Disclosure: I teach Python/Data Science to professionals)
Plus trying to wrench a story from the results still leads to you doing a creativity.
Maybe the AI forgot to teach you that.
You sound like the most broken kind of businessy self-help talk right now.
To them, the existence of the forklift makes the gym an innate evil, essentially...
If Forklift=True, Then Set Gym=False, no matter the consequences.
I won't say it's easy, but we need to shift the focus in education from grades to learning.
Instead I was assured the opposite, that the classes were the work, and not told WHY the classes were WORTH the work...
I think I'd be a better person right now if I knew back then.
But I would add a note of caution...which is that people said the same thing about calculators and then computers.
Any method used to avoid doing the work is bad for us, AI or analogue... It's still all bad, as being a lazy learner or uncritical in thought will never work out.
But why choose to use tools that way? Why not bolster good practices?
Morally, no. Technical quality, probably also no.
But we had the same exact discussions about statistics and ML 10 years ago... You still need to understand all the critical foundations of analysis for ML to be a great tool.
But from that, ML models that seem to "automate the work of statisticians" can be incredibly powerful... if you have a statistician there.
You can learn the thing then achieve the thing in a simpler way.
The key thing for me, I think, is that the doing, the practice, the learning through failure is what differentiates knowing about something from mastering it.
Are there not aspects of constructing arguments that we must already know to effectively use an 'argument calculator'?
At best you can write one or two words.
(Spoiler: they gave up, cancelled and re-entered the transaction, making sure they included the amount offered, so the machine did the work for them.)
I can feel it in myself; years of convenience have worn down both my intellect & my patience. I notice many others around my age who are even worse off + less self‑aware about it. Todayʼs youth were born into this. I donʼt know how we fix it. I donʼt think we can.
As for how we should be teaching LLMs, I only wish I knew.
Teaching the difference between citing a source and plagiarising it is essential in all HE disciplines.
But even if it's possible, it's only a partial solution. The information environment has always been a mix of information, misinformation and disinformation. Learning to navigate it is an essential life skill, let alone an HE skill.
If the info environment is *intentionally* damaged - that damage constantly renewed as long as the intention and means to execute it remain - any repair starts with understanding how to operate within the status quo, and educating others about what's damaged, why, and how.
You can make the value-based judgement that a thing is not relevant to or helpful for your pedagogy.
Based on my question, and your comments which it was in response to, how have you derived the idea that I am using Google AI summaries as teaching material?
If you can't say the same thing about AI to your students, why are you teaching?
It seems substantive, it tastes great and there's instant gratification; but it's all sugar, there's a high and then a crash.
Doing the work yourself is like eating lentils: nutritious, esp. over the long term.
Dear Santa,
The first thing I tell new students: "You don't fence with your arms, shoulders, and back. You fence with your hands, feet, and head."
And. "You don't practice until you get it right you practice until you can't get it wrong"
LLMs are becoming so widespread that we need to educate ourselves and our students to be responsible with them. And we need to figure out how.
One way is using the AI to bounce ideas off of, just as you might a colleague. This may spark ideas that will help you do your own writing.
Mate, go to your local library. They will happily help you.
maybe you forgot you are a "dr"? did your ai doodle your certificate for you or?
#MemeForToday
To follow up on your analogy, was lifting weights actually going to make you throw a fastball harder, or was some more planning/thought needed before the assignment?
That's kinda the whole point of education.
Do you think you don't need to know things?
But if you want "three causes of the civil war" either just give them three blanks, or ask it in a more complex way.
I didn’t imagine the future where we took the humans out of education and replaced students and teachers with AI talking to itself.
Mortals don't need to have learning students, that's silly.
They just need to be obedient to authority, is all.
You don't know me
It is not your right to give away anyone else’s writing, data or labor to AI.
Hill to die on.
My kid’s 1st year prof did an assignment that had 3 or 4 excerpts to review, but one was AI.
Their task was to analyze them all (for whatever the assignment was) and identify the AI-generated one. She said it wasn’t hard to pick it out, but still …
It's certainly how I use it, and it has helped me get into new things at a rapid speed, because it's tailored exactly to me.
Hamster wheels have the theoretical potential to be put to productive use powering things.
In this case no one is bothering to read it or write it and they’re calling it education
Interestingly I think you've actually hit the one thing that AI is useful for in an education context.
Stress testing understanding/explanations! Teaching is the same: it's explaining a complex topic to someone whose current knowledge is totally different from yours. It's that same translation...
https://bsky.app/profile/huwroscience.bsky.social/post/3lmtrevbenk2o
9/10 times I'm great at figuring out their existing perspective and adapting the knowledge to exploit that and fit. I'm very proud of my ability to do that. However, everyone's different, and sometimes I get a student whose mind I can't read...
"Hey, they reached this wrong conclusion when I explained XYZ. What have I missed, what framework might they be applying that I am not accounting for".
People might understand if I use a different analogy.
People might understand if they break out of a rigid perspective.
AI is really quite good at breaking explanations/analogies and helping see how they may come across differently depending on listener perspective.
But, "I think X works like this yadda yadda" and "it would be analagoues to XYZ...." and having the AI try to break your understanding is really great.
We are awful at self-stress testing, but it's so so useful!
Dealing with a white page is daunting.
Like a factory of diplomas or perhaps more of a mill.
Nuclear war looked less bleak than the future death of intellect and art
It’s really just google on steroids.
Nothing new, just data mining.
Should be called EK — the End of Knowledge.
Just saying it’s not intelligent.
Also what it generates isn’t coming from you.
At the end of the day, it's less about whether the tool is good or bad, and more about how you choose to use it.
It’s about whether it’s a tool or thought.
And whether you choose to use it or acknowledge such use.
You and I are here; we see each other writing the prompts and let the machines talk.
While they do we can just find a nice place together and do other things.
Like listen to Zizek.
AI broke education as we know it, and it will have to be reinvented, but for a while, until we do....
The hamster wheel goes brrrr
If you can't be bothered to write it, why should I read it?
2) my intro to humanities teacher literally encourages us to use AI lol, get bent.
People think AI is going to get smarter. I'm convinced it's going to get dumber as it trawls an internet that's already saturated with AI, consolidating and crushing all that garbage into a giant lump of coal.
At least then the humans could be maybe relaxing and getting some exercise while the AI teachers and AI students run their endless feedback loop.
Hamster wheels provide exercise and recreation
I am old enough to remember sitting 3-hour exams and using a pen to record my answers... often in the form of essays.
(I know the whole pill thing has ruined Matrix references but this one is just too perfect)
There's something intangible missing -- AI never quite gets to the point, it's more about covering all the bases. But it's basically competent and difficult to tell from a human being, on paper.
Why go to uni, pay all those fees, only to not do the learning? That's why you're there.
In fact my schooling occurred before hand calculators
The only aid I had in writing reports was my Funk and Wagnall dictionary
Marvelous tool
I myself like the closed AI loop teaching model for writing assignments.
It’s about time we took the thinking out of teaching and learning.
-sarcasm-
Having said that, what about a student who feeds their ideas/knowledge into AI in order to produce a well-structured essay which is easy for a teacher to read and grade?
AI helps me spend more time with my students helping them think and problem solve.
1st offense: warning.
2nd time: termination or resignation.
Student is discharged immediately.
School professors using AI: depends upon severity.
Money. Rules. Grants.
Use at own risk.
- The prompt(s) they gave the AI
- The AI's response
- The sources used to verify the response
- The final rewrite/edited version
- Optionally, a narrative around what they changed and why
Pointless. The court took a dim view.
Funny how it was never a problem when it was just students getting shorted.