jasongulya.bsky.social
Professor of English and Applied Media and AI Consultant for Colleges, helping colleges look past the hype. LinkedIn: https://tinyurl.com/3txt65da Newsletter: https://open.substack.com/pub/higherai Work with me: https://tinyurl.com/2n4wvhsp
351 posts 1,894 followers 6,059 following

The ongoing integration of AI and similar technologies into everyday life does not mean that every classroom needs to use AI extensively. In fact, cultivating AI-free learning environments could make students (and us) more aware of what’s actually going on in the world and our place within it.

I think AI-free spaces will continue to be powerful for learning. But… I think they’ll have to be consensual spaces, where students opt into, create, and maintain the AI-free space. Because I’m not sure if an imposed AI-free space will be viable for much longer (if it even is now).

If we want to talk seriously about AI and the future of assessment… We need to talk about alternative grading. We need to talk about whether deemphasizing grades through alternative assessment would better serve students, and help preserve classrooms as places of exploration and experimentation.

Don’t let companies and people hide behind the “neutrality” of AI. AI programs and tools are far from neutral. They are specific tools created by specific people for specific political purposes.

I'm all for teaching process over product. But... 1️⃣How do we create process-focused assignments without (in the push to standardize and measure) flattening out process? 2️⃣ How can we possibly do it for hundreds of students? There's a lot to work out.

Learning with AI is seductive, because it promises the chance of jumping past the arduous work of System 2 thinking to the quick, easy work of System 1 thinking. But learning science and psychology show us that learning often doesn’t work that way.

Just got my copy of “Teaching and Learning in the Age of Generative AI” (Routledge). My chapter is called “The Age of Chat: Education and the Rise of No-Code Chatbots.” I *may* have launched a critique of some very powerful people (Altman, Khan, Andreessen, Diamandis) in print, for all to see…

Evaluating tools is an essential part of modern writing. Some tools stand on their own. Some tools are embedded in other tools. Some tools are more anthropomorphic than others. Some tools are more intrusive than others. Some tools are more AI-centered than others.

AI is a cultural technology that intersects with discourses of power. In other words, it’s political. People and companies are already using AI to gain and maintain power over others.

If colleges ignore AI, they’ll lose their students. If they adopt AI uncritically, they’ll lose their souls. Then, they’ll lose their students. And it’s not about incorporating AI into all courses. It’s about designing AI-aware courses, even if we don’t use the technology.

One of my newest articles, this time with EdTech Digest. It’s important that we be very specific when talking about AI, Gen-AI, LLMs, and so on. www.edtechdigest.com/2025/02/20/t...

To boost our students’ AI Literacy, we need to stop throwing “AI” around as a label. We need to be more specific. We need to talk about LLMs. We need to talk about Gen-AI. We need to be clear about how/if our terms align with each other. Because the terms we use matter.

“In the Age of AI, the Humanities will be more important than ever.” Ok. But… Humanities programs are getting their budgets and staff cut every year. What happens when the skills we need the most are in the shortest supply, because of money and public perception?

"Can you really build a life when you don't know what is real and what is fake?" I read these words a few days ago. They were written by a high school student, wondering what AI meant for human connection and human purpose. I haven't stopped thinking about them since. More below. ⬇️⬇️

Earlier this month, the Arizona Dept of Education made headlines for approving Unbound Academy as an AI-driven charter school. Yesterday, the Pennsylvania DOE rejected their application.

This morning, I read a fascinating article. Mary Ruskell, a teenager, wrote about what it’s like to suddenly mistrust everything she reads and sees online. It’s so easy for educators to say “Students just need AI Literacy” or “just don’t trust anything you read online.” But we need to go deeper.

You can redesign your courses in light of AI without using AI in your classroom. We can redesign our courses around… ↳ Collaboration ↳ Relevant skills ↳ Student agency ↳ Intrinsic motivation ↳ Human connection ↳ Process instead of product Even if we don’t work AI into our classrooms.

If we simply work AI into an educational system without rethinking that system, education is going to suck.

Today, “The Brutalist” was nominated for Best Picture. Some people want it disqualified for its use of AI. Here’s my breakdown: www.linkedin.com/posts/jason-...

This is what AI looks like in the Trump admin. Trump: Altman is the best. I’m going to give him and others $500B to build a super AI company. Musk: I’m going to rage-tweet, saying that Altman sucks and that OpenAI doesn’t have any $. Altman: I’ll tweet back. Come see my data centers, bro.

One of the worst use cases for Gen-AI I could imagine: an Anne Frank AI bot. Welcome to the future…I guess… Not sure I like it.

The rest of the world: OMG! TikTok is back! Me: You mean President-elect Trump. Don’t deprive me of those 24 hours of processing time.

We need to think a lot about the boundary between support and infantilization. Is AI supporting our students, infantilizing them, or both?

I have many thoughts about Unbound Academy, the charter school that was recently approved in Arizona and that will operate “without teachers.” One question. How different is the model, for real? For a school “without teachers,” they certainly do employ a lot of teachers.

Here's the best way I can describe teaching during the AI storm. I'm in a constant state of developing: 1️⃣ Immediate stopgap measures that get me through the day 2️⃣ Long-term plans for what I think teaching will look like in 5 years. Ideally, #1 would lead me to #2.

I’m reading 3 books right now. 1️⃣ Martha Wells’ Murderbot Diaries series (2017 - Present) 2️⃣ Ishiguro’s Klara and the Sun (2021) 3️⃣ Bornet’s Irreplaceable (2024)

AI comes up at literally every faculty meeting I join. It’s taking up too much oxygen. We need to decenter it so that we can think about it more holistically. The technology is a chance to rethink what we do. But “what we do” should be at the center. Not the technology.

I’m excited for my book to come out soon. Here’s the big vision. Right now, the most powerful (and underrated) use of AI is to finally follow the evidence-based principles we should’ve been following for a long time.

With the rise of AI, the ability to speak and communicate is more important than ever. This semester, my Berkeley students will be building their meta-awareness of the way they speak and communicate. Here’s one activity we’ll be using. ⬇️⬇️

I think Humanities courses will need to teach both lateral reading and vertical reading. Lateral Reading = reading the text and stepping away from it to verify its credibility and compare it to other texts before continuing. Vertical Reading = diving deep to better understand how the text works.