John Warner
@biblioracle.bsky.social
9.5K followers 1.4K following 3.3K posts
Writer, speaker, consultant. Chicago Tribune columnist, blogging at Inside Higher Ed. Coming soon, More Than Words: How to Think About Writing in the Age of AI. Previously: Why They Can't Write and The Writer's Practice. biblioracle.substack.com
biblioracle.bsky.social
This really does seem to be the bet they're making, and not just financially, but with our environment, society, etc. When it bursts we'll see all kinds of people saying in hindsight it was madness, but we can also say this with foresight!
Reposted by John Warner
drewmckevitt.bsky.social
The generous reading is they believe there is no limit to the money worth dumping into any project that could lead to a future AGI, because the AGI will be essentially invaluable. But this is the equivalent of building the world's largest economy around the expectation that the rapture is imminent
biblioracle.bsky.social
We know that LLMs struggle with math. Apparently the people delivering the data centers that power LLMs do too. There's no way these investments will pay off. The bubble of bubbles. futurism.com/future-socie...
AI Data Centers Are an Even Bigger Disaster Than Previously Thought
An investment manager realized he made a crucial mistake — and that his grim prediction about AI investments may not have been cynical enough.
futurism.com
Reposted by John Warner
bcnjake.bsky.social
Or anything, really. Passing your grading off to AI is simply deciding that your students’ work isn’t worth engaging with. And if that’s how you feel, why on Earth are you in a classroom?
biblioracle.bsky.social
I strongly urge everyone to not just read this warning from @marcwatkins.bsky.social, but heed it, and be vocal and forceful pushing back against using AI to grade student writing. This must be anathema if we're going to have a world where learning means something. substack.com/inbox/post/1...
The Dangers of using AI to Grade
Nobody Learns, Nobody Gains
substack.com
Reposted by John Warner
mcsweeneys.net
FREDDIE: I got Geese.

EDDIE: Goose?

FREDDIE: No, Geese.

EDDIE: More than one?

FREDDIE: It’s just the one. Geese.

EDDIE: I’m saying. Geese is more than Goose.

FREDDIE: I’m glad you think so. I tried for Goose, too, but no luck.
Freddie and Eddie, the Nation’s Leading Temu Abbott and Costello, in “Dat Bird!”
“Goose is a jam band. Geese is indie rock. They both have new albums. They’re also both on tour. Confused? We can help.” — New York Times - - -FRED...
buff.ly
Reposted by John Warner
allystewart.bsky.social
I ask my students not to use AI because it means they are not fully engaging with their own learning. If they do, it’s a betrayal of my trust.

I don’t want to use AI to grade student work because it means that I am not fully engaging with their learning. If I do, it’s a betrayal of their trust.
biblioracle.bsky.social
Beck is a weird cat. I think he's sincere in his beliefs, but he has that radio host vibe always working in the background, so the whiff of grift is ever present.
biblioracle.bsky.social
This seems like an odd thing to say given that Charlie Kirk had already supplanted and surpassed Beck as an influence on the right and Republican politics. It's like some washed-up pop star declaring they're going to get Taylor Swift to replace them.
biblioracle.bsky.social
Sign me up for whatever group wants to talk through these problems, particularly if we can use articles like this to help our thinking.
biblioracle.bsky.social
Yes, exactly. It's managerialism imposed on education. Not a new trend by any stretch, but a kind of acceleration of those trends with the potential to get to a very bad endpoint.
biblioracle.bsky.social
Inspired by @marcwatkins.bsky.social and wanting to share his piece with my newsletter subscribers, I put my two cents in. open.substack.com/pub/engagede...
biblioracle.bsky.social
I don't think it's about fucking. We already approve of making babies without fucking (IVF). But the downstream effects of creating biological copies of existing people are profound and thankfully society recognized that those downstream effects aren't worth the scientific accomplishment.
biblioracle.bsky.social
People have to take charge of their own agency to address these questions. I also was once bored with reading the same stuff so I started figuring out how I could change my approach so I wouldn't have to read the same stuff! This is the job!
biblioracle.bsky.social
It's truly infuriating. That institutions have not only invited this stuff in, but are pushing it on faculty and students is a tragedy and that the people doing the pushing don't see it that way is really just a sign of a deep sickness.
biblioracle.bsky.social
I try not to be needlessly confrontational or cruel, but if someone tells me they're outsourcing the work of grading student writing to AI, I tell them they should either stop or quit their job, because they shouldn't be doing it. It's malpractice, and not good for their own long-term happiness either.
Reposted by John Warner
mrneibauer.bsky.social
AI is not a mere tool for helping teachers. When you remove the human element of teaching and learning, you are not being more efficient nor effective. The ability to learn is what makes us human, and humans teaching humans is a fundamentally human act that is necessary for meaningful engagement.
biblioracle.bsky.social
biblioracle.bsky.social
Not only will AI grading be the end of teaching because of the labor dynamics Marc covers, but it kicks off a process of what I call "self-alienation" where the teachers gradually remove themselves from the essential human experiences of their own work.
biblioracle.bsky.social
I've been traveling doing many talks to schools and universities about teaching writing at this time, and I have a few hard lines I draw; AI grading is the biggest one. We just cannot give ourselves over to this. It will be a disaster. It will be the end of teaching.
biblioracle.bsky.social
Marc is open to some of the experiments being done where AI feedback is integrated into a process where the product is ultimately assessed by a human, but cards on the table, I reject this as well. LLMs don't read. Sending students to them for feedback, to me, signals the wrong things about writing.
biblioracle.bsky.social
Educators should look at AI grading of student writing the same way society collectively looked at the prospect of human cloning after the arrival of Dolly the sheep: as an affront to our humanity that must be rejected completely. There is no rationale to support this.