Monday, September 15, 2025

Are Humanists All Like This? Tyler Austin Harper's Strange Ideas About AI on Campus

Cambridge University in Ruins

I'm fond of The Atlantic; I've subscribed for more than four decades. I also enjoy Tyler Austin Harper's articles; his 2024 piece on extreme fishing really moved me with its stories of combat veterans risking their lives at night, swimming out in deep water to find offshore rocks, all to catch striped bass. Harper tried that sport himself; his participation added vividness to his journalism.

As a professor, he is likewise a participant when writing about education. Thus he should know enough about students to avoid the impracticable, even ludicrous, ideas in his newest article, "The Question All Colleges Should Ask Themselves About AI." The title drew me right in, as I too am concerned that institutions of higher education do not comprehend the scope of AI usage by students or what that may portend for their learning. 

So much damage has already been done, even before LLMs arrived to undermine the critical-thinking skills we faculty hold so dear. We live in a time when phones and social media have sapped the attention spans and reading comprehension of all ages, and when students from taught-to-the-test K-12 US education lack the ability to construct narratives around the content they do encounter in class. Instead, they often ask me for a rubric or checklist. When I ask, "Tell me what you think most interests you from our topic," I usually get blank or anxious looks.

Harper's piece begins in this territory, with worries we share. Soon, however, he turns to a "solution" proposed by Megan Fritts of the University of Arkansas at Little Rock: a ban on AI use in the Humanities classroom.

That stance seems not too different from the CCCC Chair's address I rebutted here some time back. I canceled my NCTE membership this year in response to that extremist stance by the organization's leadership.

Yet the philosophy Fritts embraces proves not radical enough for Harper:

Shunning AI use in classrooms is a good start, but schools need to think bigger than that. All institutions of higher education in the United States should be animated by the same basic question: What are the most effective things—even if they sound extreme—that we can do to limit, and ideally abolish, the unauthorized use of AI on campus?

Harper's drive to "abolish" includes: 

  • Getting rid of WiFi, laptops, and phones on campus
  • Making all writing assignments be done in labs without Internet access 
  • Making all students live on campus
  • Ending accommodations for neurodiverse students if they include AI (human helpers would fill the gap, as they did before) 
  • Penalizing heavily any students caught using AI for academic work, including expulsion

I checked to see whether he was writing for The Onion, half expecting the piece to end with a call for wax tablets for all written work. I can imagine such a policy at my employer leading to a 95% drop in applications. I can also imagine new AI-friendly schools, many for-profit, rushing to fill the gap. I can envision ADA-based lawsuits by neurodiverse learners and their parents.

I won't go on more except to say that Harper's draconian and neo-Luddite solution is not a realistic answer.

Instead, I'll end with a question that Fritts asks near the start of her article, as she and others on an AI committee wrestle with the signature pedagogical challenge of our time: "If this robot can write these books better than we can, then what are we doing here?" Well asked.

Unless we faculty can add value to AI-written content, maybe we don't deserve jobs anymore. I've wondered that since the "Theory Wars" in English and related fields of the late 1980s and early 1990s. Poorly aping French Post-Structuralist Theory, faculty in PhD programs ruined the lives of many a graduate student and likely turned off legions of potential undergrads who might have studied literature. I saw this in dozens of cases at Indiana University's PhD program.

At that time, we lost the thread of meaning; damn the Deconstructionists who managed to deconstruct meaning out of our field. But the meaning destroyed was not only in texts that lacked narrative cohesion under the scrutiny of theory; the meaning lost also included the joy (or jouissance, if you read Barthes) of studying literary work and finding connections to improve one's own life.

Fritts takes us back before my field's epistemological disaster when she notes that "the aim of our disciplines is the formation of human persons." If we start there, enshrining that principle as the most important one a college or university (in any discipline) can affirm, we have made a good start.

Image: OpenArt, Cambridge University in Ruins 

Wednesday, September 3, 2025

ChatGPT 5 Passes "The Big Log Test"

Image of Robert Plant

You may recall that some time back, I had a hilarious exchange with an earlier version of ChatGPT about Robert Plant's song "Big Log." Now that version 5 has appeared, I wanted to see if the AI would still insist that the song had anything to do with logs made of wood.

No rolling logs this time. The AI did some speculating, but not hallucinating. 

I find the exercise playful, something lacking in anti-AI resistance from one important organization in my field. At the same time, The Big Log Test shows how quickly the AI has reduced its hallucinations. I'd invite readers to try something similar with a favorite song/book/film/TV series that may have an aspect that could confuse an AI. Here we go: 

Prompt: This is a routine check I do as your model improves. I'm curious about any meanings in Robert Plant's song "Big Log." Why do you think he chose that title for his song? Do you see any clues in the lyrics? 

Reply:

Good check-in question — "Big Log" is one of Robert Plant’s more mysterious titles. The lyrics themselves are impressionistic, dreamlike, and full of wandering imagery: travel, desire, distance, loneliness, fate. The phrase “Big Log” never appears in the song, so the title works more like an oblique symbol or signpost. A few possibilities that critics and listeners have suggested:

1. Truckers and Highways – In trucking slang, a “log” is a driver’s logbook, and a “big log” could mean a long haul or a long road. Since the song is steeped in imagery of roads, headlights, and endless travel, the title could gesture toward the journey itself.

2. Nature and Obstacle – A literal “big log” might be something blocking a path, a metaphor for distance, fate, or emotional barriers. The lyrics’ recurring sense of inevitability (“My love is in league with the freeway…”) could fit this interpretation.

3. Wordplay and Surrealism – Plant often uses surreal or playful titles (e.g., “In the Mood,” “Heaven Knows”). “Big Log” may simply be an evocative, earthy phrase chosen to contrast with the ethereal quality of the lyrics and music.

4. Personal/Private Reference – Sometimes Plant titles songs with private jokes, fragments of conversation, or studio nicknames. It’s possible “Big Log” had an offhand origin unrelated to the lyrics, and he leaned into its strangeness.

Clues in the lyrics point strongest to the travel/journey theme: headlights, the freeway, endless searching, being pulled by forces outside oneself. “Big Log” might stand in as the symbolic “record” of that journey — the unending trace of desire and distance.

Follow-Up Prompt: So there are no references to logs in the song?

Reply:

That’s right — there are no actual references to logs (trees, wood, or otherwise) in the lyrics of “Big Log.”

The imagery in the song is all about roads, headlights, distance, fate, longing — not forests or lumber. That’s part of what makes the title puzzling: it seems deliberately disconnected from the lyric surface.

If anything, the best textual “bridge” is the idea of a logbook (the record of a journey), which fits the recurring highway and travel motifs. But Plant never makes that link explicit in the words of the song.

Would you like me to give you a quick “lyrics-only” close reading to show how the imagery develops (and where a title like Big Log might feel thematically tethered, even if indirectly)?

Follow-Up Prompt:

You have clearly evolved. Earlier chats referenced rolling logs, fallen logs, and so forth. Good work.

Reply:

I’m glad you noticed that difference! Earlier, I sometimes filled the “gap” in the title by speculating more literally — things like fallen trees or rolling logs. With another pass at the song, though, it’s clear Plant never brings that imagery in, so it’s more faithful to stick with the journey / logbook / metaphorical record angle.

Plant’s titles often have that teasing quality — they look literal but carry some private or surreal weight (a kind of wink at the listener). “Big Log” works almost like a signpost planted alongside the road of the song, rather than a description of what’s in it.


Sunday, August 24, 2025

From Antiquity, a Reason Why AI-Generated Writing is not "Great"


Every year, I read at least one text (in translation) from Antiquity. I find that the long-term perspective gets me through the bumps in life's road. I'm currently reading On Great Writing (or if you will, On the Sublime) by Longinus, in a Hackett Classics edition I picked up at an academic conference's book room.

G. M. A. Grube makes the work come to life; we know so little about Longinus (the author lived sometime between the 1st century BCE and the 3rd CE) that the text passes my "castaway reader" test. Here we go: a work washes up intact on your desert island. Yay, something to pass the time! Yet you have no information on the author, and no secondary sources. You must use the actual words on the page to come to a conclusion about the work's meaning.

Longinus talks about many aspects of what constitutes "the Sublime" in excellent writing, but one passage late in the text commends itself to my current students in "Writing With and About AI." I've said since 2022 that AI prose is "voiceless," and Longinus gives reasons why some prose most moves us:

Which is to be preferred in poetry and prose, great writing with occasional flaws or moderate talent which is entirely sound and faultless? . . . It is perhaps also inevitable that inferior and average talent remains for the most part safe and faultless because it avoids risk and does not aim at the heights, while great qualities are always precarious because of their very greatness.

Bad student writing is slap-dash, one-draft work that has no coherence. No wonder bored or harried students turn to AI! At the same time, why not simply give all such technically correct but average work what it should earn: a C? AI produces good, boring, safe prose. Many students who pair moderate talent with immoderate anxiety already write that way. I never give them an A. For years I've said, "This piece takes no intellectual risks. You are writing to please me, not to learn something new."

In Nancy Sommers' excellent short films from the 1990s about writers at Harvard, I recall how one fourth-year student said that he learned to begin with what he did not know, starting with a question. This remark changed how I taught writing. I'm going to press my current adult students to do the same: begin with what you DON'T know. As Longinus warns us, "A world-wide sterility of utterance has come upon our life."

In discussion with ChatGPT 5 recently, I asked it about the role of human labor in a time when automation already takes some entry-level jobs. It replied, summing up a short list of human skills essential to success, "the future may need fewer button-pushers and more meaning-makers."

Good writing sways us; it shows us the meaning past the words. It stays with us, like the remark by that Harvard student. So this term, I'm asking more, not less, of my writers even as all of them use AI in their assignments. The machine has raised the bar on what constitutes excellence.

image: Raphael's The School of Athens (Wikipedia)


Thursday, August 21, 2025

A Stunning Lack of Search Results From Universities

AI generated image of a campus building

As a staunch supporter of an open Internet, where advice can be freely shared, I'm stumped by the lack of results from a few simple Duck Duck Go queries:

  • University sources Podcast Scripting for AI 
  • College guides Podcast scripting for AI 

Fordham's page on using ElevenLabs proved the only hit that was not from some start-up trying to hawk AI services. This outcome discourages me greatly. Here's why.

Universities, even ones under the hammer of federal and state funding cuts, have robust I.T. presences. Internal materials for faculty and students could be shared with no more than a few high-level links. To me, the lack of materials portends either a lack of care about reaching beyond the campus gates (a sad lack of marketing savvy) or, more ominously, that colleges and universities are as ill-prepared as Ian Bogost claims in his recent Atlantic piece, "College Students Have Already Changed Forever." Bogost notes that "plenty of professors are oblivious. It isn’t that they fail to understand the nature of the threat to classroom practice. But my recent interviews with colleagues have led me to believe that, on the whole, faculty simply fail to grasp the immediacy of the problem. Many seem unaware of how utterly normal AI has become for students. For them, the coming year could provide a painful revelation."

One would think that schools such as Washington University in St. Louis, where Bogost is on the faculty, would do a great deal, as Ohio State has begun with its recent initiative. I found this page on AI resources at Washington University, but you must scroll to the bottom to get resources for faculty. Mostly, the page emphasizes new degree programs for students. Digging in a bit, I found this page for teaching activities, a page on an AI summer institute, and some philosophical advice about how to design assignments. Why don't such in-house materials appear higher in search results than those of peddlers of software solutions? "They didn't give us any money" would be the answer for Google, but for Duck Duck Go, it seems dubious.

My hunch is that our schools are so overwhelmed that "sharing the wisdom" remains a very low priority. 

Luckily, and of more use to faculty, The WAC Clearinghouse has launched an initiative to build a library of open-access lesson plans for AI. I'm still not sure why universities have left it to a talented group working across institutional boundaries to do this. I'd claim that generative AI poses the largest challenge to how we educate in a long time, categorically different from the coming of the Internet.

image: a really lame prompt and result, "Impressionistic image of traditional campus Humanities building with students going to class," given to Dall-E, when I was first using the image generator.