5 Comments
Timothy Burke (author)

I'm with you most of the way here. I agree that the standard is not "have an original thought", at least not until a student has committed to being a fully professionalized scholar who has to make some claim to pushing the state of knowledge forward. But the point is that our pedagogy often considers student work in relation to the thoughts that have already been thought--sometimes accidentally, sometimes on purpose.

So yes, sometimes we really are able to have a kind of clean room where we all read a significant work, all thoughts are welcome, and it doesn't matter if they've been thought before. I had a marvelous experience this last fall with a number of discussions like that--some students who'd never read Genesis before and had no real familiarity with Christianity offered some really interesting initial responses to what they were reading, for example. Since we weren't reading it in a class *on* Christianity or scripture, I didn't need to hurry students along towards established theological knowledge.

But a lot of the time, we're eventually going to be trying to synchronize what students write and think with what has been written and thought. Imagine trying to go into a 'clean room' class to think from first principles about kinship, where all thoughts were welcome and none were burdened by needing to be either faithfully referenced to existing knowledge or original arguments about existing knowledge. Where would we start? First, perhaps, from personal experience. We could share testimonies: this is what family and kin relation mean to us; this is what we hear other people saying. From there we could work outward to source material: kinship in literature, in religious scripture, in laws and policies, in popular culture.

But at some point, if the goal at the end of 14 weeks is for students in the class to be conversant with the way kinship and family have meaning in history, sociology, and anthropology, I'm going to have to cross over to the scholarship. At that point I'm going to be producing a history of what the terms have meant and how they've been put to work, along with an account of how that history has led to present projects and interests. Knowing that this is where I'm going puts a pedagogical pressure on me that will intrude on the open-endedness of the beginning of the course, and it gives students a foreknowledge that there are going to be some "right" answers in the back half of the course that the first half has to lead towards--which is going to make them reluctant to think from first principles in an unguarded way. It may also leave everyone feeling: well, why did we muck around with thinking this through as if there were not already a lot of thinking going on? I think it would be really valuable to do so--that's my friend's proposition about teaching biology--but I think it's hard for many people to embrace doing it.

May 18, 2023 · Liked by Timothy Burke

Does it matter if we think of something already thought before, so long as we thought of it and it's a good or an interesting thought? Trying to be absolutely original in a world with thousands of years of thinking (including all the people who thought but wouldn't or couldn't write down what they thought) is not a viable endeavor. What I wanted my students to do was to experience the joy of thinking--not to my destination but to theirs, whatever that might be. How does AI help them make that effort? Not convinced.


Fifteen-year-old boys attending Adam Smith's Lectures on Jurisprudence* at the University of Glasgow in 1763 were not expected to produce anything original. Indeed, the very idea of originality would have been considered quite presumptuous on their part.

They were expected to get up at least a smattering of Greek and Latin in order to produce a plausible imitation of the ancients.

*Smith on historical jurisprudence really is quite brilliant, no matter what you think of his 'invisible hand'.


I like the ambition of a course--or a thematically coordinated semester of interdisciplinary courses--devoted to different ways of talking historically, contemporaneously, and perhaps in futurity about 'Why awful things happen to good people' or 'Why do we say the sun rises every morning? Why does it still make sense to us? Why do we assume it's certain?' I think what's hard is the encounter with still-open, imponderable questions. That's the virtue in teaching absurdly anachronistic substantivist logic: if the floating thing isn't wood, it's 'A duck!' or 'King, because God made him our divinely ordained Father and Vicegerent (and, as a manifest sign of his Election, he's not covered with sh** like the rest of us. And he has huge tracts of land).' The main problem I foresee with the approach is that even inspired, inquisitive students want to feel like they go away from their big academic excursions with solid takeaways: methods, approaches, ways of seeing--if they're courageous enough not to insist on lots of GRE-bits of bankable knowledge.

And *that's* the misery, now as it has always been. Parents, prospective employers, legislators, and administrators all want the minted validation of many procedural, testable, numerical proofs that students came away with factual dividends from their substantial pedagogical investments. In their _New Rhetoric_, Perelman and Olbrechts-Tyteca address the community-endorsed sovereignty of teachers over students within a page of explaining how popular demagogues operate. I worry that the university will become a life-determining exercise in working with occupational advisors on how to tweak AI prompts to suit different professional tasks and audiences. What does it matter to a kid who gets a wild idea about what Adam Smith is suggesting in one of his essays on moral sentiments if she's told that all she has to do is frame her parsing of a paragraph in a certain way, and the Chat will give her all the permutations of possible meanings, only suggestively ranked by prevalence?
