The wheel turns again, as it has almost every August of my life. Once again, back to school.
And once again, I find myself, as every aging person does, mystified by change despite the fact that I am professionally trained to research change over time.
It is all the more mystifying at times when so much in a life is held constant. I’m getting close to six decades of life on this planet. I have been married for nearly forty years to the same person. I have been studying history as a subject since I was an undergraduate and been intellectually interested in southern Africa since I was in high school. I have been focused on developing my ability as a public speaker since I did competitive debate and speech events in high school, or arguably earlier in terms of a family culture that privileged competitive conversation and debate. I taught my first class as a teenager (to a group of pre-teens, about how to play Dungeons & Dragons). I’ve been an avid reader since I was first able to read. I’ve collected comic books since I was eight and I still buy them. I’ve played video and computer games avidly for the entire existence of that media form: the only significant computer game that predates my history of playing them is Spacewar! and not by much.
I can readily explain the empirical details of many of the changes I’ve seen. Many of them I quite like. Playing Baldur’s Gate 3 feels like I am living the good life when I compare it with something like Wizardry. I know why my waistline has grown and my knees ache. I know what I still feel confident about and what has come to make me anxious and why. (Mostly; what remains is less history and more psychology.)
I’ve been working at the same institution for thirty years, in much the same role. I find myself this fall once again chairing my department (thankfully, co-chairing this time, and hopefully only for the next three semesters) and serving on an elected faculty committee that meets frequently. It’s that working constant that feels the most full of mystery to me, not just locally but across all the institutions in my profession.
I just had an article accepted for publication, which excites me—it was energizing to write and the editors were lovely and helpful. The extremely formal final submission process included an option to make the article open-access, provided I was willing to pay the publisher $2,500, or roughly half a dollar per word. How it happened that we overcame the extensive overhead costs of print and yet ended up in a situation where universities and colleges subsidize faculty to produce knowledge that is then given away for free to publishers, who make considerable money selling it back to the universities while depending on the free labor of peer review to make it valuable, all while restricting the dissemination of knowledge whose authors benefit only from the widest dissemination possible, is actually something of a mystery to me. I mean, I can recount the history exactly, and I have conventional explanations and causes for it. But why was academia unable to engage in a completely obvious and totally plausible sort of collective action to create digital platforms via a massive consortial infrastructure that would completely cut out the expensive middleman? I don’t know. I only know that the opportunity is now long in the rear-view mirror.
I am far more mystified at the more pervasive formalization, stratification and convolution of academic life. I can trace certain changes in the labor model dominant in academia easily enough, and they’re explained, depressingly, by the spread of a certain kind of extractive austerity not just in academia but across the entire world—in the case of academia leading to the elimination of many tenure-track jobs in favor of poorly compensated or insecure contingent teaching labor.
But other changes that are tangible, that I always can feel and sense most acutely at this moment of the year, as the cycle begins anew? They baffle me. I don’t even know that the people making the changes are aware that they are changes. Sometimes they even deny that anything has changed, and at least sometimes they’re sincere about it. You lose continuity and memory quickly in some parts of a university or college: all it takes is a turnover of all the staff in one office, or a generational cohort in a department to retire all at once, and poof! it’s all attack ships on fire off the shoulders of Orion, tears in the rain.
But even the motivelessness of change is itself a change. And I’m not imagining that changes were once explained and rationalized far more thoroughly, back when I first started my work as a professor—I dug out some of the oldest layer of emails and memos that I’ve saved, now more than 20 years old, to check. (Historians are natural pack rats.) Nor am I imagining that faculty were once seen as stewards of the whole institution and were trusted in that role, that once upon a time leadership thought “Hey, a bunch of people with Ph.D.s, they’ll have some good ideas about this new thing we want to do and they’ll add some important data or information to it”. Somehow stewardship became partnership became participation became stakeholding became human capital became talent retention became something to manage and nudge and constrain became something to keep away from most decisions.
I can trace how it happened and when it happened but I’m not sure I have any understanding of why. I don’t think it’s austerity and I don’t think it’s some kind of naturalized will-to-power. I can’t rationalize it easily in terms of money or authority. I don’t know if this is about solving a problem in someone’s eyes because I’m not clear what the problem is or was in the minds of people introducing changes. A lot of things used to work pretty damn well when they were informal and implicit, and work less well in various ways when they’re not. (More labor time, less optimal outcomes, less clarity despite the formality, and anything outside the procedure becomes illicit and indescribable despite still determining most outcomes.)
Maybe it was thought that consistency and formality were the road to equity and diversity? If so, that’s not how it turned out, for the most part, but many of the moments in this shift also didn’t seem to come from people who had that goal in mind. Maybe it’s just imitation of organizational cultures that come from somewhere else? But most of the evidence I’d need as a historian and as a person in this world is absent, invisible, locked up tightly. I’d need to have been there when a person from another context turned to a person with authority inside this context and said “How is it that you are still doing X here?” and heard the reply “I don’t know, we should change that.”
Years ago, I met a former president of my undergraduate institution at a dinner and he was frank and voluble about how things were and why decisions were made back then. I met a former president of my current employer once at another dinner and he was the same, full of rich and honest insights into how the institution traversed his time in leadership. I travelled a while back in a group that included the president of a small liberal-arts college who gave me great insights into the whys and wherefores of the decisions they’d made. On the other hand, I have a professional friend, a frank and keen observer of academic process whom I worked with a great deal when she was a professor, who moved into academic leadership. In stages, she disappeared as an accessible, speaking subject: busy, yes, far away, yes, but I suspect that even if I were a next-door neighbor who was the soul of discretion, I’d find that the world she now operates in has a profoundly structured opacity that no individual character can override or circumvent. I feel as if across all of higher education, the door has been slammed shut in the face of faculty, students and mid-ranking staff: no discussions, no explanations, just announcements and abstracted, formless, distanced management of a workforce or a customer base.
It is when change seems unnecessary that older people feel most disoriented, alienated and angry. When it seems like deprivation and degradation that is on purpose but to no explained end, when it feels like things are being taken away and new demands are being made for no acknowledged or rationalized reason. When information and clarity concerning changes and decisions are seen as risks to be avoided. When I can’t excavate and analyze change either with my professional skills or my lived-in knowledge, I feel a kind of agitation and confusion, an unwanted recursion to the toddler’s incessant question: “But why?”
Courtesy of Paul Musgrave’s Substack Systematic Hatreds, I read with great interest another Substack essay by Anton Howes about the persistence of historical myths in historical scholarship. And it did make me think a little bit about the dangers of my feelings about change in what I have imagined to be a steady and unconstraining life.
Howes points out, using a number of examples, that many historians uncritically repeat as fact something that turns out to have no factual or evidentiary base when you trace it back to its original appearance.
He’s completely right, and I can think of many examples I’ve run into in my own career. At one point, I myself repeated the idea that a film audience in Paris ran in fright out of a theater when viewing a film in which a locomotive comes toward the camera position. It never happened, or at least there’s no evidence to suggest that it did. That the story has been told so persistently is evidence of something historically important—that modern audiences were trying to process the mimetic power of film while also complimenting their own sagacity as viewers, a point I made in an essay on colonial cinema in British-controlled Africa.1 But that requires firmly recognizing the myth of the French audience as a myth, and it’s been a hard story to fully kill off. I see it flare up now and again as historical fact.
More subtly, I’ve been struck in repeated engagements with Frederick Lugard’s (in)famous Dual Mandate in British Tropical Africa that many scholars who’ve referenced it in the last forty years have taken for granted that it was an accurate description of the intent and character of the British administrative state in imperial Africa and that Lugard had been an important builder of those structures. But dig in deep and there’s a lot that doesn’t work about that reading: Lugard was never as important as he took himself to be in Nigeria; multiple subordinates of his reported in print that Lugard was a humbug who excelled mostly at taking credit for things that other people did and then at having an acolyte who did a marvelous job overselling the magnitude of his influence. Lugard’s account is full of contradictions, and not merely in the famously convoluted final chapter, and yet it also doesn’t describe at all the intensely contradictory nature of British approaches to colonial administration right under his nose, in Nigeria itself. Lugard’s account isn’t a plan or a proposal: it’s describing what had already happened, much of it elsewhere and without his input or involvement. When I read the book with students years ago in a class where we were testing approaches to annotating primary sources (way pre-Perusall or Hypothes.is), they were all dismayed on reading some scholarly writing that cited the text because they realized that much
While I’m not a huge fan of the entire historiographical infrastructure that was built up around counting the numbers of people taken from sub-Saharan Africa into the Atlantic slave trade, the genesis of Philip Curtin’s initial entry in that subfield2 provides another pretty good example of this sort of myth-uncovering. Curtin went looking for the source of the common estimates of the scale of the Atlantic slave trade that were in circulation in scholarship in the 1950s and early 1960s and found that they all traced back to what was basically a random guess, based on no research at all. That discovery impelled him to do a lot of research to repair the oversight, which in turn spurred a lot of detailed work to challenge and extend his numbers.
There are subfields of history where this kind of thing is especially common—or perhaps more accurately, there are bodies of common historical understanding that are largely unaffected by huge changes in the scholarship since those understandings took hold, or that really don’t register just how thin an evidentiary base they rest upon. A tremendous amount of Roman history depends on texts written decades or centuries after the events and personalities they describe, by writers who had various axes to grind/asses to kiss, which I don’t think the general public realizes. Graeber and Wengrow’s The Dawn of Everything, I think, does a good job of rounding up a large number of venerable interpretations of early human history that are based on incredibly thin or refuted evidence.
To bring this back to that melancholy crying in my beer about change, the danger of those feelings (mine and others’) is that they’re based in some cases on myth. It’s why I feel the need to root around in my own archive to reground my impressions. Sometimes when you check with other people, you find that they experienced a completely different sort of institution. I know that many women of my generation and the generation that preceded us at my college and many others found the institution as it was when they arrived to be stuffy, confining, menacing or overtly discriminatory, and that the more regulated and formalist institution that has developed since at least offers some kind of structural mechanism for recognizing problems and remediating them. That is, they feel that informality was the cloak that let many men get away with all sorts of bullshit.
Sometimes you find that you had information not because it was being generally shared with everyone but just because you were nosy—or in my own case, sometimes just because I’m a good guesser and could use inference to suggest an interpretation of a new procedure that then met with surprised confirmation from someone in the know.
I’ve certainly run into faculty here and at many other institutions over the years who hold tight to a determined version of the kind of myths that Howes is talking about, a sort of folk understanding of what happened or why it happened that is simply wrong. More than a decade ago, I was sitting and talking with colleagues when one of them bitterly described a decision that he didn’t know I’d been involved with, and his description of the meeting where it was made was entirely fictional in every way possible. Surprised (it was an atypical burst of anger for this person), I asked where that had come from. Well, he’d heard it from this person (who had not been in the meeting), who had heard it from that person (who had not been in the meeting). When I happened to talk to That Person, they said they thought they’d heard it from me but couldn’t remember when—and That Person was someone I ordinarily never had conversations with and had never said anything to about the decision or the meeting. Well, said That Person, I must have heard it from someone else. And that’s as far as we could go with this.
The problem as always is that if you brutally dispense with anything that might be mythical—that is, in Howes’ sense, not replicable—you end up dumping a lot of historical knowledge but you also end up with a kind of dully positivist approach to the archive where you limit yourself only to what is explicit, detailed and documented and eschew anything that takes interpretation or inference. (It’s telling that Howes is focused on quantities, where absurd exaggerations and inventions may be easier to demonstrate, and not on much fuzzier kinds of information and evidence.)
In your own life, you also end up vulnerable to a kind of pervasive gaslighting, unable to say anything that isn’t transmitted in writing or contained in a report (and official communications and reports are, if you’ve been on the inside, often fictions of a particular kind, leaving out a great deal) or that you didn’t experience yourself. (And if you did experience it behind the almost omnipresent veil of confidentiality, you can’t say anything even if you do know what actually happened.) This happens in families, in neighborhoods, and in institutions, even institutions supposedly dedicated to inquiry and truth. You can’t ruthlessly purge anything potentially mythic, and yet you have to embrace a personal skepticism about personal knowledge. Things change, but the inclination to see change as loss and deprivation is as much rooted in the inevitable decline of our bodies and minds as it is in what has really happened in the world. Somehow you have to find a way between being perpetually outraged by a sense of being victimized and being perpetually bamboozled when you keep being told that everything is just as it has always been. I don’t think I’m saying anything radical when I suggest that one cure for that lies not with the person trying to find the balance between the two but in increased transparency and accountability from the people involved in making changes. It gets a lot easier to have a focused, discriminating and proportional conversation about changes, necessary or otherwise, when you get rid of vast informational asymmetries.
Burke, Timothy. “‘Our Mosquitoes Are Not So Big’: Images and Modernity in Zimbabwe.” In Images and Empires. Berkeley: University of California Press, 2002.
Curtin, Philip D. The Atlantic Slave Trade: A Census. Madison: University of Wisconsin Press, 1969.
To the question "But why was academia unable to engage in a completely obvious and totally plausible sort of collective action to create digital platforms via a massive consortial infrastructure that would completely cut out the expensive middleman?": physics more or less has, with what is now arXiv.org, which is now 32 years old, having been launched contemporaneously with, and independently of, the web. These pieces by its creator Paul Ginsparg marking the 20th (https://www.nature.com/articles/476145a) and 30th (https://www.nature.com/articles/s42254-021-00360-z) anniversaries offer some insight into the change that has been achieved and the unexpected challenges of running such a service.
Of course it hasn't killed the journals; there are several more of them now than when arXiv began. One thing that struck me the last time I looked at the financial section of the American Physical Society's annual report is that its journals (which are high-prestige) are the only money-maker in the budget: the conferences it runs more or less break even, and membership dues don't even cover the regular member-services functions. But the publications bring in about $11M more per year than they cost to produce. So there is a way of looking at it in which the APS has arranged for university library budgets to subsidize, e.g., its advocacy and outreach activities.
There hasn't been a similar open preprint archive in chemistry, which would nominally seem to be a similar discipline. In fact, the chemistry journals are historically far more restrictive about sharing articles and preprints. The main difference, as I understand it, is that there is no physics industry the way there is a chemical industry, which pays whatever the American Chemical Society asks for its journals.
"I only know that the opportunity is now long in the rear-view mirror."
I, like Tom, came here to say that this is wrong. The good fight on this issue is now. bioRxiv is relatively new, for example, and so are things like EU open-access mandates and efforts to create overlay journals and other models that not only produce freely available scholarship but eliminate the absurd overhead that journals extract from us.