A recent article in Times Higher Education on the dire state of the UK job market in English studies is just the latest in a long line of pieces lamenting the plight of junior humanities scholars on both sides of the Atlantic.
The article takes to task the senior scholars who tell precariously employed juniors to just hang on and wait for the supposedly inevitable permanent position to open, accusing them of being out of touch with what are assumed to be historically low odds of landing such a post, in an era assumed to be historically unappreciative of what the humanities offer.
It is true that those who land permanent jobs are often blissfully ignorant of the long odds. But I have heard similar complaints about the decline of the humanities since I was a student in the 1960s. The supposed lost golden age never existed.
I graduated in 1970, but the intellectual emphases of the 1960s were already clashing with pressures to major in business, engineering, pre-law or pre-med. I chose to defy my parents and do a PhD in history rather than attend law school despite knowing that an academic “jobs crisis” had persisted through much of the 1950s and that the 1960s boom, driven by expanding enrolments, had weakened dramatically. Faculty advisers were open about the gap between numbers of graduating scholars and posted jobs, particularly in the humanities.
Sure enough, when I received my doctorate in 1975, I faced a dire jobs market. I am still awaiting rejection letters for positions I applied for then, as well as in 1980 and even later.
But I was lucky. A new public institution, the University of Texas at Dallas, was hiring more than 120 new faculty in the arts, humanities and social sciences, a condition of the legislation that had converted the former research centre for Texas Instruments into an initially science-only university less than a decade earlier.
Most of those hired were fellow new PhDs, a handful with one or two years of postgraduate teaching experience. Few of us even visited the campus before relocating; I was hired after an interview in a hotel room at Toronto airport with the founding dean. Given the lack of positions elsewhere, we didn’t have much choice – and the university found itself the employer of an extraordinarily talented band of scholars. With tongue only partly in cheek, one Princeton economist observed: “Aren’t we all someone famous’ best student?”
By contrast, all but one or two of the handful of tenured professors among the founding faculty had been denied tenure at previous institutions. Conflicts of rank, generation, talent and attitude were acute. Not surprisingly, the new PhDs were both more suited to and more enthusiastic about the purported interdisciplinary orientation of the newest university on the block – another concept that has a much longer history than its modern rediscoverers suppose.
But this commitment to interdisciplinarity was only slogan-deep. A handful of us worked across the university, but it did not go well. As a quantitative social scientist and “new” historian, I was hired by arts and humanities but initially housed with social sciences, as I wished. But at the end of the first year, the provost ordered everyone to “return to where you are budgeted” for the accountants’ convenience.
In reality, the administrators – most of whom had limited qualifications and experience for the task at hand – embraced interdisciplinarity only as a budget-saving measure, obviating the need for departments with chairs, offices, staff and funding for separate programmes. The rhetorically misnamed “neoliberal university” actually came into being at the end of the Second World War, not in the 1970s, 1980s, 1990s or 2000s, as is commonly supposed.
No one in charge had any conception of the founding student populations, either. These largely consisted of military veterans and college dropouts, especially women returning to college after their children grew up or first marriages ended. Not surprisingly, then, the gap between course offerings and student interests and understanding was massive. It is no exaggeration to say that the three ethnomusicologists we hired outnumbered the students who knew what the word meant.
Most of us were younger than our students, too, which didn’t always make it easy to convey scholarly authority. I realise that younger scholars today cannot dream of landing a tenure-track job at the age of 26, but our employment was not at all secure.
Mandatory third-year “probationary reviews” were a massacre; few of us were informed beforehand that the Texas state system allowed an assistant professor to be terminated without a full review before the end of three years. Some colleagues were fired because they intimidated their “senior” colleagues, others (including the ethnomusicologists) because their courses did not attract enough students.
Some of those dismissed found satisfying positions at universities elsewhere. Others dropped out of academia. But almost every one of those with whom I kept in contact found successful ways to use their knowledge and skills more or less directly, in fields such as philanthropy, congressional research and non-profit advocacy.
Understanding their paths should inform any efforts to rethink graduate recruitment, education and preparation for a range of careers.
Harvey J. Graff is professor emeritus of English and history at The Ohio State University and inaugural Ohio Eminent Scholar in Literacy Studies. This essay is part of a book-length project, Reconstructing the ‘Uni-versity’ for the 21st Century from the Ashes of the Multi- or Mega-versity.