When I first heard the term “alt-ac,” I thought of supplementary insurance and spokesducks. It actually stands for “alternative academic” careers, but the comparison wasn’t too far off, really: For new Ph.D.s and those just now limping to the ends of their dissertations, many disciplines just released their first (and “largest”) crop of job ads for open professorships beginning in fall 2015. In a lot of fields, almost all open tenure-track positions for next fall will be advertised in the next few weeks. And this year’s minuscule list shows that if any profession needs supplementary insurance, it’s academia: Future prospects for the profession of professordom have gone from bleak to cadaverous.
Hence alt-ac, in which individuals with doctorates work not as university professors, but in archives, libraries, think tanks, nonprofits, museums, historical societies, “journalism” (ahem)—even within academic departments as, for example, digital technologies specialists. The idea is that “alt-academics” use their knowledge and skill set—the ability to research, write, teach, present, or even just be smart and diligent—to, you know, do a job.
How is this, you might be wondering, in any way noteworthy? Isn’t this the way the world works for most people? Well, the dirty secret of academia is the shame drilled into many doctoral candidates at the very notion of working outside the academy. In many disciplines—most prominently in the humanities—the holder of a remunerative job, if that job is anything other than a tenured professorship at a research university, is still viewed from within the culture as an abject failure.
Those of you outside academia—or in disciplines where Ph.D.s working in the private or nonprofit sectors are commonplace—might again be shaking your heads: How can Ph.D.s finding jobs be remotely controversial? Even though I have a background in the humanities—and I, at the conclusion of my fourth unsuccessful postdoctoral job-search year, felt like an acute and abject failure—I, too, recognize the alt-ac stigma as ass-backward.
It’s a shameful carryover from the good old days of the Ivory Tower, I guess, like elbow patches and the seduction of students. To this day the success of a tenured research professor is judged in large part on his or her graduate “placement rate,” where “placement” means replication, in a tenure-track professorship, preferably at an institution with a graduate program, ad infinitum. What could possibly be wrong with this system?
To their credit, quite a few senior professors find this system abhorrent. Michael Bérubé is a full professor at Penn State and former president of the Modern Language Association, and at a recent alt-ac-themed symposium (at which he and I were both invited panelists), he recalled the brouhaha when one of his former advisees decided to leave her tenure-track job—which, he said, she “despised”—to teach in Chicago public schools. Bérubé thought it was “great,” but his colleagues were aghast at the damage her career choice would do to his placement rate. “It’s not about my placement rate!” he remembered saying. “It’s not about me. It’s about this person’s life.”
Interestingly enough, Bérubé said, the anti-alt-ac “pushback” can also come from grad students themselves, many of whom very badly want the 10 or so years they’ve spent (often at great sacrifice) in service of “the institution” (as academics so tellingly call it) to result in that institution deigning to let them join it. In 1998 the mild suggestion, to the Graduate Student Caucus of the MLA, that Ph.D.s consider careers outside the tenure track was seen, he said, as “a betrayal.” So, too, was the American Historical Association’s declaration that careers in museums, historical societies, and nonprofits no longer be considered “Plan B.”
However, while the scowling disapproval of a Ph.D.’s graduate cohort or “doctor-fathers” (as my discipline, again tellingly, calls dissertation committees) may discourage career exploration, that poor schmuck still has to house, feed, and clothe herself, no matter how many letters she has after her name. And increasing numbers of Ph.D.s are realizing that the now-standard career track—adjuncting, often for less than $20,000 per year—is, in all senses of the word, a poor way to go.
As much as we well-meaning souls counsel against going to graduate school altogether, the fact remains that scores of budding scholars simply will not be dissuaded. And although it’s still quite rare, some institutions are finding ways to prepare these scholars for employment, albeit not the employment they initially expected or wanted. For example, Penn State—whose Center for American Literary Studies hosted the alt-ac symposium—has started a graduate internship program, run by Christopher Long, a philosophy professor and associate dean.
This program places Ph.D. students in paid non-teaching, non-research internships in offices across the university so that when they tiptoe onto the carcass-strewn job market, they will be in possession of the one thing that many of their competitors lack: work experience doing anything. “It’s much easier to get a job working in the Office of Undergraduate Admission if you’ve worked in that office—or an office—before,” explained Brian Croxall, a recent Emory University Ph.D. in English who now works in that school’s library as a digital humanities strategist. Yes, to a normal person this is painfully, painfully obvious—but you have to understand that in some areas of academia it’s anathema.
Instead, the backward and utterly perplexing way to “prepare” humanities Ph.D.s for the working world has, until now, been to keep them (often by mandate) from amassing any work experience not directly related to research and teaching—indeed, it’s been to scoff openly at said experience as being unworthy or distracting. “I wasn’t supposed to have an outside job,” explained Croxall of the strings that came attached to his $12,000 annual graduate stipend (and indeed, my own stipend, a comparatively plush $15,500, stipulated the same). “Graduate school was my job,” he said, “so—no moonlighting, please.” This is a standard expectation across the United States, though I don’t know a single grad student who hasn’t tutored, tended bar, or sold plasma to make rent.
The academic aversion to real-world work experience has even leaked into the professoriate itself: stellar job candidates are dismissed outright precisely because of their years of relevant experience as adjuncts, in favor of brand-spanking-new Ph.D.s with boundless “potential.” Indeed, academia is the only profession I can think of (besides, perhaps, the world’s oldest) in which experience counts against you.
And thus the transition into a world where—thankfully—the opposite is true can be downright foreign to academics at all levels. Some aspiring alt-academics at the symposium explained that they’d been applying for alt-ac positions for years and weren’t able to land one—but they’d been applying in the only way they knew how: sending in a shiny CV and hoping their “potential” spoke for itself. In the alt-ac universe, you enter a career by having serendipitous encounters, making fortuitous connections, or taking on small, part-time contract work and proving yourself—like a normal person.
And yes, I realize that to a normal person this is mind-bendingly obvious-sounding. But to academics, it’s not, because they have been socialized in the opposite direction. If you have always wondered why your 12th-year philosophy grad student cousin seems a little off, living in a completely backward universe is but one of the many reasons why. Will the continuing cratering of the professoriate as a viable career (which now, make no mistake, is by no means limited to the humanities) usher in an alt-ac revolution? All it would take is for academics, even the old-guard tenured, to embrace the notion that a job is a goddamned job.