This article is part of Future Tense, which is a partnership of Slate, the New America Foundation, and Arizona State University. On Wednesday, April 30, Future Tense will host an event in Washington, D.C., on technology and the future of higher education. For more information and to RSVP, visit the New America Foundation website.
It’s no secret that the humanities—literature, history, philosophy, the foreign languages—are suffering a precipitous decline in higher education. But hark! The digital humanities are here to rescue the field—or maybe just kill it off for good.
Some 10 percent of humanities scholars currently self-identify as digital humanists, which is either an alarmingly large encroachment or a too-modest development, depending whom you ask. As such, digital humanities is the consummate academic hot-button topic: Everyone has vehement opinions, but few actually know what they’re talking about.
So what is “DH,” as the academic cool kids call it (and yes, “academic cool kids” is a misnomer)? Should everyone writing a Chaucer dissertation learn how to code, and if so, why? Will DH be the Facebook of the academy—or its Pets.com?
The field itself isn’t actually new. According to Roopika Risam, assistant professor of English at Salem State University and co-founder (with Richard Stockton College assistant professor Adeline Koh) of the project Postcolonial Digital Humanities, it is the current incarnation of humanities computing, which has been around since computers were the size of a room.* Although the definition of DH is contested, the field basically covers three main areas:
First, digital humanists use computational technology as a research tool. Sometimes, digital humanities is as simple as putting an old text online for everyone to see—like this scan of a book by the obscure language philosopher Fritz Mauthner, which saved me about a month of headaches at the library back when I was writing my dissertation. Or take historian Michelle Moravec, an associate professor at Rosemont College who works “with machines in ways and on scales that my brain cannot” to analyze, for example, how suffragists talked about gender.
Second, DH makes contemporary humanities research publicly available. This is quite the little act of rebellion, as traditional scholarship is limited to a handful of staggeringly expensive journals with laughably small readerships.
And, finally, some digital humanists treat technology itself as a primary source, studying the relationship between culture and technology. Jesse Stommel, an assistant professor at the University of Wisconsin–Madison and director of Hybrid Pedagogy, calls it “using humanities tools to investigate” technological issues. Think applying Judith Butler’s gender theory or philosophy to, say, the #aftersex Instagram trend (and I call dibs on that paper, by the way).
But, warns Koh, “Digital humanists shouldn’t try to be computer scientists” just to seem relevant in today’s tech-obsessed academy. And indeed, it would be tragic—and probably not actually effective—if every English department in the country forsook the classics for coding. As Moravec puts it: “The idea that if somehow humanities can become more ‘like’ the sciences we will be fabulous forever is an absurd fallacy.” What makes sense, she says, is using digital technology to help people outside the humanities understand the relevance of what, exactly, it is we do all day. That is, to “disrupt” the academic status quo.
Traditionalists always fear that the latest humanities vogue will defile poor Shakespeare’s corpse. To be sure, for academics, DH does bear some resemblances to Silicon Valley, which many of us regard, in its irascible Ayn Randiness, with suspicion.
But is being “the Silicon Valley of academia” really a bad thing? Perhaps academics can—gasp—learn a thing or two from the startup world. Like, for example, teamwork. “Academics,” says Moravec, “are taught to be individualists and highly competitive,” with the result that most scholars work in secret until they foist a finished product upon their three-person audience. Scholars “treat ideas like currency and hoard them,” to deleterious effect. Compare that, then, to an open feedback loop that leads to “productive failure, experimentation, and innovation”—all of which, Moravec says, are beginning to take root in DH but remain far more common in Silicon Valley, where failure is often a career starter rather than a career ender.
That said, just as the startup world can get lost in its own delusions of invincibility, the digital humanities should guard against the same fate—which shouldn’t be too difficult, given that DH culture possesses one quality Silicon Valley would do well to emulate: constantly examining its own weaknesses, meta-analyzing what it is actually doing.
Risam warns would-be bandwagoneers not to dump literature: Actual digital humanities jobs on the tenure track are few—more common, she says, is a “secondary specialization” in DH thrown into a more traditional academic job ad. The real advantage of training in the digital humanities, she argues, is helping students look “beyond the tenure track job,” possibly outside of academia altogether.
Another way of putting that is: Do not spend eight years getting a doctorate with the sole purpose of becoming a digital humanist; you would be better off just learning to code and getting a job as a software engineer. However, if you have already made the unwise choice to enroll in a humanities Ph.D. program, one way to salvage what will otherwise be your eventual entrée into a jobless hellscape might be to “disrupt” your Eliot (George, T.S., whichever) and start using technology to analyze, distribute, or supplement your research. The worst possible outcome, after all, will be that more than three people read your work.
Correction, April 16, 2014: This article incorrectly referred to Postcolonial Digital Humanities as a journal. It is a project.