Lucas Matney, a junior at Northwestern University and columnist for the Daily Northwestern, is concerned that his school is not adequately preparing him for the challenges of today. In his experience, he says, “very few” of his professors “have used technology in the classroom in a way that offers a radically changed educational experience.” Instructors like me and my friends, he argues, “need to ponder what a 21st century education really means if they desire the University to maintain relevance as an institution.” Trust me, Lucas, most every professor younger than 75 (and a few older!) has pondered this question aplenty. In the words of the great rhetorician Ali G: “Techmology … is it good, or is it wack?”
As someone who routinely teaches paperless courses and has made regular use of class wikis, discussion boards, and of course my great nemesis PowerPoint, I still tend toward wack.
I see where concerned students like Matney are coming from, and some of the ideas he presented when I pressed him for specifics in a follow-up email were quite compelling: He suggested videoconferencing with the scholars who write the academic articles he’s assigned, for example. But as someone who has done a lot of work with teaching technology (one of my grad-school pedagogy projects was to develop a bookless second-year German curriculum based on online “modules”), I’ve still come to the conclusion that unless you’re teaching a course actually on technology—digital humanities, computer science, engineering, fascinating-sounding stunt courses at Penn, etc.—what our students could really use is some time unplugged. That professor you think is a lazy Luddite might actually be doing you a favor.
That’s not to say we should return to the glory days of the overhead projector, nor should we ditch the ubiquitous Blackboard and return to the dreaded paper grade book (remember those things? Egad). Granted, Blackboard’s interface can be a cumbersome nightmare, especially as it encroaches all the way down into kindergarten—but online syllabi, grades, and assignments are practical, and here to stay.
Sure, every department has one stubborn coot who still uses a typewriter (or, worse, WordPerfect). I’m not talking about him. What I’m talking about is the notion that it’s a college professor’s responsibility to kowtow to technological fads, most of which will be obsolete in six months’ time. You know what won’t be obsolete, though? Bertrand Russell. Alice Walker. Multivariable calculus. Don’t make us teach these things with a wiki and a videoconference just because we can.
There are wonderful scholars (such as digital humanists) who incorporate technology seamlessly. But pity the rest of us, who are also forced to embrace “teaching technology” for its own sake, just to appease some dean or make our dossiers competitive on the job market. I once took over a class from a beloved colleague who gave students an assignment to “blog in character”—great idea, but one in which the blogging aspect served no practical purpose other than to force the professor to play tech support as students tried to figure out Blogspot. In fact, to many students the assignment felt weirdly anachronistic, given that all these “characters” lived in 1910. In my version of the course, students sampled the lost art of journaling instead, which they enjoyed greatly and which I think put the assignment into better historical context.
While exceptions exist, research shows again and again that when people are staring at a screen, or skip-jumping through a bajillion websites and apps, they are not learning well. Yes, college students are adults, and if they choose to spend class on whatever the new thing to replace Snapchat is, that’s their prerogative—but when it comes to course design, it is still the professor’s job to prioritize student learning.
And learning requires thinking. Hard, uncomfortable thinking, the kind where you swear you can feel gears turning, laboriously and painfully, in your head. And that requires intellectual space, which is precisely the opposite of the constant whirring of course wikis, and live-tweeting academic talks, and multimodal something-or-other, and wait, wasn’t I supposed to be reading Candide? In the words of the obscure Austrian playwright Johann Nestroy: The problem with progress is that it always seems greater than it really is. Every technological advance foisted upon students for its own sake results in less time to work on the material at hand; the more technologized courses become, the more their actual content recedes.
Also, come on. There are innumerable problems with higher education today, but one of them is not that college students don’t use technology enough. I’d even venture to say that they already use it too much. So much, in fact, that their brains are now wired differently—some might even say worse. It is not the professor’s job to acquiesce to a dystopian techno-future that, frankly, is downright frightening to anyone who can wrest their behemoth iPhone 6 from their gnarled talons long enough to think about it.
Looking this over, I realize that perhaps I, too, am a secret Luddite (after all, a month ago I deactivated my Facebook account)—but I do actually interact with college students on a regular basis, and I listen to them, and plenty of them want a classroom where the primary activities are thinking and talking. Learning how to think can be aided by technology, but it can be impeded by it as well. There are a handful of professors who really do embody the elbow-patched, technophobe cliché. But most of us know more about teaching and learning technology than our students do; we choose to use—or not use—certain advances because we have thought long and hard about what our students need. And our charges often need to destimulate, take a breath, sit, and think.