As any historian, psychologist, sociologist, or scientist will tell you, the truth of an idea has very little to do with how fast it spreads or how widely it's believed. Twitter, Upworthy, BuzzFeed, Tumblr, and others can spread simple, easy-to-remember notions among friends, acquaintances, and even news sites. Some of these ideas, like #YesAllWomen, are admirable and well-intentioned. Many of these ideas are terrible, however, and it drives me fairly crazy to see the same mistaken thoughts being endlessly regurgitated across social networks and online media. So at the risk of giving these ideas even more undeserved exposure, I present four technology fallacies I wish to extinguish.
Ninety-five percent of what you read on Valleywag, Pando, Business Insider, and the like revolves around business news about technology companies. Correspondingly, 95 percent of the tech business is the equivalent of marketing toilet paper in different colors and different wrappers. If you think that you are any more technically informed for reading this news, you may be fooling yourself. Perversely, the tech business often trumpets just how ignorant it is about the technological guts of the products that make it rich. For example, when I looked at Business Insider’s hit piece on Marissa Mayer, it was clear that a cabal of white male executive dinosaurs, none of whom knew the first thing about computer science, had ganged up to badmouth Mayer to writer Nicholas Carlson. None of them criticized her technical knowledge, of course. That wasn’t an issue for them, and it didn’t appear to be an issue for Business Insider, either.
The greatest computer scientists of all time are not people you've heard of, with the possible exception of Alan Turing. People like Alan Perlis, John McCarthy, Edsger Dijkstra, Donald Knuth, and Frances Allen are brilliant minds, and they devoted themselves to the theoretical underpinnings of computers that made possible all the software applications we use today. Many programmers are not even familiar with their work, yet these computer scientists have contributed more to today's computing than Tim Cook has.
Since I write about technology and not business, I’m grateful that I get to ignore most tech reporting, or else I might go insane from seeing the New York Times report metrics like “5 million lines of code” as though they mean something (they don’t).
I have been enjoying the rich irony of entering the lily-white, all-American world of writing after my time in the tech world, where I worked with as many foreigners as Americans and even had several bosses who weren’t white. No, it was far from perfect; the gender imbalance remains acute and discrimination occurs much as it does elsewhere. But the portrayal of a white male techie monoculture seems to be some bizarre projection by a journalistic cabal made up largely of white men. Google’s ex-CEO Eric Schmidt said in 2013 that India-born CEOs head 40 percent of Silicon Valley’s startups. You wouldn’t know that from reading American tech journalism.
Likewise, while there is a libertarian contingent among programmers, there’s a considerably larger contingent of flaming lefty socialist Burning Man attendees. Many of them do not identify as straight (even some of the libertarians, like Peter Thiel). And there is no overlap whatsoever between geographical techie concentrations and Tea Party members of Congress—quite the opposite in fact. (I feel uneasy about the corporate tech money flowing into the campaign of Mike Honda’s challenger Ro Khanna, but again, Khanna is hardly Rand Paul.)
Programming has historically tended to attract misfits and introverts, who shy away from communitarian do-as-we-say politics in favor of a more liberal individualism. Still, I sure can't recall many techies complaining about their taxes being too high or wanting to return to the gold standard (or buying bitcoin). Freaks and idiots make for better press than reasonably sane people, but please remember that they aren't representative.
And did I mention the substantial trans community among techies? Among notable 1980s game designers alone, there was the hugely influential Dani Bunten Berry, Rebecca Heineman, and Cathryn Mataga (who collaborated with Slate poetry editor Robert Pinsky on Mindwheel in 1984). Writing articles about these people would encourage trans people and women a lot more than writing article after article about white male monocultures, but maybe such stories don't get the clicks.
The stereotype of the programmer is that of the quasi-autistic savant staring at the screen, locked in battle with code and seeing the entire world as a stream of 0’s and 1’s. The digitalization of society has proceeded with such speed that figures like Steve Jobs, Bill Gates, and Larry Page have been made into geniuses—evil geniuses sometimes, but geniuses. For every paean to Jobs that extols him as the Moses of computers parting the sea of bits, there’s another portraying him as the Antichrist who accelerated the hydra of capitalism to throttle us all by the neck that much more quickly.
Make no mistake, those entrepreneurs all had or have a sharp combination of business acumen and tech smarts, employing programmers who walk the right line between speed and quality—erring on the side of speed, of course. But those skills often come at a certain price. Just as fiction writers often lose all perspective on how the world works outside of their own creations, software engineers are given to seeing the world in terms too analogous to machinery.
This is only natural. But the smartness of programmers is one kind of smartness and not the only kind. Unfortunately, critiques of the engineering mindset often proceed from horrible ignorance rather than sympathetic understanding, and so programmers frequently conclude that they are indeed smarter than the rest of the world, which seems to them to be a mass of incoherent irrationality. The great novelist Robert Musil, who trained as an engineer before turning to writing, demanded “precision and soul” in the pursuit of a way of thinking that was neither sloppy nor technocratic, and that should serve as a guiding mantra for techies and nontechies alike.
The desktop is dying. Even the laptop is dying. Mobile is rising. Instagram is huge. Facebook is flat-lining. Or is it the other way around? The thunderous voice of daily tech journalism is that of the ever-present Now. The amount of ink spilled over bitcoin and Rap Genius should embarrass any publisher. Even as we see technologies of a year or two ago decline, the tiniest shifts in the cool new thing merit microscopic dissection as though they will affect our great-grandchildren. If only we paid so much attention to climate change. Every change to Google's or Amazon's algorithms is treated as though it's of life-changing significance but rarely evaluated in terms of what the core business model could look like in five to 10 years.
Wait, step back, and think about life 10 or 15 years ago, if you can remember that far back. Think about how little time you spent online, whether or not you even had Internet access, how you didn’t use Facebook, how you didn’t text or Skype or whatever. Try to imagine that many changes 10 years down the line. Facebook and Twitter will either be gone or drastically changed. Google will exist but likely won’t loom as large as it does today. Something we can’t even imagine will rule our lives and channel our thoughts. And there will be a whole new set of mistaken ideas that will be spread through it.