“People imagine that programming is logical, a process like fixing a clock,” Ellen Ullman writes in her essay “Outside of Time: Reflections on the Programming Life.” “Nothing could be further from the truth.” Instead, writing code is “an illness, a fever, an obsession. It’s like riding a train and never being able to get off.” Ullman worked as a computer programmer and software engineer for 20 years, beginning in the late 1970s, when the profession drew from an eclectic population of PC hobbyists, to the early 2000s, when she became a full-time writer. Her first essay collection, Close to the Machine: Technophilia and Its Discontents, published in 1997, soon became a cult hit among programmers with a literary or intellectual bent for its extraordinary evocation of the agonies and the ecstasies of engineering software. Ullman’s prose is as elegant as her code, and she went on to write two acclaimed novels.
Life in Code: A Personal History of Technology marks her long-awaited return to the themes that first made her name: what it’s like to share our world with sophisticated machines; the myriad, nearly undetectable ways that they change us; and the fundamental ways they cannot. One of the book’s highlights is a long, reported look at the idea of artificial life, and it ends with a perceptive, respectful, and wholly unsentimental tribute to Ullman’s late cat, Sadie. I contacted her by email to ask about her hopes and fears for the future of the industry, her experiences working in technology as a woman—and that infamous Google memo.
Laura Miller: You were an English major who fell in love with technology through the first cheap, portable video cameras and their possibility for communication. Later, in the same spirit, you bought an early PC from Radio Shack in the 1970s. You taught yourself to code. Were self-taught programmers pretty common back then?
Ellen Ullman: On my first real job, in 1979, we were a wonderfully ragtag group. A former Sufi dancer. A master’s in art history. Two English majors. A French guy who stunk up the place with Gauloises and refused to follow any of the coding conventions but did great stuff that challenged us. It was fun to work with them. Most of us were exploring.
On the other hand, there was also a group of highly trained software engineers. At the time, in general, computing was a subspecialty in engineering. And engineers with M.S.es and Ph.D.s were not inclined to go code up applications in rooms full of weird people.
And what about now? It seems that more formal training is expected today.
Now the web requires a very specific skillset. People can learn those skills in meetups and coding groups. But the back-end work, on the server, involves a deeper understanding of areas like operating systems, algorithms, and chip design. That is the realm of computer science and involves serious schooling. Highly motivated people can learn on their own, but they have to be obsessive and passionate about learning computing at those levels. In any case, it takes a measure of fire in the belly to do any sort of coding.
Were there more women doing it back when there wasn’t an established, formal pathway through computer science education to the work? Was it more comfortable for you when it was more ragtag and miscellaneous?
Now that I think of it, yes. Sybase was lousy with women who had Ph.D.s in things like anthropology and all-but-dissertation in Greek classical literature. Both my job in 1979 and the one at Sybase in 1985 had women in responsible technical positions. Seems weird—but that comfort of having more women around let me learn more easily from men.
I’m curious about how you managed your self-confidence over the years in a field where colleagues are notoriously blunt with each other and where as a woman you weren’t always welcome. Sometimes you describe yourself as discouraged by the skepticism of others, as when your father asked you to write an amortization program and then later suggested that you give up on it because you were “struggling.” One of the essays in the book recounts being at a party in the late 1990s, chatting with Sergey Brin and Larry Page about symmetrical multiprocessing, and they instantly asked if you wanted a job. You declined because you felt like a “fraud” who “didn’t know a damn thing.” But there are other jobs that you managed to talk your way into despite not really knowing the machines or languages involved. Do you regret not going to work at Google?
I have to think about it because I haven’t thought about the differences between situations where I was cowed and the ones where I pushed through with (feigned) confidence. In the beginning, I took guidance from my English honors thesis on Macbeth—that play is full of confusions of tenses, what happened before what. If I could untangle the knots of time in Macbeth, I could deal with the tangles of BASIC—a language that famously let you create what was called spaghetti code. In general, I thought: Well, this is hard, and “hard” for me usually includes intrigue. The same sort of intrigue carried me throughout. I think if you don’t find the difficulties of computing tantalizing in some way, programming is not going to work out for you.
The Larry Page/Sergey Brin encounter was of another order. The work involved in “symmetrical multiprocessing” (the machine having two “central” processors) is a hard problem. And this was a “hard” that scared me, because I was teaching myself but was not ready to do it professionally, I thought.
Do I regret not saying yes? For years I thought, no, that I had to get out of the boy culture of programming. But as I think of it now in reaction to your question, I do have a measure of regret. After all, I had faked my way into working on machines from minicomputers to mainframes, languages from BASIC to C++, and a variety of operating systems. (By faking I mean hiding my incompetence as best I could while I learned on my own and from people around me.) So maybe I should have taken some time to consider working for them. But I guess it was that I knew Larry, and each time I was around him, I knew I was in the presence of someone who existed on the far-right edge of the bell curve. I’d say four sigmas to the right of the mean. And I could never feel smart around him.
Speaking of the “boy culture of programming,” let’s get into that infamous Google memo. From your book, I gather that there are ways the culture of programming can be just cluelessly monocultural—like the geeky pop culture humor of an online course you took to get an idea of how effective such courses are—and ways that it is overtly hostile to those who are different. Were the attitude and ideas in that memo familiar to you?
First let me say that what I’m going to say about the boy culture is not nearly a complete description of that world. I have worked with men who respected me, and, by watching them and asking questions, I learned things I needed to know. And I have my own geeky side, so I enjoyed working with the sweetly geeky guys who were playfully brilliant.
Now … as to the “memo.” James Damore is not the most extreme example of a demeaning, hostile man vehemently defending the treehouse—“no girls allowed”—but he is the first one I’ve encountered who wants to shout from the rooftops that the current gender imbalances—and injustices—in technology are just fine; that, as he sees it, they exist for good reason; and that working to bring in the excluded people amounts to political coercion.
In general, the more direct the efforts are to bring minorities into the technical world, the fiercer the counterpush. For instance, a 2015 South by Southwest gaming conference planned to have a panel about women in gaming, a world that is famously, overwhelmingly male. The organizers received death threats if they didn’t remove the panel. The panel was removed.
This isn’t the forum to ask questions about the scientific studies Damore cited, which discuss the effects of hormone exposure in utero. But I would like to say that, whatever happens in the womb, the moment children come into contact with others, their brains begin a process of tumultuous change. Synaptic connections are strengthened or weakened. It is not a question of nature vs. nurture. It’s a both-and. The brain is plastic, continuously changing its organization.
My overall reaction to the memo is to ask: Why do we need women involved anyway? Why does the technical world need minorities? The short answer is they bring in new viewpoints. They shake up the segregated male culture. It’s not a political “should.” It’s a necessity for the healthy evolution of technology.
Did you ever have to deal directly with attitudes like Damore’s? If so, how did you do it?
I had a boss who interrupted me constantly to say, “Gee, you have pretty hair.” I leaned to one side and said, “I’ll just let that shit fly over my shoulder.” I got a great deal of knowledge from him. I once worked to fix a system while the client, a sweaty man with pendulous earlobes, rubbed his damp hand up and down my back. (I concentrated on how I might put a bug bomb into his system; I didn’t, but I wish I had.) I worked with a group of researchers who made it clear I was a lesser being and “stupid.” I reminded myself that they were nasty to each other, only less so, and I made a lot of money from them on my contract.
That’s a rather glib answer to the question of how anyone looks prejudice in the eye, which is so complex I can barely touch it here. I can only say that I had to hold onto my love for the work and refuse to be diverted. You are not going to drive me away! I looked for support from others who faced prejudice by nature of gender or skin tone or where they came from or sexual identity or … Also from sympathetic good ole straight white men, of which there were indeed more than a few.
You make the point that we tend to view programming as logical and rational but that in fact the technology a culture produces is always shaped by the culture that makes it. How does the culture of programming, or the larger culture of the tech industry, shape the technology it makes?
Software and digital devices are imbued with the values of their creators. And the ones who decide what directions computing will take come from a segregated society composed mainly of young men, white and Asian, as we know by now. They are the ones who make assumptions about who the users are, who should be served by the digital technology, the sort of society that will form around it. Witness the championing of disruption, intentional breaks in social, economic, and cultural relationships—down to the most intimate parts of personal life—entire means of human interaction designed to “change the world!” That is: Have startup founders and investors profit hugely from their intended rearrangements of human life.
We need to involve women and minorities and people who come from all social classes because they bring in new sets of values. The newcomers deepen the conversation. They carry in fresh sources of creativity. They enrich our understanding of the relationships between humans and the digital world. They ask new questions: What do we want from all this stuff? And who is included in this definition of “we”?
One of the persistent themes of Life in Code is early optimism about technology and its potential and later disillusionment, which for you is mostly about how it’s being used to erode privacy. You write of being astonished at how much of their lives people willingly expose on the internet. To the best of my knowledge, you don’t use social media, which is how so many of us experience “the internet” these days. I assume that’s largely for privacy reasons, but how do you feel about this massive change in the way the internet is being used by the average person?
I have to say that, throughout, I’ve kept my love for technology, for the beauty of engineering—thought becoming things that work!—and the elegance that can reside in code. The disappointment is not for the technology itself but what it has been used for. The loss of privacy, the intended disruption of social and intimate life.
I do use social media. I’m back on Facebook and enjoying it. I hate the company’s constant fiddling with the format. It forces you to rearrange your thoughts and remodel the forms of your interactions.
I won’t use Twitter. Twitter posts are thought-farts. I don’t care about unconsidered thoughts of the moment.
I like Twitter as an overall social phenomenon, though. It says what great numbers of people are thinking and concerned about. Well, that’s selfish of me. I don’t want to be involved, but I like what comes out of everyone else’s involvement. OK. True.
I don’t want to be interrupted more than I already have to be. I’m glad about the turn to texting instead of phoning. The “ding” is a sweet sort of sound. But one has to go to email to express anything complex and thoughtful. Email allows pauses, but it’s still lurking out there all the time. A little devil with a pitchfork poking you to check your mail, check your mail.