Since I was a kid, one aspect of my relationship with video games has remained the same: I play them to unplug. This does not make me unique. I grew up in the ’80s and ’90s, when a trip to the mall arcade was the child equivalent of an adult splurging on a nice steakhouse. The awful racket inside, the intimidating teenagers (who in hindsight just stuck to themselves), and the unfair games played for bragging rights over scores had nothing to do with our real lives. They were a thrilling way to kill, at most, an afternoon. We preferred the Nintendo’s meat and potatoes because we could own the games, didn’t have to deal with anybody else, and could keep playing without pumping in a single quarter.
This might make me sound antisocial, like someone who grew up escaping into games. There might be some truth to that, but I don’t really think of it that way. To me, it’s more like I was dealing with people all day long at school, figuring out who I was, and loved farting around in these weird little worlds with imaginary problems and catchy repetitive music and sloppy mistranslated dialogue. When all that was on TV was Looney Tunes reruns, games like Mega Man 2 and Bionic Commando felt like being transported to a vibrant fever dream. These alternate realities rarely made sense and were supremely charming because of it. You weren’t playing for the story, because games then really only had loose premises. There was a Zen-like, meditative quality to all those games: You were alone in a universe where nothing made sense but which, once you got started, was governed by a reassuring balance and an alien logic.
That “alone” part of the equation for video games has been changing in recent years. Yes, long ago online gaming exploded in popularity, giving players the ability to join up with or hunt strangers, friends, and (occasionally) new friends they’d meet through a game. But there’s also a newer, largely unarticulated trend: the end of being alone in video games. Now, even on single-player outings, you’re typically teamed up with a computer sidekick for the duration of the adventure. Although there are scattered earlier examples of companion characters, like 2004’s Half-Life 2 and 2001’s Ico, their crossing over from albatrosses to something more meaningful is a recent development, kicking off with 2012’s The Walking Dead and 2013’s BioShock Infinite and The Last of Us.* (Curiously, all three charged you with protecting a young girl by making difficult decisions alongside her in the face of extreme danger.)
I am not warming up for a long, curmudgeonly rant about how video games used to be better in my day. But this is an interesting and new experiment for video games, one that raises a lot of questions. Or, at least, one question: Why?
I have a hunch. The most ambitious examples of the computer sidekick are always plot-critical characters who grow and change and have actual personalities and points of view. They accompany a shift in the sorts of stories told in video games, and how the player takes part in them.
Take those old Nintendo games I grew up with. In The Legend of Zelda, your interactions with others are limited and meager. At the beginning, an old man in a cave with a sword at his feet and blocky text above him warns you that it’s dangerous to go alone. You never respond. You can’t, even if you want to. This was how video games typically continued to portray the other people you encountered: as living, cryptic vending machines. Do you remember Mario talking to those turtles?
The supposed explanation for video games’ silent heroes was that voiceless avatars smoothed the way for players to imagine themselves in the role. Mario, or whoever, was incapable of contradicting the player’s intentions or communicating any opinion. But it’s also likely that technology, budget, and time constraints kept silent protagonists, well, silent for so long. As technology has advanced and budgets have ballooned, many video games have essentially become 20-hour-long movies, now able to circle back and dabble in the sort of creative exploration that a secondary character brings.
This new frontier brings with it a new spectrum of experiences. One approaching future for video games, it seems, is to join the always-on, always-connected present we all experience in our daily lives, with A.I. wingmen “pinging” us with dialogue as the larger shared quest progresses. Newer titles like Wolfenstein: Youngblood, released in July, and 2018’s Far Cry 5 have cast these roles as something more akin to stand-ins for a second player, when you don’t have anyone to play with, or as glorified hired guns. But these recent ripples, I think, are just the shallow end of the pool.
The strongest example of this shift is in 2018’s God of War, a dramatic reimagining of a series that began in 2005. In the original, you play as Spartan warrior Kratos, who wages war alone against the gods in a world that is something of a mashup between Greek mythology and Heavy Metal magazine. While the more recent reboot has its share of brutal violence, the slaughtering isn’t the most powerful or even most dominant note here. Instead, it punctuates a quest Kratos embarks on with his son, Atreus, to scatter his mother’s ashes.
The father-son relationship progresses from icy and reluctant to tense and resentful to, finally, a making of amends, with both promising to try harder. Along the way, Atreus crawls into areas you can’t reach to help open doors, and he shoots arrows at whatever you are trying to kill. But after completing the game, that’s what I remember least, if at all. What I walked away with was watching Atreus grow from a small boy into a young man, and wondering whether he could keep his heart open and avoid bitterness while steeling himself for the inevitable sequel hinted at in the end.
This and other trends suggest that someday, you’ll be able to play video games, say it’s really for the story, and not sound like an apologist. We aren’t there just yet. All I know is, for now, I am virtually never alone in newer video games. These first steps aren’t perfect or seamless—there are still hiccups and bad habits from how video games have always been made. I’m like the star of a play where everyone but me has the script, and I’m constantly reminded to get off-book and perform as written and rehearsed by everybody else. Even though my avatar can talk, it doesn’t speak for me. Yet. (To be fair, older video games would remind you when you were veering off-course with nudging dialogue. But these newer games toss those breadcrumbs in your ear maybe every 20 minutes.)
This shift hints at a broader change. It’s possible some of the biggest video games, even those wrapped in whimsical settings or focused on mayhem, can offer a new or at least different window into our relationships and identities. They demonstrate how who we are when we hold a controller intersects with who we are in life. Being current with gaming might mean no longer being unplugged but instead being more in tune with who we are and who we want to be, or better at understanding the others in our lives.
Or as video game writer Tom Bissell told me when we recently had a conversation about this: “[G]iven the state of our society and the state of our politics and the state of the world, I think there’s something to be said about going through harrowing video-game combat with a pal [and] having someone hold your hand through the nightmare.”
Correction, Sept. 30, 2019: An earlier version of this article misidentified the year Half-Life 2 was published. It was 2004, not 1998.