The Sinister Realities of Google’s Tear-Jerking Super Bowl Commercial

[Image: the Google Assistant logo on a white screen, with the words "Here's what you told me to remember" in a speech bubble. Credit: Google]

After Google’s “Loretta” Super Bowl commercial aired last night, my friend grabbed his phone and started texting his parents, both of whom are in their late 70s. “This is so perfect for them,” he said excitedly, undoubtedly imagining a future in which his parents can ask Google about personal details they’ve forgotten. “What’s my sister’s favorite food?” his dad might ask while grocery shopping. “When’s my youngest son’s birthday?” his mom might ask, without having to worry about anyone’s reaction to the fact that she forgot it.

Losing the ability to recall memories and details about the people you love is a special form of hell. That's why it's hard to oppose anything that genuinely brings comfort to people with dementia. As a society, we outsourced our memories and our knowledge to Google a long time ago. Why remember something when we can look it up online? But the commercial disturbed me because, while we've already given Google access to our dearest memories, there's something scary and dehumanizing about letting Google dictate what happens with them. It's possible that Google's technology could help some people cope with memory loss and the surrounding grief (everyone's different), but Sunday night's commercial proposes an intervention that goes beyond helping. If it came to fruition, tech companies could end up dictating how we manage difficult personal moments, raising the specter of a future in which disembodied voices and algorithms supply us with decontextualized bits of information about who we were, who we love, and who we are.

It's also unclear how well this would even work. The idea that someone can provide a list of particular facts and memories for Google to remember seems useful. In the commercial, the man tells Google that Loretta liked scallops and that she had great handwriting. But does the software show photos or repeat memories without having been specifically asked? What if someone asks Google to remember a series of memories but then never remembers to ask about them? Would Google prompt the recollections itself? Absent a person's intent or agency, does the software curate a slideshow or an information dump based on Google's secret proprietary algorithms? That seems meaningless at best and damaging at worst.

Remember that Loretta liked scallops, that she had great handwriting. As nice as those facts might be, they're extracted from the stories and contexts in which they matter. Instead of remembering that Loretta liked scallops, how about remembering, and then hearing, the story of how she once tried fishing for scallops, or of a romantic dinner at an Alaskan restaurant with the best scallops she'd ever tasted? People aren't simply composed of what they like or what they've done, and by extracting and presenting these pieces discretely, Google offers a fragmented facsimile of a memory.

In Plato’s Phaedrus, Socrates explains his criticisms of writing. One of them is that “writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence.” Photos, like paintings, are two-dimensional representations of real people and places. But putting a physical photo album into someone’s hands, as opposed to images on a screen, puts memories into visual and tactile context. One can see the other photos on the page and on the page after that. Maybe there’s information written on the back of them—one can pull them out and turn them over. One can see whether they’re Polaroids or have round corners, or whether they’re yellowing with age. Those sensory details offer more than an algorithm ever could.

The inclusion of both context and sensory details is important for reminiscence therapy, a recommended treatment for people with dementia. Reminiscence therapy consists of patients talking to other people about their lives and experiences and supplementing those memories with tangible aids, such as photos, memorabilia, or songs. During either group or individual sessions, caregivers or family members prompt a chronological walk down memory lane. Reminiscence therapy has been shown to improve patients’ cognition, mood, and general functioning and to decrease stress on caregivers. This treatment leverages sensory details and contextualizes memories and experience with props and people who know the patients. Those are two advantages Google can’t offer.

It's impossible to think about Google without worrying about privacy, especially with digital assistant devices. Considering that users would be sharing a lifetime of personal details and photos, one should wonder what happens to all of that information. It's hard to regard Google's supposed interest in helping older people as anything other than dubious, especially given that Google Assistant's privacy violations were serious enough for the EU to force Google to stop transcribing voice recordings. The intended market for these devices amplifies the privacy concerns, as older adults are particularly vulnerable to online fraud, scams, and data breaches.

As with robots that help the elderly stave off loneliness, technology is often better than nothing. But the problem with Google’s “Loretta” commercial is that it suggests technology can replace a carefully assembled photo album or an afternoon reminiscing with family. People are not a series of facts or images. No matter how much big tech insists otherwise, our memories are not commodities. Neither are we.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.