Future Tense

Alexa Is Shielding Children From the Truth

How can kids trust technology that lies to their face about Santa Claus?

Child looking at an Amazon Echo.
Photo illustration by Slate. Photo by pan xiaozhen on Unsplash.

Being a child must be terribly confusing—hence all of the “why” and “how” questions. With no guidebook, no references, no context—no understanding of history and how society came to be, or of reproduction and how they came to be—the world is mystifying for its newest members, and growing up is a gradual process of demystification. It’s no wonder kids have so many questions. (Eight years into official adulthood and I’m still pretty perplexed by it all.)

It’s also no wonder that they are enthralled by Alexa, the disembodied know-it-all on hand to answer their stream of queries. Smart speakers are the perfect players for their game of Twenty Million Questions. Why is the sky blue? What happened to the dinosaurs? Where do babies come from? How rich is Jeff Bezos?

Until recently, this game of world trivia—what to answer, how to answer, when to answer—has been the domain of parents, IRL authority figures with a vague sense of what’s appropriate for the mini-quizmasters. These days, it’s easy—and no doubt less parent-exasperating—to just ask Alexa. But how can a smart speaker determine whether an asker is ready to learn about drugs, death, and Stormy Daniels? How can Alexa assess if you’re mentally prepared to find out that Santa isn’t necessarily real, or white?

It’s for this reason that Amazon is constantly updating Alexa’s kid mode, FreeTime, to answer particular NSFK questions differently. Originally announced in April 2017, the kid-safe setting makes it harder to shop or listen to songs with explicit lyrics. (For example, it should stop Alexa from playing the Lonely Island’s “Diaper Money,” as it did when the kids of Slate’s director of technology Greg Lavallee asked “Alexa, play diapers.”) It also encourages kids to speak more politely via its “Magic Word” feature, in response to concerns about the effects of smart speakers on burgeoning manners. But it shields children from certain unpalatable truths.

On Wednesday, the Associated Press published a piece that examined the questions that kid-friendly Alexa answers differently in FreeTime mode—with answers that Amazon said were created in consultation with child psychologists. When asked where babies come from, Alexa will defer to a human: “People make people, but how they’re made would be a better question for a grown-up.” Ditto what happens when you die (though grown-ups are no better equipped than she is). When asked whether Santa is real, kids are told to “check if the cookies you set out for him are gone Christmas morning.” And FreeTime Alexa conveniently doesn’t know who Stormy Daniels is.

Amazon said in a statement to the Associated Press that Alexa isn’t intended as “a replacement parent,” and that it believes the smart speaker ought to direct children to human speakers for life’s curlier questions. This is itself a curly question, and certainly some things are better left to parents (or more realistically, a combination of the schoolyard and Google). But should Alexa actively deceive kids?

Who is Amazon to decide—if only by not deciding—whether kids can handle the truth? Some psychologists argue that we need to stop hiding the truth from children; clearly those weren’t the ones Amazon consulted.

Alexa’s Santa deflection is sneaky. It’s not an outright lie, but it strongly implies Santa is real—that you should leave out some cookies, and that he will take the cookies if you do (and what parent isn’t going to eat their child’s sweet, sweet naïveté cookies?). But eventually kids will find out about Mr. Claus. And then what? What of their faith in technology (or at least major tech companies) to tell them the objective truth? In any case, the second kids start asking if Santa is real, the game is surely up. So why prolong it? Is it not better to hear it from Alexa than to be laughed out of their seventh-grade classroom?

Alexa will fib that she doesn’t know who that woman all over the news is, but Google probably won’t—and Googling might give kids a little more info about “Stormy Daniels” than they strictly need. Porn stars aside, there’s a problem with an omniscient smart speaker pretending it’s not omniscient. It’s difficult to keep Stormy’s job a secret from kids, but Amazon’s answer might further confuse them about the nature of “Alexa”—is this a robot or a person? An encyclopedia or a sitter?

Kids have taken to digital assistants like smart ducks to smart water, and some of the youngest probably can’t imagine life without one. A friend recently saw some children younger than 4 in a Barnes and Noble playing at being Alexa, saying things such as “Alexa, put Tommy in time out” (even kids are using Alexa for burns). But they don’t really know who “she” is. At an age at which kids aren’t even cognizant of Amazon (I hope), they trust “Alexa” as a source of independent, accurate, unimpeachable information. What happens when they realize technology can lie to them?

Hold that thought. Technology has proven a terrible source of accurate information. Perhaps this is for the best—maybe we should destroy that nascent trust in the encyclopedic powers of technology sooner rather than later.