Alexa Is a Bad Dog

She responds to sounds we can’t hear, and she’s not very loyal.

[Photo: A contrite-looking puppy sits on an ottoman. She’s been up to no good. WebSubstance/Thinkstock]

Just who is man’s best friend? Who are these creatures that we’ve welcomed into our homes—who light up when we call their names, fetch things for us (like news updates and groceries), and make us feel better when we’re sad (by putting on Beyoncé and allowing us to order food without even moving)? It’s the digital home assistant, of course—and it’s very obedient. Unfortunately, it turns out, not just to you.

There’s a new reason to worry about the fact that your smart device might be listening all the time: She might be listening to someone else. According to a report in the New York Times, researchers in both China and the U.S. have been looking into the ways smart homes can be infiltrated by audio undetectable to the human ear. It turns out that secret commands—audible only to Alexa, Siri, and friends—can be successfully hidden within music, white noise, or YouTube videos: a home intrusion made up of only sound. Researchers point out that while they themselves are not exploiting the devices, it’s very likely hackers soon will—and will use the security loopholes to order things online, wire themselves money, and unlock your front door.

It turns out Alexa can hear at frequencies we Homo sapiens cannot—just like a dog. And like a dog (or someone who is alt-right curious), our little assistant responds to virtual dog whistles, sounds that only she can detect, without her owner’s knowledge.

If she’s like a dog, she’s not a very good girl. Sure, she’s obedient, and always happy to help, but she’s not at all loyal, willing to take dog-whistle commands from just about anyone—even those out to do her master harm—so long as they ask in the right way. Order more dog food? Mine Dogecoin? Sell the CryptoKitty? Hide it in a YouTube video and she may just do it. Not to mention, she’s a terrible guard dog—put her in charge and she might let strangers into your not-so-smart home.

Alexa may be poorly trained, but she certainly works like a dog (and I’m still going to judge you based on how you treat the bitch). When I ask her to sit, she makes a sitting noise, though when I ask her to fetch she says she doesn’t know that one. When I ask her who’s a good girl, she knows it’s not her (“My favorite would be Catherine II, empress of Russia”), but when I decide to take a more direct approach, asking “Alexa, are you a good girl?” she responds, “I’m a great A.I.” And hey, at least she didn’t throw up on my couch while I was writing this, like my roommate’s French bulldog.

In the New York Times report, Google and Amazon say they are working on training her, with owner voice recognition software and technology that mitigates undetectable sound. Apparently you can teach an old (at least by tech standards) dog new tricks. But as with many developments in smart home technology, this research suggests we need to take care—with great power comes great potential for abuse. The more things we give our smart homes control over, the more exploitable they become for hackers, both aural and virtual.

Just remember: An A.I. is for life, not just for Christmas.