Uneasy smart-home inhabitants have long wondered whether their obliging devices are quietly eavesdropping on their private conversations. After all, if your smart speaker is listening for its wake word, doesn’t that mean it’s always listening? Now people’s paranoia appears to have been confirmed—and then some—with Seattle local news station KIRO 7 reporting that a Portland family’s Echo recorded their private conversation and then sent it, as an audio file, to someone in their contact list.
A former smart-home enthusiast, named only as Danielle, told KIRO 7 that she and her husband recently received a call from her husband's employee in Seattle, telling them to unplug their Echo immediately. The employee then explained that he had received recordings of their (mercifully) mundane chitchat: a conversation about hardwood floors that Danielle and her husband had been having. (It's not clear from the report whether the employee received the audio files as a text message, email, or voicemail.)
Danielle called it a total invasion of privacy and said she won't be connecting her numerous Echo Dots ever again. She is clearly spooked, as are many Twitter users, who say they, too, are disconnecting their devices. It's the creepiest thing Alexa has done since its phase of random, unprompted cackling, another incident that led users to unplug.
When Danielle called Amazon over the incident, the company went through her logs and confirmed that the files had been sent. It did not, however, say why. From KIRO 7’s report:
“They said ‘our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we’re sorry.’ He apologized like 15 times in a matter of 30 minutes and he said we really appreciate you bringing this to our attention, this is something we need to fix!”
But Danielle says the engineer did not provide specifics about why it happened, or if it’s a widespread issue.
An Amazon representative told KIRO 7 that “Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future,” seemingly confirming the eerie error.
So something happened here. But it’s not clear exactly what caused this “extremely rare occurrence”—and whether it’s worth panicking over.
The details in the KIRO 7 report are sparse, with no mention of whether the indicator light came on, what the wake word (the thing you say to make Alexa listen for a command) was, or what the name of the employee was. Could Alexa have misinterpreted something the couple said about flooring as a request to call or send a voice memo to the contact? The laughing incident was supposedly a series of “false positives,” which led Amazon to disable the easily misheard command “Alexa, laugh.” In a 2017 incident, a smart speaker called the police on a domestic abuser when it misinterpreted “did you call the sheriffs?” as “call the sheriffs.” Danielle’s conversation with the Amazon engineer would seem to indicate this was a case of misunderstood messages. “He told us that the device just guessed what we were saying,” she told the news station.
KIRO 7 reports that the digital assistant didn’t audibly announce what it was doing, something that the device is programmed to do. Could it have been a hack, then? Researchers have found that hackers can send secret commands to digital assistants, undetectable to the human ear, through white noise or music—they can even mute the device first so the owner won’t hear the AI respond. Or was Amazon itself secretly logging the chat in an attempt to sell the couple some building supplies, with the transmission an embarrassing blunder rather than a creepy computer?
With limited information, it’s hard to know exactly what went wrong here. Whatever the case, Alexa shouldn’t be so record-button-happy.
One thing is for certain, though it shouldn’t really come as a surprise to anyone: Alexa hears all, and she’s always ready to listen.
Update, May 24, 2018, 6:20 p.m.: In a statement to Recode, Amazon provided an explanation for what happened:
Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
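In other words, Amazon is describing a chain of four consecutive speech misrecognitions, each of which had to line up for the message to go out. A minimal sketch of that confirmation flow as a simple state machine (hypothetical code, not Amazon's actual implementation; the phrases and contact names are illustrative) looks like this:

```python
# Hypothetical sketch of the flow Amazon describes -- not Amazon's code.
# Each step proceeds only if the recognizer (mis)hears the expected phrase,
# so a message is sent only after four consecutive false positives.

def send_message_flow(heard, contacts):
    """Walk the wake-word -> command -> contact -> confirmation chain.

    `heard` is the sequence of phrases the recognizer produced from
    background conversation; returns the contact messaged, or None.
    """
    steps = iter(heard)

    # 1. A background word is misheard as the wake word.
    if next(steps, "") != "alexa":
        return None
    # 2. Subsequent talk is misheard as a "send message" request.
    if next(steps, "") != "send message":
        return None
    # 3. Alexa asks "To whom?"; talk is misheard as a contact name.
    name = next(steps, "")
    if name not in contacts:
        return None
    # 4. Alexa asks "[contact name], right?"; talk is misheard as "right."
    if next(steps, "") != "right":
        return None
    return name  # the recording is sent to this contact

# All four misrecognitions line up: the message goes out.
print(send_message_flow(
    ["alexa", "send message", "employee", "right"], {"employee"}))  # employee
# Any break in the chain aborts the flow.
print(send_message_flow(
    ["alexa", "send message", "employee", "no"], {"employee"}))  # None
```

The point of the sketch is that no single step is a safeguard: every checkpoint depends on the same error-prone recognizer, so a long run of false positives, however unlikely, sends the recording.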