
Alexa Is Creepily Laughing at People for No Reason

This is the future of bugs and viruses in the era of voice assistants.

Photo illustration by Slate. Photos by Thinkstock and Guillermo Fernandes/Flickr.

As Amazon Echo Dot owner Gavin Hightower was heading to bed the other week, he encountered a disturbing Alexa bug. For no apparent reason, the device uttered a “very loud and creepy laugh.”

“There’s a good chance I’m getting murdered tonight,” Hightower tweeted after the incident.

Hightower isn’t alone: Numerous Echo owners have reported their Alexas laughing spontaneously, unprompted by the wake word (“Alexa”) or any other command. For some users, it’s more than just laughter: Their devices fail to fulfill spoken requests, perform other random actions instead, and then cap it off with a guffaw.

Of all the bugs your smart assistant could encounter, uttering a high-pitched “witch-like” laugh ranks among the creepiest possibilities. (Other podium contenders: Saying “I’m watching you” or actually making a threat.) The issue has now evolved into a trending moment on Twitter, and Amazon has responded to the phenomenon, telling the Verge on Wednesday that the company is “aware of this and working to fix it.”

The whole scenario reminds me of Mat Honan’s delightfully dystopian view of the future of connected homes for Wired in mid-2014:

I wake up at four to some old-timey dubstep spewing from my pillows. The lights are flashing. My alarm clock is blasting Skrillex or Deadmau5 or something, I don’t know. I never listened to dubstep, and in fact the entire genre is on my banned list. You see, my house has a virus again.

Honan’s four-year-old imagined future seems eerily on point. However, at the time, Amazon had not yet launched its revolutionary Echo; it and other digital assistants are now poised to permeate every facet of our existence.

While they’ve had a remarkably good run, it was inevitable that our digital assistants would crack. Friday delivered a tremor of foreshadowing: Amazon’s virtual assistant briefly “lost her voice” due to issues with Amazon Web Services. (It’s unclear if these reported issues are related to that downtime.)

Whatever the reason for the laughter (and hopefully Amazon will tell us whether this is a software malfunction, a hack, or the work of a disgruntled programmer), this isn’t going to be a one-off. Amazon is adding capabilities to its assistant on a near-daily basis, and so are its competitors, like Google Home. As the software gets more bloated with code, the chances that a strange bug slips through increase too. Consider Apple’s iOS. In November, a strange bug cropped up on iOS devices that mangled the letter “i” in autocorrect. Over the years, the operating system has suffered numerous other bugs: the alarm clock not working, apps becoming unresponsive, and the phone crashing when it encounters a certain character.

However, while these bugs can be annoying, inconvenient, or both, they’ve typically never verged on creepy. There’s something different about a virtual entity imbued with personality going haywire, particularly one that’s become a part of the household. In the future, it’s not far-fetched to think Alexa may sometimes mix up responses to queries or commands, be unable to reach certain services when they go down, refuse commands from a specific voice, or decide it’s time to speak even though no one has spoken to her.

Then there’s the possibility of hacking. Researchers in China last year discovered that it was possible to control a virtual assistant using ultra-high frequencies inaudible to humans but detectable by smart speakers’ microphones. Meanwhile, a U.K. researcher found a way to hack Amazon Echo devices that, while it likely wouldn’t transform your assistant into a laughing demon, could surreptitiously turn it into a spy-style listening device. Both of these attacks require direct access to someone’s Echo, which makes the potential for hacking much lower, but they still show that the devices are susceptible to foul play, and that could lead to even creepier Alexa scenarios.

The tech savvy could have predicted some of those risks, or the hacking of Wi-Fi–connected devices like baby monitors and thermostats. But a smart home assistant laughing of its own accord for no reason? This is a technological fail that only Hollywood could have imagined, and unfortunately, it’s only the beginning.