More than 50 million smart speakers have been installed in American households. For police, that means 50 million potential virtual witnesses to crimes that occur in the privacy of one’s home. But the legal rules governing this type of privacy-invading, Internet of Things–enabled evidence remain very unclear.
The question matters because one of those smart speakers was just called to be a witness in a brutal double homicide in New Hampshire. Timothy Verrill stands accused of stabbing Christine Sullivan and Jenna Pellegrini to death over suspicion that one of them was a police informant. An Amazon Echo was present at the crime scene, and this month a state judge ordered Amazon to turn over the recordings from the device. As of Monday, Amazon has not decided whether it will comply with the court order.
The New Hampshire murder case provides a window into the future of digitally assisted prosecutions and the emerging internet of evidence. It shows both the revealing power of “smart” consumer surveillance technologies and a potential overreach by police investigators. It also raises the hard questions of who should be responsible for protecting personal privacy—companies or the courts, and by what legal standard.
The facts are that on Jan. 27, 2017, Sullivan and Pellegrini were brutally stabbed to death. Their bodies were discovered under a tarp by Sullivan’s boyfriend, Dean Smoronk, when he returned home from a trip to Florida. The question for police investigators was who killed the women and why. So, the search for evidence began in and around Smoronk’s house.
The home provided a wealth of digital clues. First, the high-tech home required a biometric reader for entry. The door was programmed to allow only three people to enter: Smoronk (the homeowner, who was out of town), Sullivan (one of the victims), and Verrill (the suspect). Second, video security cameras caught Verrill entering the house a few hours before the slayings wearing a flannel shirt; after the stabbings, investigators recovered a flannel shirt, wrapped around three large knives, buried in the yard. The cameras also caught Verrill dismantling the cameras during the hours just beforehand. Finally, video surveillance cameras outside local Lowe’s and Walmart stores showed Verrill purchasing salt and ammonia cleaning products, the residue of which was found inside the house (apparently in an effort to clean up the bloody scene).
These clues and others offer a compelling circumstantial case that Verrill was present at the house during the time of the killings, with the women, and likely tried to clean up the incriminating evidence. But the evidence doesn’t tell us what happened during those fateful hours.
Enter the Amazon Echo and the hope that the government would be able to turn Alexa into a prosecution witness. The Amazon Echo works by continually listening for questions and commands and responding appropriately. The device is programmed to listen for “wake words” and only then respond to the resulting query. Once the wake word is spoken, the ensuing question is recorded on Amazon’s servers. For police detectives who know a homicide has occurred in the general proximity of a smart device, the recordings offer a potential gold mine of clues. “Alexa, how do I bury a body?” would be a solid piece of incriminating evidence after a killing.
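The gating logic described above can be sketched in a few lines. This is a simplified illustration, not Amazon’s actual firmware: it stands in for the idea that audio heard before the wake word stays local and is discarded, while only the query that follows the trigger is captured and sent upstream.

```python
# Simplified illustration (NOT Amazon's actual implementation) of
# wake-word gating: only speech that follows the trigger phrase is
# captured; everything before it is discarded.
WAKE_WORD = "alexa"

def capture_queries(audio_stream):
    """Return only the utterances that follow the wake word.

    `audio_stream` is a list of transcribed utterances standing in
    for a continuous microphone buffer.
    """
    recorded = []
    armed = False
    for utterance in audio_stream:
        if armed:
            recorded.append(utterance)  # this is what reaches the server
            armed = False               # gate closes after one query
        elif utterance.lower().startswith(WAKE_WORD):
            armed = True                # wake word heard: open the gate
    return recorded

print(capture_queries(["background chatter", "alexa", "what time is it"]))
# → ['what time is it']
```

The point for the legal analysis is visible in the sketch: if the wake word is never uttered, nothing passes the gate, and there is nothing on the server for a warrant to reach.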
The problem, of course, is that without listening to the recordings police do not know whether any such helpful evidence actually exists—they have a hope, but no actual knowledge, that anything was recorded by the device. The further problem is that the constitutional requirement of “probable cause” to obtain evidence stands in their way.
Probable cause is the legal standard that courts require to obtain certain forms of evidence. It is a standard of certainty well below “beyond a reasonable doubt” and likely below a “preponderance of the evidence” (51 percent), but still significant enough that it can result in an individual’s arrest, or their home being searched, or their property seized. Judges take the standard seriously because it is the only legal protection from the power of the police to search you or your stuff. Applying the probable cause standard to these facts, police must believe to a reasonable degree of certainty that the Amazon Echo recordings will contain evidence of the crime.
The judge in New Hampshire agreed with police that there was probable cause evidence would, in fact, exist on the Amazon Echo recordings. The court ruled:
The court finds there is probable cause to believe the server(s) and/or records maintained for or by Amazon.com contain recordings made by the Echo smart speaker from the period of Jan. 27 to Jan. 29, 2017 … and that such information contains evidence of crimes committed against Ms. Sullivan, including the attack and possible removal of the body from the kitchen.
But is that true? Those recordings may exist, but there’s hardly a reasonable probability that they do. There is no evidence that a wake word was uttered or whether the device was ever queried.
Judges facing these requests are in a quandary. To know that a crime has occurred in a particular place does not necessarily mean that a smart device holds clues to the crime. If all police needed to prove was probable cause of the underlying crime, they could demand all available smart data after all suspected crimes. That would mean if police have probable cause that you are using drugs in your home, police could demand your Echo information just in the hope that the device caught incriminating comments. If police reasonably believe that there is domestic abuse in the home, police could demand your Echo information. In essence, probable cause of a crime would substitute for probable cause that the Echo has information about that crime. It is a big and important distinction if you think about all of the smart devices around you, on you, and in your home.
It is also a hard question because in the analog world, judges have become rather used to signing warrants to search homes, cars, and diaries after a crime. The same logic that justifies a warrant to search a murder suspect’s home because it might (or might not) offer clues to the homicide has guided search warrants for decades.
The open question is whether the information’s digital nature makes these admittedly overbroad law enforcement requests different in kind, and whether the world of digital clues from our smart homes, cars, toothbrushes, and beds should be opened up to investigators hoping for a lucky break. Would a judge also be willing to allow police to obtain a murder suspect’s smartphone location data, search history, Fitbit data, or black-box car data, even without evidence that they had any particular relevance or connection to the crime? The answer may well be yes, but to arrive there, we need to consider the game-changing privacy questions at issue.
Because of these privacy and particularity concerns, Amazon and other technology companies have pushed back on requests for data from smart devices. In the New Hampshire case, Amazon stated: “Amazon will not release customer information without a valid and binding legal demand properly served on us. … Amazon objects to overbroad or otherwise inappropriate demands as a matter of course.” In another murder case where police demanded Amazon Echo recordings, in Arkansas, Amazon filed a 91-page legal motion objecting to the request, claiming a commercial free speech right to the information requested. Amazon eventually relented when the homeowner consented to the search, but the aggressive litigation strategy hinted at its oppositional stance. And interestingly enough, the evidence helped clear the suspect in Arkansas.
Amazon’s litigious response could well be motivated by self-interest arising from fears of being overburdened with law enforcement hunches. It might also be a business calculation that if the Echo becomes the witness, Amazon becomes a snitch (which might not be great branding). But in this case, Amazon’s lawyers may also be right on the law and the larger principles at stake.
Judges all too often accede to law enforcement requests for evidence because they don’t understand the technology at issue. In this case, without evidence that the wake word was used or that the device was ever queried, is there any reason to believe that the Echo recorded anything relevant? Conflating probable cause of a crime with probable cause to search the Amazon Echo for evidence of that crime is probably an overreach, and opens up inquiry into all forms of Internet of Things–enabled evidence.
So, perhaps it is a good thing that Amazon lawyers are pushing back on overbroad evidentiary requests, forcing police, prosecutors, and judges to think through the logic of their demands. Digital clues from smart devices open up powerful new police powers at a time when big-data policing is already undermining privacy and security. Lawyers at Amazon have seen the dangers and are signaling a fight ahead, but it should not be up to private companies to protect personal privacy rights. In our A.I.-assisted world, we defer to Alexa on many questions, but the meaning of probable cause probably shouldn’t be one of them.