At around 3:45 a.m. on March 24, someone in Fort Lauderdale, Fla., used a mobile phone to Google “chemicals to passout a person.” Then the person searched Ask.com for “making people faint.” Then Google again, for “ways to kill people in their sleep,” “how to suffocate someone,” and “how to poison someone.”
The phone belonged to 23-year-old Nicole Okrzesik. Later that morning, police allege, she and her boyfriend strangled 19-year-old Juliana Mensch as she slept on the floor of their apartment. The Google searches, along with incriminating text messages between Okrzesik and her boyfriend, came to light as authorities investigated Mensch’s death. But what if they could have been alerted to the suspicious-sounding searches immediately? Could they have rushed to the apartment and saved the girl’s life?
In Minority Report, police use mutant psychics to predict murders before they happen and lock up the would-be killers. The mutant psychics are fantasy, but when we keep hearing about cases in which people Google their crimes before they commit them, the concept of a police “pre-crime” unit is no longer so far-fetched. The most interesting thing about the idea of using Web searches to predict murders: It might be perfectly legal.
Police already draw on other types of data to anticipate crimes. Police departments in Chicago, Los Angeles, and Santa Cruz, Calif., have been experimenting with “predictive policing,” in which computer algorithms sift through reams of crime data to tell officers where and when crimes are likely to occur. That’s a long way, though, from the type of pre-cognition portrayed in Minority Report. It only works for relatively predictable crimes such as burglary or auto theft, and it doesn’t tell you anything about who might be planning the heists.
Web search data, by contrast, contains information about specific individuals’ thoughts and plans. In theory, Google or Ask.com could have flagged Okrzesik’s search queries as suspicious and sent the cops her device’s IP address. In the Hollywood script, a vigilant officer would notice the alert, rush to the scene, and knock on the door just as Mensch’s assailants were about to do her in.
In reality, there are a few obstacles to that scenario. For starters, police would need instant access to the search data and a way to connect it to a physical address. These days they usually get electronic records only after a crime has been committed and they’ve built up enough evidence to obtain a warrant. They use the data not to prevent crime but to build their case for arrest and conviction. In last year’s high-profile Casey Anthony trial, for instance, prosecutors told the jury they’d searched Anthony’s computer and found 84 queries related to “chloroform” in her browser history, corroborating their theory that Anthony had used the chemical to subdue her 2-year-old daughter Caylee before killing her. (Anthony was acquitted of the killing, and it later turned out that the term had been searched just once—and Casey Anthony’s mother took the stand to say she was the one who searched for it. She said she had been trying to look up “chlorophyll” to see whether plant matter was dangerous for her dog to eat.)
Law enforcement agents do sometimes monitor communications in real time, as when they listen in on a suspect’s phone conversations. But federal privacy laws require a special wiretap warrant for eavesdropping, obtainable only after police have probable cause to believe an individual is guilty of a crime. (A 2008 law that allows warrantless wiretapping under certain circumstances has been appealed to the Supreme Court.) So even if it were technically feasible, police wouldn’t be allowed to monitor everyone’s phone conversations for suspicious words or phrases. The same ostensibly holds for monitoring people’s email, text messages, and Web browsing.
As for Web searches, police probably can’t require a company like Google to share its data with them without good reason, legal experts say. But unlike phone conversations, emails, and text messages, search queries aren’t protected from voluntary disclosure to authorities, notes Orin Kerr, a computer crime expert at George Washington University. When you pick up the phone to call a friend, the reasoning goes, you’re communicating with that friend, and the phone company is a third party that doesn’t have a right to eavesdrop. But when you type a query into Google’s search bar, you’re communicating directly with Google. That makes Google the “end user” of your information, and gives it the legal prerogative—at least in theory—to share that information with anyone it likes, including the police or the FBI. Kerr calls it a hole in the country’s privacy laws and has called for it to be patched.
In practice, it’s unlikely Google would do such a thing unless it felt compelled to. Asked about the company’s policies on sharing information with law enforcement, spokesman Chris Gaither told me only that Google complies with valid legal processes, takes users’ privacy seriously, and tries to notify users when it gets requests for their data. And when it does get such requests, it tries to make sure they’re tailored as narrowly as possible. The fact is that, in the absence of a law requiring it to share search information with the government, Google has more incentive to protect its customers’ privacy than to serve as an ongoing informant for the cops.
But if the idea of Internet companies sharing users’ seemingly private information with law enforcement sounds far-fetched, consider that the House of Representatives recently passed a bill to explicitly legalize and promote just that behavior. The Cyber Intelligence Sharing and Protection Act, or CISPA, would encourage the free flow of information on “cyber threats” between the government and major Web firms. Under the law, those firms would be immune from lawsuits arising from the sharing of such information. Several major tech companies, including Facebook and Microsoft, endorsed the bill. After an outcry from privacy groups, though, the Obama administration threatened a veto, and the bill has not been introduced in the Senate.
There are, though, other types of search data that companies already share with federal officials in real time. For instance, Google uses its analytics to report flu trends to the Centers for Disease Control and Prevention. It anonymizes the data to make sure it can’t be traced to individual users, but the precedent is still one of instantaneous sharing with the government. In the U.K., meanwhile, legislation is afoot that would require Web companies to monitor users’ searches for terms indicating that they might commit suicide. Such searches would presumably produce an alert to law enforcement, who could then try to intervene to save the user’s life.
Even if Google—or Ask.com or Microsoft’s Bing—did come to an understanding with American law enforcement, there would still be the practical problem of sifting the useful data from the noise. As Casey Anthony’s mother would tell you, there are plenty of reasons to search for chloroform (or chlorophyll) other than to plan a murder. Likewise, any police department that tried to track down all the searches for “how to dispose of a body without getting caught” would quickly find itself overwhelmed. And in Okrzesik’s case, the poor grammar in “chemicals to passout a person” might have kept it off the radar of even the most sophisticated monitoring algorithm.
Yet the next three Google searches on Okrzesik’s phone—“ways to kill people in their sleep,” “how to suffocate someone,” and “how to poison someone”—seem to clearly indicate a strong curiosity about how to kill a person. One can also imagine other searches—say, a series of queries about the ingredients used to make anthrax—that law enforcement agents might like to know about.
Unlike in Minority Report, such search data probably wouldn’t be enough to justify a search warrant on its own, let alone an arrest and conviction. But David Sklansky, a criminal law professor at UC-Berkeley, says it could constitute the reasonable suspicion needed to pull someone over or stop him on the street. And police don’t need any reasonable suspicion at all to knock on someone’s door and ask what he’s up to, provided the person agrees to talk.
Most of the time, even these seemingly incriminating searches would probably amount to nothing, like a burglar alarm going off in a house (a false alarm more than 95 percent of the time). It’s also possible that if it becomes public knowledge that cops and search engines are collaborating, an Internet mob could start mass-searching “ways to kill people in their sleep,” overwhelming law enforcement with phony queries. But in rare cases, it’s conceivable that search sleuthing could lead to saved lives—perhaps a great number of saved lives, in the case of a terrorist attack. That prospect, however slim, may be enough to convince search engines to at least explore the potential for increased collaboration with authorities. Murderers and the morbidly curious, be warned.