Last spring, police in Norfolk, Nebraska, got a tip that a pregnant teenage girl had miscarried and that she and her mother had secretly buried the stillborn fetus. After an investigation, the police charged Jessica Burgess, who was 41, and her 17-year-old daughter, Celeste, with “removing, concealing, or abandoning a dead human body,” a felony. In the course of the investigation, a detective served Facebook with a search warrant for Jessica and Celeste’s messages. What he found dramatically changed the case.
Celeste and Jessica had been using Facebook Messenger to discuss getting and taking abortion medication. In Nebraska, abortion is banned after 20 weeks, and prosecutors took the extraordinary step of charging Jessica Burgess with two more felonies: performing or attempting an abortion on a pregnancy at more than 20 weeks, and performing an abortion as a nonlicensed doctor. The charges came before the Supreme Court overturned Roe v. Wade, but Facebook’s involvement, which was made public in August, made the case national news. A pretrial hearing for Celeste’s case is scheduled for later this month, and Jessica’s is set for next month, with jury trials to follow.
On Friday’s episode of What Next: TBD, I spoke with Johana Bhuiyan, senior reporter for the Guardian, about whether, now that Roe is gone, we can expect to see more cases like the Burgesses’. Our conversation has been edited and condensed for clarity.
Lizzie O’Leary: According to court documents, a Norfolk detective served Meta with a search warrant last June, and the company handed over information about both Celeste’s and Jessica’s accounts, which included their messages to one another. What did the company tell you about this case?
Johana Bhuiyan: Facebook’s response was what every tech company says when I ask them about law enforcement requests, which is: they review the request really closely, they make sure it’s not overly broad, and they only give the information that they are absolutely compelled to give. Companies don’t really have a lot of leeway to fight these. These are legal subpoenas and warrants, so Facebook handed over this data.
Soon after the Dobbs decision, when the Supreme Court overturned Roe v. Wade, many tech workers pushed their employers to protect data around abortion, like location data or search histories. Companies like Google and Facebook say they will try to protect some things like health information, but in criminal investigations, it often doesn’t matter. In this case, the cops weren’t specifically asking for information about abortion.
Those types of concessions that are really narrowly curated and tailored to specific types of data are not always helpful, because in the warrants, police are not always saying that they’re seeking health data. And in fact, this wasn’t health data. This was their messages. [The police officer] was just asking for Facebook information. He also asked for pictures that the two women were tagged in. That has nothing to do with health data—and yet it can reveal so much about your health and your journey toward reproductive health care.
Saying that you are only going to protect a particular type of data actually doesn’t even protect that type of data, because what health data is Facebook collecting about you? It’s really just your intimate connections and your conversations, the things that you like, the things you shop for. All of that can still paint a very similar picture.
How much do we know about what kind of requests platforms get from law enforcement and how often they comply?
Tech companies publish a transparency report about every six months, where they say, “Here’s how many government requests we’ve gotten that we’re allowed to tell you that we’ve gotten, that don’t come with gag orders or nondisclosure agreements. Here’s the percentage of those requests that we respond to with some level of information.” The transparency reports are pretty detailed, and the requests Google gets number in the tens of thousands every six months. They get so, so many legal requests and respond to between 80 and 90 percent of them with some level of data.
Tech platforms don’t often refuse requests from law enforcement. The most famous example is Apple’s refusal to unlock the suspect’s iPhone after the 2015 San Bernardino shooting. Why are these cases so rare?
It’s really hard and expensive to push back on these cases. As much as they say that they look into every single one, if you have tens of thousands every six months, there are only so many where you’re going to say, “OK, this might be overly broad. It’s worth us pushing back on.”
Let me challenge this for a second, because a detective or law enforcement agency working on a case is going to say, “Well, wait a minute. This information that is possessed by Facebook or Google is helpful to me in solving crime.” Why is that concerning?
It’s concerning because, one, we have a constitutional protection against unreasonable searches and seizures. A warrant used to be, I’m walking into your house, I’m going to look through the files in your home. When it comes to data that is stored on tech servers and things like that, there’s very little transparency, because often they come with gag orders or nondisclosures. But even when there is transparency, you really don’t have a ton of time or leeway to fight off that subpoena yourself, and then you also don’t know what information and data they’re looking at. This is your entire life. It’s not the contents of your desk or the contents of your home. It is every single thing that you do on a day-to-day basis—unless you are better about your privacy settings.
The other thing is, what is illegal? Who gets to decide what is illegal and what is not illegal? The Dobbs decision is a great example that what you think is legally protected today may not be legally protected tomorrow. Oftentimes, the first protections to fall are people’s civil liberties, and it’s people of color—particularly Black and brown women—who are disproportionately impacted by these surveillance systems. So today it’s people like me. Tomorrow, as we’ve seen with Dobbs, it’s all of you.
How does the Dobbs decision figure into this? After the court’s decision, a bunch of these companies said, “We will support our employees’ right to an abortion,” sounding like they were taking these stands. How does that hold up when they’re faced with a law enforcement request?
A lot of the companies said things that were really similar to Facebook, which is like, “We are going to protect health data.” Google said that if you’re searching for abortion clinics, they’ll mask that location. Again, the problem is police are not sending subpoenas saying, “Hey, this is an abortion investigation” or “Hey, we’re looking for health data specifically.” They want your messages, they want your emails, they want your location data, and none of that falls under the categories of things that Google, Facebook, and these other companies are saying that they will protect and mask.
You spoke with several researchers who tested whether an Android phone would actually mask abortion clinic search data—which is what Google had promised. But one researcher found that her phone gave her away.
In one instance, while it didn’t say Planned Parenthood on her Google Maps route, there was a pin right where the Planned Parenthood location was, showing that she did stop there. Google basically said, “Well, it’s because Google Maps didn’t detect that she went to the Planned Parenthood. They detected that she went to the locations around it.” But her search history still said Planned Parenthood. There are just so many other details and pieces of information and data that will point to the fact that this person was seeking reproductive health care.
There has been a lot of conversation about moving toward end-to-end encryption. How much would that change the picture vis-à-vis law enforcement?
It would change it drastically. The only way to protect user data in a surefire, guaranteed way is not to collect it. Short of that, making all of that data end-to-end encrypted would effectively do a lot of the same thing. Tech companies would no longer be storing that data. That information and those messages would only be between you and the person that you’re speaking with, and there is no easy way for anyone else to access the data. The company doesn’t even have it.
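The mechanism Bhuiyan describes can be sketched with a toy key exchange. This is a minimal illustration only, not the Signal protocol that Messenger’s encrypted mode actually uses, and the XOR step stands in for a real cipher; the point is that the shared key is derived on the users’ own devices, so a server relaying the conversation holds only public values and ciphertext.

```python
# Toy sketch of end-to-end encryption: a Diffie-Hellman key exchange
# in which the shared key never leaves the two users' devices.
# NOT real cryptography -- illustrative only.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; toy-sized, not a production group
G = 3


def keypair():
    """Generate a private exponent and the public value to share."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)


alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Only alice_pub and bob_pub ever cross the server. Each side combines
# the other's public value with its own private one; both arrive at
# the same key without that key being transmitted anywhere.
alice_key = hashlib.sha256(str(pow(bob_pub, alice_priv, P)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_pub, bob_priv, P)).encode()).digest()

msg = b"meet at noon"
# XOR with the derived key stands in for a real cipher here.
# The relaying server would store only this ciphertext.
ciphertext = bytes(m ^ k for m, k in zip(msg, alice_key))
plaintext = bytes(c ^ k for c, k in zip(ciphertext, bob_key))
```

A warrant served on the relay in this sketch could compel only the ciphertext and the public values, neither of which recovers the message without a private key that exists only on a user’s device.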
But then, of course, doesn’t that in some way undermine a company’s business model? If platforms are built on advertising and that advertising is tailored to people’s data, that makes it hard for companies to do what they want to do monetarily.
Yes. So with companies whose business model is largely advertising, there is not actually an overwhelming financial incentive to stop collecting your data and to encrypt all data. I reported on an internal Google effort from employees post-Dobbs where employees were demanding concessions around protecting health care information, and I asked some of the folks who were leading that effort, “As we’ve seen, police are not asking for health data. They’re asking for your messages. They’re asking for other personal data and information. Would you consider adjusting your demands to reflect that reality?” And their answer was fair and completely realistic and practical, which was “As an employee of Google, I understand that Google cannot survive as a company without collecting and using this data.” Even folks who are working to protect people who are seeking abortions concede that the way that these businesses exist today is counter to data privacy and data protection.
Post-Dobbs, many people who had previously thought, “Data protection isn’t that relevant to me, I don’t have anything to hide,” are thinking about it in a different way. Specifically, white women are thinking about it in a different way. Are you starting to see that shift?
It is great that people finally care that tech companies have all this data on you and law enforcement can access it easily. That realization is helpful toward the overall movement. But a lot of the efforts that have resulted from that realization are not necessarily very productive toward the larger cause of protecting people’s data. Again, we’re going to ask these companies to protect our health care data, but nobody’s asking for health care data, so it doesn’t protect people seeking abortions. And then on top of that, you kind of give them a pass for all of the other data that they’re collecting on us and handing over to law enforcement.
What would be a better solution?
The solution is a federal data protection law. State laws don’t do enough. You can dictate or try to regulate data in a particular state, but data doesn’t exist within the geographical boundaries of a state. So it’s very difficult to control what a tech company is doing with your data across the country.
Do you expect to see more cases like the Jessica and Celeste Burgess case in the future?
Yes. It takes a long time for police to find people, charge people, take it to trial. So I think in the next couple of months, in the next year or so, we may begin to see more of this—unless by some miracle, laws are passed that protect us. There are renewed efforts to pass another federal privacy bill. But unless that happens, I think it’s safe to say that we should expect a lot more of this.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.