When Meta Tells Law Enforcement About Your Abortion
Lizzie O’Leary: Last spring, police in Norfolk, Nebraska, got a tip that a pregnant teenage girl had miscarried and that she and her mother had secretly buried the stillborn fetus. After an investigation, the police charged Jessica Burgess, who was 41, and her 17-year-old daughter, Celeste, with removing, concealing, or abandoning a dead human body, a felony. It was an upsetting case. Celeste reportedly said she’d given birth unexpectedly in the shower, and said her mother helped her put the fetal remains in a bag, drove them out of town, and buried them. There was also some evidence the remains had been burned.
Lizzie O’Leary: But this is where the case takes an important technological turn. According to court documents, Celeste had been roughly 28 weeks pregnant, and when the detective on the case asked Celeste to pinpoint exactly when the miscarriage occurred, she turned to her Facebook account, because she’d been messaging her mom that night at exactly 3:38 a.m. That gave the detective an idea: serve Facebook with a search warrant to get Jessica and Celeste’s messages. And what he found dramatically changed the case.
News clip: Back in April, prosecutors say Burgess ordered abortion pills on the Internet for her then-17-year-old daughter, Celeste Burgess, to take to abort her fetus.
Lizzie O’Leary: Celeste and Jessica had been using Facebook Messenger to discuss getting and taking abortion medication. And in Nebraska, abortion is banned after 20 weeks. Prosecutors took the extraordinary step of charging Jessica Burgess with two more felonies: performing or attempting an abortion on a pregnancy at more than 20 weeks, and performing an abortion as a non-licensed doctor. Even though the charges came before the Supreme Court overturned Roe v. Wade, Facebook’s involvement, which was made public in August, made the case national news.
News clip: Again with a new online protest and boycott of Facebook. The latest call for people to delete their accounts comes after it was revealed the Palo Alto company handed over users’ personal messages to law enforcement, which then led to charges related to an illegal abortion.
Lizzie O’Leary: But reporter Johana Bhuiyan, who covers tech and surveillance for The Guardian, was expecting a case like this.
Johana Bhuiyan: I actually did not think it would happen as quickly as it did, but I was waiting to see how quickly police would seize on the opportunity to use the private data people leave on tech platforms to obtain information about, charge, and prosecute people for abortions.
Lizzie O’Leary: A pretrial hearing for Celeste’s case is scheduled for later this month, and Jessica’s is set for next month with jury trials to follow.
Lizzie O’Leary: Today on the show: now that Roe is gone, will there be more cases like theirs? I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined.
Speaker 4: Stick around. Wow.
Lizzie O’Leary: According to court documents, a Norfolk detective served Facebook or Meta with a search warrant last June, and the company handed over information about both Celeste and Jessica’s accounts, which included their messages to one another.
Johana Bhuiyan: Facebook’s response when I was reporting this story was what every tech company says when I ask them about law enforcement requests, which is: we review the request really closely, we make sure it’s not overly broad, and we only give the information that we are absolutely compelled to give. Companies don’t really have a lot of leeway to fight these. These are legal subpoenas and warrants. So Facebook handed over this data.
Lizzie O’Leary: And what these chats seem to show is the mother and daughter pair, you know, procuring abortion medication.
Johana Bhuiyan: Well, the chats show that they were talking about getting, quote unquote, that thing out of her, talking about getting medication over the Internet.
Lizzie O’Leary: It seems clear they’re discussing a two-pill regimen for medication abortion. In one exchange, Jessica says, “Yeah, the one pill stops the hormones and then you got to wait 24 hours to take the other.” Celeste replies, “OK.” The messages are simultaneously intimate and significant, while also tossed off in everyday digital shorthand.
Johana Bhuiyan: I mean, why wouldn’t you expect this to be private, right? You’re having a very private conversation directly with your daughter. There is no real expectation that literally any law enforcement officer will be able to look at it, and pretty easily, too, right? They just fill this piece of paper out, send it to Facebook. Facebook will review it and, in the end, ultimately, in the vast majority of cases, hand that information over.
Lizzie O’Leary: Soon after the Dobbs decision, when the Supreme Court overturned Roe v Wade, many tech workers pushed their employers to protect data around abortion like location data or search histories. Companies like Google and Facebook say they will try to protect some things like health information, but in criminal investigations, it often doesn’t matter. In this case, the cops weren’t specifically asking for information about abortion.
Johana Bhuiyan: So those types of concessions, which are really narrowly curated and tailored to specific types of data, are not always helpful, because in the warrants, police are not always saying that they’re seeking health data. And in fact, this wasn’t health data. This was their messages. He was just asking for Facebook information. He also asked for pictures that the two women were tagged in. That has nothing to do with health data, and yet it can tell you so much about your health and your journey to seeking reproductive health care. Saying that you’re only going to protect a particular type of data actually doesn’t even protect that type of data, because what health data is Facebook really collecting about you? It’s really just your intimate connections and your conversations, the things that you like, the things you shop for. All of that can still paint a very similar picture.
Lizzie O’Leary: Well, yeah, one of the things that’s so striking in these messages between Celeste and Jessica Burgess is that they are seemingly talking about this medication, but then also talking about things like, oh, I friended so-and-so’s mom, and did you hear that thunder? It’s this really kind of everyday exchange.
Johana Bhuiyan: Yeah. I mean, I can’t think of a single person who would feel comfortable, regardless of whether they’re doing something that could be the root cause of a police officer charging you with a crime. Who wants their messages revealed? I would simply rather perish than have my messages on the Internet, even if it was very innocuous and totally fine, right? The everyday conversations of a human being should not be visible to anyone.
Lizzie O’Leary: How much do we know about what kind of requests platforms get from law enforcement, and how often they comply?
Johana Bhuiyan: Tech companies publish a transparency report about every six months, where they say: here’s how many government requests we’ve gotten that we’re allowed to tell you about, that don’t come with gag orders or nondisclosure agreements, and here’s the percentage of those requests that we respond to with some level of information. They also categorize it by the type of legal request. So there’s warrants, there’s subpoenas. I believe Google is even breaking out how many reverse search warrants they’re getting, which is when there’s not really a suspect in mind, and police say, hey, can you give me all of the information on everyone who searched for this particular term, or who happened to be in this location at this time? So the transparency reports are pretty detailed. I can’t think of Facebook’s numbers off the top of my head, but Google’s are in the tens of thousands every six months. They get so many legal requests, and respond to between 80 and 90 percent of them with some level of data.
Lizzie O’Leary: It’s rare that a tech platform refuses a request from law enforcement. The most famous example is clearly Apple’s refusal to unlock the suspect’s iPhone after the 2015 San Bernardino shooting. Johana says that kind of thing almost never happens.
Johana Bhuiyan: Twitter pushed back in a particular case where a federal agency was asking for information about an account that was, I believe at the time, a parody account of the DOJ. Twitter said no. I know Facebook has also pushed back in several instances. It’s definitely not the rule; it’s absolutely the exception. I think it’s really hard and expensive to push back on these cases, and they get so many, right? As much as they say that they look into every single one, if you have tens of thousands every six months, there are only so many where you’re going to say, OK, this might be overbroad, it’s worth us pushing back on it.
Lizzie O’Leary: Let me push back for a sec, because a detective or some other agency working on a case is going to say, well, wait a minute, this information possessed by Facebook or Google or what have you is helpful to me in solving crime. Why is that concerning?
Johana Bhuiyan: It’s concerning because, one, we have a constitutional protection against unreasonable searches and seizures. And in the last 10 to 20 years, we have entered this sort of new era. A warrant used to mean: I’m walking into your house, I’m going to look through the files in your home, and you get to see what the police officers and law enforcement are looking through in your home.
Johana Bhuiyan: When it comes to data that is stored on tech services, one, there’s very little transparency, because often these requests come with gag orders or nondisclosure agreements. But even when there is transparency, you really don’t have a ton of time or leeway to fight off that subpoena yourself. And you also don’t know what information and data they’re looking at and getting. This is your entire life. It’s not the contents of your desk or the contents of your home; it is every single thing that you do on a day-to-day basis, unless you are better about your privacy settings than some people. So that’s a big part of it. It is truly your whole life.
Johana Bhuiyan: The other thing is: what is illegal? Who gets to decide what is illegal and what is not? The Dobbs decision is a great example. I think that’s the biggest lesson from this: what you think is legal or legally protected today may not be legally protected tomorrow. And oftentimes what loses that protection is people’s civil liberties. It’s people of color, Black and brown people, particularly Black and brown women, who are disproportionately impacted by these surveillance systems. So today it’s them, right? Today it’s people like me. Tomorrow, as we’ve seen with Dobbs, it’s all of you.
Lizzie O’Leary: When we come back: could anything push the platforms to protect users’ privacy?

Lizzie O’Leary: How does the Dobbs decision figure into this? Because after the court’s decision, you did have a bunch of these companies saying, we will support our employees’ right to an abortion, sounding like they were taking these stands. And I wonder how that shakes out when they are faced with a law enforcement request.
Johana Bhuiyan: A lot of the companies said things that were really similar to Facebook, which is: we’re going to protect health data. Google said that if you’re searching for abortion clinics, they’ll mask that location. Again, the problem is, police are not sending subpoenas saying, hey, this is an abortion investigation, or, hey, we’re looking for health data specifically. They want your messages, they want your emails, they want your location data. And none of that falls under the categories of things that Google, Facebook, and all these other companies are saying they will protect and mask.
Lizzie O’Leary: Johana spoke with several researchers who tested whether an Android phone would actually mask abortion clinic search data, which is what Google had promised. But one researcher found that her phone gave her away.
Johana Bhuiyan: In one instance, while it didn’t say Planned Parenthood on her Google Maps route showing where she went a couple of weeks ago, there was a pin right where the Planned Parenthood location was, showing that she did stop there. Google basically said, well, it’s because Google Maps didn’t detect that she went to the Planned Parenthood; they detected that she went to the locations around it. But her search history still said Planned Parenthood. There are just so many other details and pieces of data that will point to the fact that this person was seeking reproductive health care, and it basically doesn’t matter that Google is not showing that she actually stopped at Planned Parenthood.
Lizzie O’Leary: I’m thinking about the incentives that platforms do and don’t have to protect user privacy. There has been a lot of conversation about moving toward end-to-end encryption; that is something Facebook has talked about for a while. How much would that change the picture vis-à-vis law enforcement?
Johana Bhuiyan: It would change it drastically. The only way to protect user data in a surefire, guaranteed way is to not collect it. Short of that, making all of that data end-to-end encrypted would effectively do a lot of the same thing, right? Tech companies would no longer be storing that data; those messages would only be between you and the person you’re speaking with, and there is no way, or at least no easy way, for anyone else to access the data. The company doesn’t even have it. Companies are motivated to do it to some degree. Look at Apple, for instance. Obviously, there’s room to grow; there are things they need to do to improve their privacy and their data protection. But they are kind of the gold standard of the industry right now, and it is a huge marketing point for them. They really push the fact that they are the most private and most secure company, and it works for them.
Johana Bhuiyan: Again, a lot of advocates and activists believe that they could do a lot more, but they do continue to iterate on the privacy and security systems that they have. They add more encryption constantly; they just announced a suite of new encryption services a couple of months back. And Facebook, like you said, to its credit, is also now testing end-to-end encryption for messaging, and has expanded that test to a wider group of people. But what I’d heard from them is that it’s pretty difficult. One of the pitfalls of end-to-end encryption, from a user experience perspective, is that if you delete all your messages, Facebook can’t help you recover them; recovering that data is no longer possible through the company. So there are small things like that that to some people might be a real trade-off. But I think the benefits of end-to-end encryption far outweigh any of these potential risks.
Lizzie O’Leary: But then, of course, doesn’t that in some way undermine a company’s business model? If it’s built on advertising, and that advertising is tailored to people’s data, that makes it hard for them to do what they want to do monetarily.
Johana Bhuiyan: Yes. The thing with companies whose business model is largely advertising is that there is not actually an overwhelming financial incentive to stop collecting your data and to encrypt all data. I reported on an internal effort at Google post-Dobbs, where employees were demanding concessions around protecting health care information. And I asked some of the folks who were leading that effort (the Jessica Burgess case had just come out): as we’ve seen, police are not asking for health data; they’re asking for your messages and other personal data and information. Would you consider adjusting your demands to reflect that reality?
Johana Bhuiyan: Their answer was fair and completely realistic and practical, which is that, as an employee of Google, I understand that Google cannot survive as a company without collecting and using this data. Even folks who are working to protect people seeking abortions realize, or sort of concede, that the way these businesses exist today is counter to data privacy and data protection.
Lizzie O’Leary: One of the things that I’m curious about, and I know this is a focus of a lot of your reporting, is that post-Dobbs, maybe it’s fair to say that people who had previously thought, data protection isn’t that relevant to me, I don’t have anything to hide, are thinking about it in a different way. I’m specifically thinking white women are thinking about it in a different way. I wonder if you’re starting to see that realization shift.
Johana Bhuiyan: It is great that people finally care that tech companies have all this data on you and that law enforcement can access it easily. That realization is great, and it is helpful toward the overall movement. But a lot of the efforts that have resulted from it are not necessarily that productive toward the larger cause of protecting people’s data, because, again, we’re asking these companies to protect our health care data, nobody’s asking for health care data, so it doesn’t protect people seeking abortions. And on top of that, you kind of give them a pass for all of the other data that they’re collecting on us and handing over to law enforcement.
Lizzie O’Leary: And of course, tech platform and social media data are used by law enforcement in all sorts of cases.
Johana Bhuiyan: Google has handed over information to ICE before; I reported on several cases related to that. There isn’t the same proactive effort to protect other kinds of data in the way there is to protect reproductive health care data, which I think is good and should be protected. I think it would be great if this more mass, popular realization that this level of data collection can be harmful to a lot of people resulted in broader privacy regulation that protected everyone.
Johana Bhuiyan: California did introduce a bill that actually does that. The initial intention of the bill is to protect folks seeking abortions, but it would also completely ban tech companies from responding to what they call reverse search warrants, meaning geofence warrants and keyword search warrants, which basically say: give me all of the people who searched for this, or give me all the people who were in this location. That is actually a great example of a bill that would protect folks seeking abortion but is more broad and general, one that would increase data privacy for everyone.
Lizzie O’Leary: Is the solution something like a data protection law, like exists in California or in Europe?
Johana Bhuiyan: Yeah, the solution is a federal data protection law, as always. That would be the very real answer to all of this: really robust data protection laws. But even state laws don’t do enough, right? You can dictate or try to regulate data in a particular state, but data doesn’t exist within the geographical boundaries of a state. So it’s very difficult to really control what a tech company is doing with your data across the country.
Lizzie O’Leary: Going back to the case of Jessica and Celeste Burgess: as this case moves forward toward trial, are you expecting more of these?
Johana Bhuiyan: Yes. This actually happened really early, because the initial request came before the Dobbs decision; it came out when it did, and got the attention it did, because of the Dobbs decision. But in speaking with public defenders and folks who are really focused on the use of surveillance tech in cases, they thought it was early. We would see more and more of these, because it takes a long time for police to find people, charge people, and take cases to trial. So in the next couple of months, the next year or so, we may begin to see more of this, unless by some miracle laws are passed that protect us. There are renewed efforts to pass another federal privacy bill, but we’ll see what happens. Unless that happens, I think it’s safe to say that we should expect a lot more of this.
Lizzie O’Leary: Johana Bhuiyan. Thank you so much for talking with me.
Johana Bhuiyan: Thank you for having me.
Lizzie O’Leary: Johana Bhuiyan is a senior reporter covering tech and surveillance for The Guardian. And that is it for our show today. What Next: TBD is produced by Evan Campbell. Our show is edited by Mia Armstrong Lopez. Alicia Montgomery is vice president of audio for Slate.
Lizzie O’Leary: Plus it helps support us. Just head on over to slate.com. What next? Plus, to sign up, you’ll get all your slate podcasts ad free. All right. We’ll be back on Sunday with another episode. I’m Lizzie O’Leary. Thanks for listening.
Lizzie O’Leary: TBD is part of the larger What Next family, and also part of Future Tense, a partnership of Slate, Arizona State University, and New America. And if you are a fan of what we’re doing here at this show, I have a request for you: you can join Slate.