Will Bitcoin Ruin El Salvador?

Lizzie O’Leary: I wonder if you could tell me the story of Latice Fisher.

Speaker 2: She lives in Mississippi and went to the hospital in 2017 because she was suffering a miscarriage.

Lizzie O’Leary: That’s Lily Hay Newman, a senior writer at Wired who covers information security and digital privacy. Fisher appeared to be fairly late in her pregnancy. The medical examiner said her fetus seemed to be more than 30 weeks old, and medical personnel were suspicious of Fisher. They reportedly gave her medical records to the police, who began investigating.

Speaker 2: And she turned over access to her phone. And on that device, law enforcement found search history where she was searching for information about abortion and abortifacients. Because of that finding and some other factors, they indicted her on second-degree murder charges.

Lizzie O’Leary: The charges against Fisher were eventually dropped, but Lily says her story, especially the role technology played, gives us a window into what digital life might look like if the Supreme Court overturns Roe v. Wade. We all leave digital trails. Sometimes they reveal intent, but often they just say that we were doing research or looking up some facts.

Speaker 2: In a post Roe America, it could reveal that someone is seeking an abortion or contemplating an abortion. But it could also just reveal natural curiosity. Your first thought is, Well, what about all those people sending totally normal, reasonable texts, having totally normal Google searches, just about their reproductive health or their questions or their thoughts, and suddenly all of that potentially being criminalized.

Lizzie O’Leary: Today on the show, Lily takes us through the world of online privacy after Roe, what it could look like and how to think about your own digital footprint. I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick with us.

Lizzie O’Leary: Before I talked to Lily, I read a story of hers with a pretty startling headline: “The Surveillance State Is Primed for Criminalized Abortion.” And I wanted to unpack that a little bit, both the mechanics but also the broader philosophy behind it, because it’s something that privacy advocates have warned about for decades, long before the draft Supreme Court opinion overturning Roe was leaked.

Speaker 2: It’s conceptual, the idea that when you establish a surveillance state, or expansive mechanisms to surveil the public, then as life happens and, you know, history happens, new things are criminalized, or people want things to be criminalized. That apparatus just exists, and anything can slot in very easily for that apparatus to be deployed for that purpose. So once the underlying infrastructure is there, the delay to be able to surveil people about that new thing gets less and less.

Lizzie O’Leary: What is that underlying infrastructure?

Speaker 2: Right. A good example is Immigration and Customs Enforcement, the agency in the United States that manages border crossings; you get your passport stamped after vacation. There’s been a lot of research on that agency, some of it recent. We also interact with TSA in the United States like that. These agencies have a much more expansive surveillance dragnet and ability to conduct bulk surveillance than one might think. Both of those agencies have expanded their powers to conduct device searches, to demand to search your device during a border crossing or during a stop.

Speaker 2: So going back to what we were talking about before: potentially revealing the contents of your digital life, what you’ve been searching, but also who you associate with, who you talk to, who you’re connected to on social media, what groups you’re in, all sorts of things like that. Some folks have the privilege in life to have never even thought about those powers, and didn’t even realize that those agencies have those surveillance powers. And this ties into national security and terrorism implications. Those agencies also work with other agencies that have pioneered data sharing between many law enforcement organizations.

Lizzie O’Leary: These are often known as fusion centers. They can spring up after a high-profile crime, like the school massacre in Uvalde, Texas, where multiple law enforcement agencies are working together.

Speaker 3: We’re working with our fusion center and our Real-Time Crime Center to monitor any potential threats that could occur.

Lizzie O’Leary: But they can also be used as ongoing data sharing operations between local, state and federal agencies.

Speaker 2: Data is kind of crossing these boundaries in ways we might not expect. A classic example is Department of Motor Vehicles data: your photo from your driver’s license, all the information that’s on your driver’s license, your license plate. You can see how law enforcement would say that all of this data is helpful in many types of investigations.

Lizzie O’Leary: Privacy advocates also talk about a couple of things that I kind of want to delve into on a granular level: keyword search warrants and geofence warrants. And I wonder if you could explain how those work, but also why they might be really important in a reproductive rights context.

Speaker 2: Keyword search warrants are, like, warrants to find out what someone’s been searching for. Right. So that really ties in to what we’ve been saying. But it’s important to think about, because, right, you can find out that information from someone’s own device and look at their history through their account or their device, through the person. But you can also go to the tech companies and say, you know, we have a warrant to find out about this person’s search history.

Speaker 2: Before widespread discussions of criminalization of abortion in the United States, this is why search engines like DuckDuckGo existed: because of, again, an abstract, conceptual premise that perhaps, for people, it’s not a feature to have a search history or a record of all your searches stored somewhere. And perhaps you don’t want a tech company to have all of that data sitting on a server somewhere. It’s part of this concept that you want to protect your data before you ever know what you’re protecting it against. And it goes to a point people will often raise: Well, I have nothing to hide. That’s not misguided, it’s not naive, but the premise of privacy advocacy, of this idea of preserving privacy digitally, is that you don’t yet know what you may want to hide.

Speaker 2: You also asked about geofence warrants. The concept here is that law enforcement can say, for people who had devices that were physically in a certain area at a certain time: What devices were there, and what all was going on? What were those devices doing?

Lizzie O’Leary: Geofence warrants were used during the unrest after George Floyd’s murder in Minneapolis. An AutoZone store was set on fire, and police used a geofence warrant to compel Google to provide account data on anyone who was near the store.

Speaker 4: The search warrant obtained shows a police officer asking Google for location history data from devices that reported a location within the geographical region bounded by coordinates, dates, and times.

Speaker 2: Again, we can see how this applies to something like a reproductive health clinic or a clinic known to provide abortions. Law enforcement might want to have just a rolling geofence warrant on that location. Right. But again, it’s an example of a mechanism that existed before all of this came up.

Lizzie O’Leary: The big tech companies, which have so much data on us, tend to publish transparency reports where they say what kinds of requests, like warrants, they got from law enforcement. But then there are commercial data brokers who are in the business of buying and selling our information without the same kind of disclosures. I saw a story recently that Motherboard, Vice’s tech site, was able to buy the location data of people who had visited abortion clinics from a data broker. Are there any kind of protections on that kind of data? Or if you’ve got the money, can you just buy it?

Speaker 2: Yeah, currently you can just buy it. That was excellent reporting, and an example of a situation where, because of that reporting, that specific data broker then said, OK, we’re not going to collect this type of data anymore, because we don’t want to have it. But not all data brokers would say that; not everyone will make the same determination. A lot of the advice is about reducing your footprint as much as possible, which, I just want to be very clear, isn’t always feasible. A lot of data minimization that can really help, and is great, is challenging for folks to execute.

Lizzie O’Leary: I wanted to ask you about that because, I mean, even as someone who is fairly cautious and fairly well-informed about my own data footprint, I make choices all the time out of convenience or necessity. I wonder if there are, I don’t know, a couple of greatest hits or top three practices that, you know, reproductive rights advocates told you about, or that you’ve seen, that don’t completely shut down somebody’s life but might help protect them.

Speaker 2: Yeah, there are some. And I want to temper my dark mood here with some optimism, because there are some things that are pretty easy to do that go a long way. The one I’ll start with is using a search engine like DuckDuckGo that doesn’t log your searches. That’s a good one. And it speaks to what we were talking about before: it’s much more difficult, or hopefully impossible, for law enforcement to say you were searching for abortifacients or looking for information on abortion clinics if there’s no record of those searches. And it’s not just abortion stuff or reproductive rights; these are good practices in general. Another is using an end-to-end encrypted messaging service.

Speaker 2: At Wired, we particularly recommend Signal. Another crucial thing that everyone recommends, and that I would recommend, is turning on auto-deleting messages. Signal offers that at a bunch of time increments, from, you know, a couple of minutes up to a week. WhatsApp, I believe, only offers one week. But whatever service you’re using, whatever they offer, take them up on it and just turn it on. And the thing with that is: just turn it on right now, while you’re listening to this podcast, because once you’ve said the thing you would want to delete, you can’t turn it on after that or right then. So just turn it on now.

Lizzie O’Leary: Well, we talked about this a little bit when we were talking about law enforcement, but these sorts of best practices, do you think of them as particularly relevant to people who live in places where abortion might be automatically criminalized if Roe were overturned, if you live in a state that has a trigger law? Or does this apply more broadly? You know, if I live in California but then I visit Oklahoma and do some Google searches there.

Speaker 2: I think, you know, immigration is an important example, because we already have the example of things like sanctuary cities. And we’ve seen in recent years lawmakers and federal officials threaten that a sanctuary city won’t get funding for certain things. So we’ve seen this stuff really weaponized already.

Lizzie O’Leary: When it comes to data, especially around reproductive health and abortion, it’s really hard to know how this is going to play out.

Speaker 2: Your state may be sharing information with another state that could provide details on a person’s movements in your state and what they were doing while they were there, simply because of these pre-existing data sharing relationships about other things. States may need, I would argue, to start thinking about what the implications of these data sharing programs actually are, and what the potential knock-on effects or unintended consequences are. If your priority is, say, terrorism investigation or detection or something like that, what are the implications for lots of other topics, lots of other behaviors or actions that your citizens and your state might take?

Lizzie O’Leary: There is obviously something that feels small-bore about talking about individual personal measures when discussing systemic issues and, you know, mega-corporations. But in your coverage and in your history of covering these issues, do you see any movement toward kind of a push for broader online privacy protections? California, you know, has a privacy law that does not seem to have been widely copied. But I wonder, is that going anywhere?

Speaker 2: I do think there is more fluency around these ideas. I mean, all of you listening have an instinct about what I’m talking about already, right? And what we’re talking about today. So I think that in itself has come a long way. I think there’s more and more evidence that having some type of parameters around how we deal with folks’ personal data is necessary, that there should be legislation about these things, and that it can be implemented even in sort of imperfect ways and then refined.

Speaker 2: And like you’re saying, states like California are doing that in the U.S. So, yes, I think there is progress there. But I don’t know. It’s tough. And if you’re an originalist, I can’t imagine you’re going to have a lot of interest in something that didn’t exist in 1776.

Speaker 2: So I would argue that the concepts are the same and that all of this is sort of technology-agnostic. Whether your technology is paper or computers or retina scanners or brain-to-brain interfaces or whatever, the actual privacy concepts are enduring and transferable, or sort of translatable. It’s perhaps telling that we haven’t gotten very far yet; we’ve desperately needed federal privacy legislation in the United States for many years now.

Lizzie O’Leary: Lily Hay Newman, thank you so much for talking with me.

Speaker 2: It’s my pleasure to be here.

Lizzie O’Leary: Lily Hay Newman is a senior writer for Wired. That is it for the show today. TBD is produced by Evan Campbell. Our show was edited by Tori Bosch. Joanne Levine is the executive producer for What Next, and Alicia Montgomery is vice president of audio for Slate. TBD is part of the larger What Next family, and we’re also part of Future Tense, a partnership of Slate, Arizona State University, and New America. We will be back next week with more new episodes. I’m Lizzie O’Leary. Thanks for listening.