S1: Slate Plus members, I’m here to remind you to take the Slate survey. It’ll be open through April 1st. This is your chance to tell us what you think about Slate podcasts. It’ll only take a few minutes, and you can find it at Slate.com slash survey.
S2: Hey, everyone. This is Henry Grabar. I’m in for Lizzie this week. Before we get started, I just want to give you a little heads up. I am recording this in my closet in my apartment in Chicago, so I apologize for any inconsistencies with the sound. I guess this is the new normal.
S3: Remember, if you can, last Friday. Thank you very much.
S4: Beautiful day in the Rose Garden.
S2: Major League Baseball had just suspended its season. Broadway theaters had just closed their doors. The previous day had seen Wall Street’s worst rout since 1987. And Donald Trump, for the first time, appeared to be taking the coronavirus seriously.
S4: Today, I’d like to provide an update to the American people on several decisive new actions we’re taking.
S5: I was lying in bed listening to the press conference and Trump was speaking.
S2: That’s Mason Marks. He’s a lawyer who teaches at Gonzaga, and he studies health law, technology, and privacy.
S5: So Trump was standing in the Rose Garden of the White House, and he was flanked by a group of CEOs from some of the largest retailers, like Walmart and CVS.
S6: And he was mentioning how some of these retailers were going to partner with the administration to offer drive through testing for the novel coronavirus.
S4: The goal is for individuals to be able to drive up and be swabbed without having to leave your car.
S5: But then he just mentioned Google kind of nonchalantly, and that really caught my attention. So I kind of sat up in bed.
S4: I want to thank Google. Google is helping to develop a website to determine whether a test is warranted and to facilitate testing at a nearby convenient location.
S7: Now, Mason was paying attention. He’d written last year for Slate about a Google partnership that gave the company access to 50 million health records.
S2: Google, Mason argued then, was working towards a, quote, unrivaled consumer health surveillance empire.
S6: Google, according to Trump, was going to create a Web site in which people could enter their symptoms and their medical information, and it would direct them, if necessary, to the drive through testing. And I thought that was really strange.
S7: As it turns out, Trump didn’t have it quite right. It was a company called Verily, part of Google’s parent company, Alphabet.
S2: And they were working on something, but not nearly at the scale that the president had implied. Tonight, mounting questions about the program the president promised the American people.
S8: The country is being misinformed. I think it’s fair to say the country is being lied to about this. When he said that, everyone kind of made fun of him.
S9: They said, you know, he lied. Trump had talked up this big thing that wasn’t actually happening. But that’s not your take, right?
S6: No. I think what Trump announced is exactly what is happening. The program might not be as far along as he led some people to believe, or at least not as far along as some people interpreted his announcement.
S2: But I think that his announcement is very consistent with what is actually happening. Verily set up a COVID-19 testing website for two Silicon Valley counties. But Mason thinks this site will soon be America’s go-to platform for the COVID test. And while Washington can use all the help it can get right now, he thinks this combination of Google and health care should make you think twice.
S10: Not only will they be running the coronavirus testing and taking in data on people’s symptoms and their medical histories and their age and their location, but also the results.
S11: Mason sees this virus as an opportunity for Google to finally crack an industry they’ve been trying to get into for over 10 years. And they’re not alone. Today on the show: what the COVID-19 pandemic reveals about Big Tech’s ambitions for health care. Is this a way forward for our country’s broken health care system? Or just another attempt by Silicon Valley to mine our personal lives for data? I’m Henry Grabar, in for Lizzie O’Leary. This is What Next: TBD. Stay with us.
S2: So earlier this week, Verily Life Sciences launched that screening and testing portal for Silicon Valley residents. Verily is Google’s sister company. It specializes in health care and biotech. It’s a separate company from Google. But to get information about coronavirus testing, you still need to sign in with your Google account.
S9: While this thing only exists in two counties now, you think that this might become the default system for testing for this virus in the United States.
S6: Yeah, I think that is the intention. And Verily has been working not only with the federal government, but also with the state of California, with the public health department and with the governor, Gavin Newsom, to develop this platform. And they’ve been very open about the fact that they intend to expand it to the entire state of California and then potentially nationwide.
S12: So let’s step back for a second. What is the point of online screening in the first place? We know that hospitals are in triage mode right now and that these companies that have offered private testing in parking lots are also a bit overwhelmed. Why do we need an online portal and what’s that supposed to do to make the system run better?
S6: So I think the portal in theory, it could be very useful because right now the number of tests is limited and the number of testing sites is limited.
S10: So we do need some way to determine who really should be tested and who perhaps should not. And one interesting feature of the Verily portal is that if you have relatively severe symptoms and you put that into the portal, it will tell you that it can’t help you, and it will tell you to seek medical attention immediately. And that’s been criticized quite a bit in the media. But that might actually be a pretty useful feature, because if someone is actively experiencing shortness of breath and having really severe symptoms, you might want that person to go directly to an emergency room to get treated. So there is an important triage function there, determining how to best allocate those limited resources.
S12: So I told Verily a little white lie and said that I was a resident of Santa Clara or San Mateo County so I could take a look at the website. And once you get into the portal, there is some fine print, and it says your coronavirus testing data will never be joined with data from Google and won’t be used for research purposes. And it says Google’s access is strictly limited to services like cloud services, security services, data storage, website hosting, support functions. So they seem to be going to great lengths to reassure you that this is not going to be used to sell you cough syrup or something like that.
S6: The fine print actually says that they won’t join data that you put into the portal with your data from other Google services, like Gmail, like Google Docs, like Google Search, without your explicit permission. Imagine your mental state when you’re trying to seek out testing and you think you might be positive for the virus. You log on to this portal, you want to get tested. That’s what you’re thinking about. You’re not thinking about your privacy. You’re not thinking about the data that might be collected by this company and how it might be used.
S12: I guess I don’t really see how that represents a step up compared to, say, CVS knowing what type of mental health medication somebody picks up a prescription for every month. I mean, I guess from a regular consumer standpoint, the feeling might be, all these companies already have my information anyway. So what’s one more company knowing my coronavirus status?
S6: Well, here’s one difference. So if you go to CVS or Walgreens and you fill a prescription, that pharmacist has to comply with HIPAA. And quickly, can you just explain what HIPAA is? Yeah, HIPAA is the primary federal law in the United States that provides protection for people’s health information. It does a lot of other things, but at least in this context, what’s most interesting to us about it is that it protects the privacy of your health information. Facebook, Google, Verily: they are not health care providers.
S2: To be clear, Verily says it will not join your COVID data with your email, search, and maps data. But Mason says you should still be worried about Big Tech starting to play doctor.
S6: Google has a very strong interest in getting into the health care sector, and pretty much all the leading tech companies are interested in doing this: Facebook, Amazon, and Google. And Google has really been the most successful out of all the leading tech platforms at getting access to medical records. It’s formed numerous partnerships with hospitals and health care systems in about 25 different states, and it reportedly has access to something like 50 million individual medical records.
S13: These records he’s referring to, they’re part of something called Project Nightingale. In 2018, Google partnered with a chain of Catholic hospitals, offering to crunch patient health records for free as the software company built a database of conditions, diagnoses, and prescriptions. And with that information, ideas for new treatments. When The Wall Street Journal broke the news of the project last year, people were upset, but Mason felt they were missing the point.
S10: A lot of people were asking the wrong questions about Project Nightingale. They were focused on whether or not Google was complying with HIPAA, because there was some evidence that employees within the company were getting access to people’s medical records. And I was pointing out that while that is concerning, that’s not really what people should be focusing on. People should be asking what exactly Google is going to do with not only the information in the medical records, but what it learns from the medical records and from analyzing them. So it’s not necessarily a huge concern that Google has access to medical records. Like you pointed out before, a lot of private companies have access to medical information. What’s concerning is what Google might do with it. Alphabet is at least a trillion-dollar company, one of the wealthiest, most powerful companies in the world. And so the Verily portal could be a very useful public health tool. It could be very helpful, there’s no doubt. But it could also serve as yet another stream of data that is plugged into Google and its sister companies and Alphabet, and used to create an unparalleled cache of data and knowledge that remains proprietary and is used to draw inferences about people who are using products in a context that has absolutely nothing to do with health care.
S2: In other words, you think you told them one thing, but they have a whole file on you. It sounded pretty far-fetched to me when Mason said it. But it turns out the big tech companies are already using their data to generate health advice. For example, there have been a handful of stories about how Apple Watches detected users’ heart problems. And one company is ready to save your life with the information you’ve already provided them.
S6: Facebook is very, very involved in public health already. It has been for a couple of years.
S14: And one way that it tries to protect public health is by predicting suicide. And so Facebook has artificial intelligence that scans every piece of content that’s posted to the site. So everything you post to Facebook, whether it’s a video or a status update or even a private message between you and a family member or friend, is analyzed by Facebook’s AI, and the AI calculates a risk score for suicide for everyone, for every single person. The last time I checked, it was operating in six different languages around the world. So these are the major languages, like English, Mandarin, Spanish. And so every single piece of content is analyzed by the AI.
S12: So Facebook has a suicide likelihood score for me and you?
S14: Well, it’s not for you per se. It’s for each piece of content that you provide to the site. And so, you know, that’s how they describe it. They would tell you that they’re not calculating it for you; it’s for your content. If the score is high enough, that piece of content will be referred to Facebook’s human content moderators, and they will make a judgment call.
S10: They might call the police and send them to your house to perform what’s called a wellness check.
S14: And so, of course, at that point, they identify who you are, as long as this has happened.
S15: Have people gotten the police called on them because of their Facebook content? This has happened over 3,000 times around the world. Wow.
S9: So I was going to ask you, you know, when you were talking about Verily, I was thinking, well, health care in this country is all private anyway. The doctors are private. The pharmacies are, you know, publicly traded companies. Even the insurance companies are private as well. And so at first glance, I thought there was nothing new about one more private entity getting involved in health care and directing coronavirus testing. But I guess what you’re saying is that this isn’t just some startup. This is linked in to a whole world of data. And what you’re worried about is that the barrier that they say they’ve erected between those two data sets might fall.
S10: Yeah, this is something completely new. We’re constantly being recorded and analyzed by smartphones, digital voice assistants like Amazon Alexa, our laptops, wearables like Fitbits and Apple Watches. All of these pieces of technology are constantly collecting data from us. I call them digital traces. Some people call it digital exhaust. And tech companies collect those digital traces. They’re very valuable; they’re a raw material for further analysis with artificial intelligence. And they transform those digital traces into sensitive health information. And that’s what Facebook is doing when it’s calculating a suicide risk score for each piece of content. It’s taking information that seemingly may have nothing to do with suicide at all, analyzing it with machine learning, a certain type of artificial intelligence, and transforming that non-health-related data into suicide information.
S16: What they would say, right, is that we are making enormous public health advances and we’re making possible these kinds of predictive technologies. If even one of those 3,000 police visits prompted by Facebook content results in somebody getting the help they need and not taking their own life, doesn’t that make it all worthwhile?
S6: Yes. You’re hitting on something really important. When tech companies and even when some physicians talk about these types of programs, they emphasize the benefits. And there is a real lack of consideration for the risks. So you’ll never hear Facebook talking about the risks of these systems, but there are very real risks.
S12: So it sounds like two new frontiers here. One is that we’ve got these companies operating health services outside of the sort of traditional regulatory sphere that’s governed, you know, the doctors and the hospitals, the Hippocratic oath and HIPAA and all this stuff. And the other is that we’ve got these companies that are collecting this data and doing it in a way that combines it with all this other data.
S14: Yeah. And you’ve brought up some good points, I think. So when you go to see your doctor, there are many different safeguards in place to protect your privacy. It’s not just HIPAA. Like you said, the health care provider takes a Hippocratic oath to protect your privacy.
S10: They also owe you certain duties. They have certain fiduciary duties towards you: the duty of confidentiality, the duty of loyalty, and the duty of care. Facebook, Google, Verily: they are not health care providers. So not only does HIPAA not apply, but they haven’t taken a Hippocratic oath. They don’t have fiduciary duties that they owe toward you. One solution has been proposed by a couple of law professors: Jack Balkin at Yale and Jonathan Zittrain at Harvard had the idea that we should impose similar types of duties onto companies that collect large volumes of data from consumers. So imagine if Verily were considered a digital information fiduciary. In that case, it would owe you a duty of loyalty, so it would not be able to use your information in a way that is harmful to you.
S17: So, for someone who is concerned about this and who has not yet arrived at the stage of thinking, well, all my data is out there anyway, you know, what is the best practice here? Is there no tech company that is trying to distinguish itself based on the idea that it will keep your health information safe?
S18: Well, they all say they’ll keep your information safe, right? And we’ve learned time and time again, through many, many different privacy debacles over the past few years, that that’s not the case.
S16: Mason, thanks so much for coming on. Thank you. My pleasure.
S19: Mason Marks is a law professor at Gonzaga University School of Law and an affiliated fellow at Yale Law School’s Information Society Project. That’s it for today. What Next: TBD is produced by Ethan Brooks, and I’m Henry Grabar. TBD is part of the larger What Next family and also part of Future Tense, a partnership of Slate, Arizona State University, and New America. Lizzie will be back next week, and Mary will be back in your feed on Monday. Thanks for listening.