What Went Wrong With Contact Tracing Apps

Listen to this episode

S1: Gus Hosain has dedicated his career to studying privacy and pushing tech companies to protect users. He’s the executive director of Privacy International in London. If you told him in January that he’d recommend voluntarily giving up some private information to governments, he might not have believed you. But when covid-19 began to spread across the globe, he became obsessed with contact tracing. Contact tracing got a lot of attention and energy in the early days of the pandemic, but since then it’s taken a bit of a backseat for a number of reasons that we’ll dig into. Now the focus is on face masks, social distancing and quarantines. But Gus is still thinking about contact tracing.

S2: It’s probably one of the most important aspects of a pandemic response, the fundamental one being testing, the ability to understand whether or not somebody does have the virus. But once you’ve done that, the necessary next step is to identify everybody they’ve interacted with.

S1: Contact tracing isn’t new by any stretch of the imagination. It’s been a pillar of infectious disease response for decades. But back in March, there was a lot of debate about how we should trace contacts. In one camp, the old tried and true human method: pick up the phone and make some calls.

S2: Whenever there’s any type of outbreak, you should have the human infrastructure being the people who are trained and the call centers who are enabled and ready to respond to an outbreak. But whether it’s because of austerity or bad government planning, very few governments anywhere had contact tracing from a human perspective enabled. So that’s to an extent why governments then instead rushed to app based contact tracing.

S1: Humans make mistakes. Humans forget to report. A software-based solution might take the guesswork out of the problem. App-based tracing was supposed to track your exposure and tell you when you need to get a test or isolate. Countries around the world all went to work developing apps. Some built up their human tracing capabilities at the same time; others didn’t. But perhaps more than anyone else, it was the UK, where Gus lives, that trumpeted its ability to make an app that would help contain the virus. And just a few months after they started work on the app, they abandoned it.

S3: Now, there’s been a major change in the UK government’s approach to tracing people using a smartphone app developed by the NHS. The government’s abandoned its bespoke contact tracing app in favour of technology from Apple and Google.

S4: Today on the show, we’ll dig into the UK’s saga with contact tracing technology. Just as the nation is poised to release the new and improved version of its app. Critics call the UK’s efforts a fiasco and a disaster. We’ll find out why Britain’s failure became the world’s problem. I’m Celeste Headlee, filling in for Lizzie O’Leary. And you’re listening to What Next TBD, a show about technology, power and how the future will be determined. Stay with us.

S1: Back in March, when it was clear that covid-19 was spreading rapidly across Europe, the UK had a problem much like the U.S. their ability to test for the virus was extremely limited. So while they raced to increase their testing capacity, they put their faith in a technological solution.

S2: When they embarked on this, it was going to be the primary government response to this entire pandemic. While other governments, such as Germany, first deployed testing capabilities and then got their hospitals ready for the flow of patients, the UK prioritized the development of an app to do all things. That is, the app wasn’t just to do contact tracing. The app was there to detect whether or not you were at risk. And so it wanted to use as much data as it could in order to compute, based on you not feeling well and based on your interactions with others, whether or not you had the virus.

S5: So it was supposed to be this extraordinary intelligence exercise to compensate for the fact that they didn’t have testing.

S1: The NHS spent months developing their test and trace app, and in early May, they launched a limited trial of the newly minted tech on the Isle of Wight. That’s an island on the south coast of England. The app worked, or was supposed to work, by enabling smartphones to communicate with each other via Bluetooth. Users would enter their health data, which would then be shared with a centralized server. If they had covid symptoms, other users they’d come into close contact with would be alerted and directed to self-isolate. Gus was one of 11 people on an ethics advisory board for the project, and so he got an early look at the app. It didn’t take long for all of them to find problems.
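The centralized flow described here can be sketched in a few lines of Python. This is a toy illustration of the general design, not the NHS code; every name below is invented, and the real protocol involved far more detail.

```python
import secrets
from collections import defaultdict

class CentralServer:
    """Toy centralized model: the server stores every user's reported
    contact log, so it can compute who to alert (heavily simplified)."""
    def __init__(self):
        self.contacts = defaultdict(set)  # user_id -> ids heard over Bluetooth

    def upload(self, user_id, seen_ids, has_symptoms):
        """User uploads the ids their phone heard, plus a symptom flag."""
        self.contacts[user_id] |= set(seen_ids)
        if has_symptoms:
            # The server knows the whole contact graph, so it can
            # directly pick out everyone this user was near.
            return sorted(self.contacts[user_id])
        return []

server = CentralServer()
alice, bob = secrets.token_hex(4), secrets.token_hex(4)
# Alice's phone heard Bob's id over Bluetooth; later she reports symptoms.
to_alert = server.upload(alice, [bob], has_symptoms=True)
assert to_alert == [bob]  # the server, not the phone, decides who is warned
```

The key property (and the privacy objection) is visible in the sketch: the central server ends up holding everyone’s contact graph.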

S5: When we audited the app, it showed that the moment you opened it, it was contacting Microsoft and Google. And that’s forgivable to some degree, because, to be fair to the people working on this app, they were the front line of the pandemic response in the UK. They were working 24-hour days trying to create an app, and whatever code they could grab from wherever, they were just putting it in to create an app that could be deployed. But that’s different from creating an app that is privacy-friendly and secure and trustworthy. But the bigger finding in our test, which we didn’t talk about that much, was that it wasn’t working. We were testing whether or not it was detecting Bluetooth connections with other devices around, and nothing we could do could get it to detect a phone that was right beside it. And when we reported this to the developers, their response was, oh no, no, no, we’ve got this all sorted. And so this is where we get to the point where, as a representative of the privacy community, I wasn’t often raising privacy issues about the app. I was raising questions as to whether or not it would work, and second, who it would work for. Could it be deployed on every mobile phone or only on the most recent mobile phones, the ones that cost a thousand dollars plus? Can it be used by everybody in all walks of life or just by the people who are in professional services? I kept on demanding the actual data. Have you tested it? Can you show me the devices it actually works on? Can you show me the response rates? And they kept on saying, oh no, it works fine, it works fine, it works fine. Until finally, in June, they had to admit that it was only detecting iPhones four percent of the time, which is a disastrous result.

S6: On June 18th, they announced they were not going to launch the app. What happened as a result? What was the fallout?

S5: The fallout from the failure of the app was that the government had invested so much political capital in finally being competent on one thing, because it had failed on protective gear, it had failed on deploying testing and it had failed on care homes. And the numbers of deaths were rising and the prime minister had even been hospitalized. And so they needed a win. And in that period of time, they were bigging up this app. They were making this app sound like it was the solution to everything, because it had to be the solution to everything, because they had no other solutions elsewhere. And so the political capital and the public trust capital that they were investing in this app all got wasted. And I think even the public were hoping that tech would be the solution. And everybody woke up the next morning with a horrible hangover from this entire four months of wasted energy and opportunity.

S6: A number of apps have been launched. Germany has touted its version, and the state of Virginia just launched its app recently. Have you seen an app that you could give a rubber stamp of approval to?

S5: So most of the apps you just listed off have shifted over to using the Google Apple implementation, which is, as far as we can see into it, a very good design from the privacy and security perspective. In theory, people should feel safe as they download those apps and use those apps.

S1: The Google Apple tracing tool went public on May 20th. Like the UK’s app, it relies on Bluetooth technology. Unlike the UK’s system, it doesn’t collect data and share it to a central database. It’s useful for individuals but does not supply health information back to governments.
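The decentralized design differs mainly in where the matching happens. The sketch below illustrates the idea in Python: phones broadcast short-lived identifiers derived from a private daily key, an infected user publishes only their daily keys, and every other phone re-derives the identifiers and checks for overlap locally. The key schedule here is a made-up stand-in for the real Apple/Google cryptography, which is specified differently.

```python
import hashlib
import hmac
import os

def rolling_ids(daily_key: bytes, intervals: int = 4) -> list[bytes]:
    """Derive short-lived broadcast identifiers from a private daily key.
    (Simplified stand-in for the real exposure-notification key schedule.)"""
    return [hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals)]

# Each phone holds its own daily key and remembers ids it heard nearby.
my_key = os.urandom(16)
stranger_key = os.urandom(16)
heard_nearby = {rolling_ids(stranger_key)[2]}  # we passed this person once

# Later the stranger tests positive and publishes only their daily key.
published_keys = [stranger_key]

# Matching happens on-device: re-derive ids from published keys, intersect.
exposed = any(rid in heard_nearby
              for key in published_keys
              for rid in rolling_ids(key))
assert exposed  # our phone alerts us; no server ever saw our contact list
```

This is why, as discussed below, health agencies cannot extract statistics from the system: the contact lists never leave the phones.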

S6: There are complaints about the Apple and Google software. I’m thinking of Switzerland, and the Swiss Health Department has really complained about SwissCovid, the name of their app. I want to read you the statement from the Swiss Health Department spokesman, who said: we don’t know and have no way of finding out the number of people warned by the app or any false positives or false negatives. Essentially, they’re complaining that there’s no way to get good statistics that they could use for public health purposes. What do you make of that?

S5: The Swiss response is entirely right. They can’t learn about the nature of the disease. They can’t learn about transmission and all those things that would be helpful. The Apple Google model currently precludes any of that kind of sharing of data with the public health agency. Apple and Google have made some decisions saying if governments want to collect data, they can collect data, but they’re not going to collect it via a covid contact tracing app on our operating systems. And I think it’s their right to say as such. But it’s also given rise to concern that Apple and Google have far too much power to make these decisions.

S6: And I wonder what you make of that concern as both a privacy expert and as a human?

S5: I’ve struggled a lot with this question, but I’ve seen the way that this process takes place across the world. I’ve seen the abuses that have arisen around the pandemic response, but generally around governments who just can’t stop themselves when it comes to the ability to get access to and exploit data about their citizens, and around how they treat dissidents and marginalised people and target people generally. So Apple and Google made a call that public trust was more important. But also, very importantly, they are not calling it a contact tracing app. They’re calling it exposure notification. That might sound like semantics, but it’s a really important differentiation. It’s not designed to replace contact tracing, which is that centralized, government-administered initiative. It is just a helpful tool, and that’s what governments have had to wake up to.

S2: And this was what particularly the UK government had to wake up to, because they wanted the app to replace all human processes. When they realized that their app wasn’t going to work because it was fanciful, they then ramped up the human processes.

S6: It’s difficult, though, and I wanted to dig a little deeper into some of the privacy implications, because contact tracing is, by its very nature, invasive. And many countries, not just the U.S. and the UK, have found people are not always willing to participate, even with the human contact tracers. You’re seeing high levels of mistrust, especially among people of color. Some people even say that they’re afraid, that they don’t want to be blamed, meaning they don’t want anyone contacting their friends and saying this person has covid. How do you get past just the inherent privacy problems involved in contact tracing?

S2: You summarized it perfectly, in the sense that it is a question of trust more than anything else. In order to enhance that trust, or to be deserving of that trust, you first have to have a coherent response to a pandemic, and then you can do the expected things around privacy and security. But if, as a government, you have failed repeatedly in history to take care of people and have an effective public health system, you can’t expect people to immediately respond: yes, OK, now I trust you with all my data. People are incredibly nervous about coming forward with this, and that’s where an app was a grand opportunity. And I remember speaking to German public health officials who, too, were seeking a UK-like solution, where they were centralizing all the data and going to use the data for research.

S5: But when Google and Apple came up with their solution, the Germans shifted almost instantaneously towards this more privacy-friendly, decentralized service. And I asked them why, and they said it’s because of trust. We want as many people as possible to download this app, and whatever we can do to enhance trust is a good thing. And every government that we’ve looked at up close, apart from the German example, has failed to some regard on this. Like when the UK decided to finally move to human contact tracing, they then said, actually, we’re going to keep this data for 20 years. And you say why? And it’s just government-think, this copying and pasting of text from other places. But they don’t understand that they’re dealing not just with an emergency; they’re dealing with a public health trust crisis as well.

S6: This is a dual layer of trust, though, because not only have governments abused the private information of citizens in many nations, but so have software companies. If these contact tracing apps are built on Google software and people have already been burned because they didn’t realize Google Maps was tracing all of their movements, for example, or Apple, the same thing, that also is a layer of trust that has been broken.

S5: This pandemic possibly happened at the worst possible time in modern history, where to one side you have the misinformation and disinformation pandemic that’s going on, and you have, as you rightly identify, the crisis of public trust in companies and governments. And it’s all happening at the same time. Now, that’s a negative way of looking at it. But another way of looking at it is that if this had happened 10 years ago, with the way that the Googles and the Facebooks and all these other companies were operating freely and getting away with it left, right and center, they would have just been trying to centralize that data and advertise to you on the back of it. Fortunately, we’ve come so far in the last five years particularly, I’d say, with the Cambridge Analytica failure and countless failures from all the other companies, that these companies are now trying, to some degree, to be better, or they recognize that they can’t be caught abusing public trust again at this moment of crisis. And so I think Apple and Google actually listened to the better angels and tried to do something interesting.

S1: Now, after a failed first attempt to create a tracing app, the UK is trying again. This time they’re using the Google Apple technology. Gus is more hopeful about this version of the app, but still cautious.

S5: To be frank, when the UK finally gets its act together and releases its own app, after we’ve done our analysis on it, I’m looking forward to using it, because it is a very helpful tool if done right. But we have to watch these apps over a longer period of time, because the problem with any technical solution is that a simple change of code can result in a completely different result.

S2: One of the issues that arose in the UK with the first version of their app is that they accidentally, I think, told us that they eventually wanted to use the app for quarantine enforcement, so that it would notify whether or not you were quarantining as required. And that’s a very different app. That’s an app that went from helping you navigate a public health emergency to essentially imprisoning you through an app. So watching these apps very closely, watching the government agencies that are deploying these apps, watching the software houses that develop these apps, and making sure we keep a constant eye on Google and Apple to make sure they don’t make any further changes: that’s what’s needed in order to maintain the trust that we’re willing to give to these companies and these agencies in this moment of crisis.

S6: Gus, first of all, thank you for keeping your eye on this and also thank you so much for talking with us.

S7: Thank you.

S8: Gus Hosain is the executive director of Privacy International. That’s it for today’s show. TBD is produced by Ethan Brooks and edited by Allison Benedikt and Tori Bosch. TBD is part of the larger What Next family. TBD is also part of Future Tense, a partnership of Slate, Arizona State University and New America. I’m Celeste Headlee. Thanks so much for listening and have a great weekend. A new episode of What Next will be in your feed on Monday.