Google-Apple contact-tracing partnership could help us create a shared sensibility about privacy
On April 10, when Apple and Google announced an unprecedented partnership to build a contact-tracing protocol, some privacy advocates reacted with alarm and skepticism. “I don’t know who needs to hear this, but contact tracing phone apps are not the answer,” tweeted Eva Galperin, the director of cybersecurity at the Electronic Frontier Foundation. Members of the world’s digital privacy community are right to be cautious, but they shouldn’t dismiss the idea out of hand.
To reopen large parts of the economy without effective therapeutics or a vaccine, we need a way for public health authorities to rapidly determine whom people who test positive for Covid-19 might have come into contact with.
So far, we don’t know how well the government-engineered contact-tracing programs that Israel, China, and Singapore have built are working. (Singapore’s version doesn’t catch every infection.) Indeed, Google and Apple decided to tackle the problem because the testing world was a wild west; prominent institutions asked the two companies to devise a solution. This week, the U.S. government dithered over whether to allow the Justice Department to obtain internet metadata without a warrant. Trust in government is not at an apogee. The Apple/Google contact-tracing partnership is not only our best bet; if done right, it can create a model for privacy protection worth emulating for infectious disease surveillance in the future.
In early May, the companies provided application programming interfaces to public health authorities. These will allow the people who make community-based public health decisions to build applications on top of the new contact-tracing protocol. (Switzerland launched the first such app this week.)
But this piecemeal approach won’t capture nearly enough data to trace at scale. That’s why Apple has updated its iOS with a built-in version of a tracing interface that users will be able to toggle on or off. Google’s latest Android operating system has the same feature.
Users will need to download an app, and then opt in, to participate. Phones with Google and Apple’s protocol enabled would use Bluetooth to broadcast and collect signals from other phones that also have contact tracing enabled. According to the working papers released by the companies, the protocol relies on several layers of security to protect users’ identities and to give them the authority and responsibility to share data. The signals correlate to proximity, not location, so users won’t have their movements tracked.
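The identity protection described above rests on rotating identifiers: each phone broadcasts short-lived, random-looking IDs derived from a secret key it generates on-device, so a passive observer cannot link broadcasts back to a person. Here is a deliberately simplified, standard-library-only sketch of that idea. The real protocol uses HKDF and AES as laid out in the companies’ published cryptography specification; the HMAC construction, function names, and 10-minute interval below are illustrative assumptions, not the spec.

```python
import hmac
import hashlib
import os

def new_daily_key() -> bytes:
    """A fresh random secret, generated on-device each day."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth.

    `interval` counts 10-minute windows since midnight. The ID
    changes every window, so observers cannot link broadcasts
    from the same phone across windows without the daily key.
    """
    msg = interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

key = new_daily_key()
ids = [rolling_id(key, i) for i in range(3)]
assert len(set(ids)) == 3            # identifiers rotate every window
assert rolling_id(key, 0) == ids[0]  # but are reproducible from the key
```

The point of the construction is asymmetry: anyone holding the daily key can regenerate every identifier it produced, while anyone without it sees only unlinkable noise.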
If you test positive for the virus, you could choose whether to let the app know. If you choose “yes,” the app would alert the public health authority or the third-party developer administering it. The servers holding this sensitive patient data would be wiped clean periodically, and when the service is no longer needed, Apple and Google would be able to turn it off.
It’s important to note here that, in both phases of the program, neither Apple nor Google would be able to identify you as someone who has tested positive for Covid-19. Only your health agency would — and you would have given them permission. At this stage, the health authorities would validate the diagnosis using their own standards, and then be able to quickly notify other people. Under this approach, the public benefits because Covid-19 hot spots could be more quickly identified and mitigated without unnecessary and intrusive surveillance.
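Concretely, the notification step can work without anyone’s identity or location ever leaving their device: a diagnosed user, with the health authority’s validation, publishes only the secret daily keys their phone used; every other phone downloads those keys, re-derives the short-lived broadcast identifiers, and checks them against the identifiers it overheard locally. The following self-contained sketch illustrates that matching logic under simplified assumptions (an HMAC-based identifier stands in for the spec’s actual HKDF/AES derivation, and all names here are hypothetical):

```python
import hmac
import hashlib
import os

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    # Simplified stand-in for the protocol's derived identifiers:
    # a short-lived ID computed from a secret daily key.
    msg = interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

def exposure_check(published_keys, overheard_ids, intervals_per_day=144):
    """Run locally on every phone after downloading the daily keys
    that diagnosed (and health-authority-validated) users chose to
    publish. No identity or location is needed for the match."""
    overheard = set(overheard_ids)
    for key in published_keys:
        for interval in range(intervals_per_day):
            if rolling_id(key, interval) in overheard:
                return True   # this phone was near a diagnosed person
    return False

# A phone that overheard one broadcast from a later-diagnosed user:
sick_key = os.urandom(16)
my_log = [rolling_id(sick_key, 42), os.urandom(16)]
assert exposure_check([sick_key], my_log) is True
assert exposure_check([os.urandom(16)], my_log) is False
```

Note the design choice this sketch captures: the server distributes only keys from confirmed cases, and the matching happens on each handset, which is why neither Apple nor Google nor the server operator learns who was exposed.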
Users also benefit, because they get to decide what to share.
Indeed, the only way the system can work is if a large majority of people decide to opt in. This will be a hard pull, and it won’t happen overnight. If the software turns out to be glitchy or hard to use, or if it can easily be appropriated for malicious purposes, people will rightly be skeptical. The choice requires users to decide for themselves whether or not they should trust the protections that are built into it. This is a psychological gamble because it assumes that our sense of collective responsibility will persuade us to forgo a measure of personal privacy.
But if there’s buy-in from local health authorities and ample social pressure on top of that, it would meaningfully scale efforts that, to this point, require lots of humans making lots of phone calls to accomplish. (Santa Clara County in California, which shut down early and contained its spread, has been able to hire only a fraction of the contact tracers it needs.)
The apps can’t reach everyone, because some older phones won’t support the technology and because many people don’t have phones at all. There are age, class, racial, and geographical divides here, inequalities that the companies could help remedy by distributing free devices to underserved communities. Given the disparities in Covid-19 morbidity outcomes, this step should be an essential part of any contact-tracing system.
We should use this opportunity to regain some agency over our privacy. We’d like to keep our health status private, but we can’t in all instances, because we may endanger our neighbors if we carry Covid-19. For years, civil liberties advocates have pressured the tech industry to adopt the standards of transparency and privacy that Apple and Google have now chosen to use in a project that may be essential to the fate of a functioning society.
Google, Amazon, and Facebook, and, to a much lesser extent, Twitter, have created a market that consigns human agency to an automatic click on a privacy statement or a complicated tour through buried settings. Their business model has been called parasitic: basic life functions now depend on our handing over our personal data to these and other companies, often without our explicit consent or knowledge. Shoshana Zuboff coined the term “surveillance capitalism” to describe this inscrutable arrangement. (It is with only slight irony that I’ve linked to her author page on Amazon.) The pandemic gives all of us an opportunity to see this phenomenon clearly and to make good choices about what comes next. The more aware we are of this panopticon, the better we can design institutions that don’t rely on the free exchange of data to survive. But the pandemic also presents a limiting case: we’d like to keep our health records private, yet we cannot in all instances, because we endanger our vulnerable neighbors if we carry an infectious disease without knowing it. The virus has ripped human agency away from us, and Google and Apple can help us reclaim it.
Note: like many people, I’ve applied to jobs at all the platforms mentioned here and I probably will in the future. Please factor that in when evaluating my opinion.