The COVID-19 pandemic has spread like wildfire across the globe. Very few countries have managed to keep it well-controlled, and one of the key tools that several such countries use is contact tracing. More specifically, whenever an individual is diagnosed with the coronavirus, every person who may have been near that infected individual during the period in which they were contagious is contacted and told to self-quarantine for two weeks. In the early days of the virus, when there were only a few cases, contact tracing could be done manually. With hundreds to thousands of cases surfacing in some cities, contact tracing has become much more difficult.
Countries have been employing a variety of means to enable contact tracing. In Israel, legislation was passed to allow the government to track the mobile-phone data of people with suspected infection. In South Korea, the government has maintained a public database of known patients, including information about their age, gender, occupation, and travel routes. In Taiwan, medical institutions were given access to patients’ travel histories, and authorities track phone location data for anyone under quarantine. And on March 20, 2020, Singapore released an app that tracks via Bluetooth when two app users have been in close proximity; when a person reports that they have contracted the coronavirus, anyone who was logged to be near them is automatically warned to self-quarantine.
Solutions that have worked for some countries may not work well in other countries with different societal norms. We believe that in the United States, in particular, the aforementioned measures are unlikely to be widely adopted. On the legal side, publicly revealing patients’ protected health information (PHI) is a violation of the federal HIPAA Privacy Rule, and the Fourth Amendment bars the government from requesting phone data without cause. Some of these norms may be suspended during times of crisis—HIPAA has recently been relaxed via enforcement discretion during the crisis to allow for telemedicine, and a public health emergency could well be argued to be a valid cause. However, many Americans are wary of sharing location and/or contact data with tech companies or the government, and any privacy concerns could slow adoption of the system.
Singapore’s approach of an app, which gives individuals more control over the process, is perhaps the most promising solution for the United States. However, while Singapore’s TraceTogether app protects the privacy of users from each other, it has serious privacy concerns with respect to the government’s access to the data. In this document, we discuss these privacy issues in more detail and introduce approaches for building a contact tracing application with enhanced privacy guarantees, as well as strategies for encouraging rapid and widespread adoption of this system. We do not make explicit recommendations about how one should build a privacy-preserving contact tracing app, as any design and implementation should first be carefully vetted by security, privacy, legal, ethics, and public health experts. However, we hope to show that there exist options for preserving several different notions of user privacy while still fully serving public health aims through contact tracing apps.
2 Singapore’s TraceTogether App
On March 20, 2020, the Singaporean Ministry of Health released the TraceTogether app for Android and iOS. It operates by exchanging tokens between nearby phones via a Bluetooth connection. The tokens are also sent to a central server. These tokens are time-varying random strings, associated with an individual for some amount of time before they are refreshed. Should an individual be diagnosed with COVID-19, health officials will ask them to release their data on the app, which includes a list of all the tokens the app has received from nearby phones. (While the health officials merely ‘ask’, it is a crime in Singapore not to assist the Ministry of Health in mapping one’s movements, so ‘ask’ is a bit of a misnomer.) Because the government keeps a database linking tokens to phone numbers and identities, it can resolve this list of tokens to the users who may have been exposed.
By using time-varying tokens, the app does keep the users private from each other. A user has no way of knowing who the tokens stored in their app belong to, except by linking them to the time the token was received. However, the app provides little to no privacy for infected individuals; after an infected individual is compelled to release their data, the Singaporean government can build a list of all the other people they have been in contact with. We will formalize these several notions of privacy in Section 3.
3 Desirable Notions of Privacy
Here, we discuss three notions of privacy that are relevant to our analysis of contact-tracing systems: (1) privacy from snoopers, (2) privacy from contacts, and (3) privacy from the authorities. Note that in this document, we do not rigorously define what it means for information to be private, as this is a topic better left for future work; some popular definitions include information-theoretic privacy, k-anonymity, and differential privacy. Furthermore, we discuss only these three notions of privacy to illustrate some of the shortcomings of direct contact-tracing systems. Other recent work has presented a useful taxonomy of the risks and challenges of contact tracing apps.
For any contact tracing app that achieves the aim of telling individuals that they might have been exposed to the virus, there is clearly some amount of information that has to be revealed. Even if the only information provided is a binary yes/no to exposure, a simple linkage attack can be performed: if the individual was near only one person in the last two weeks, then there will be an obvious inference about that person’s infection status. The goal is of course to reduce the amount of information that can be inferred by each of the three parties (snoopers, contacts, and the authorities) while still achieving the public health goal of informing people of potential exposures to help slow the spread of the disease.
Of note, here we use a semi-honest model for privacy, where we do not consider the possibility of malicious actors polluting the database or sending malformed queries, but instead analyze the privacy loss from the information revealed to each party. A nefarious actor could, for example, falsely claim to be infected to spread panic; this is not a privacy violation, though we do consider this further in the Discussion. Alternatively, when a server exposes a public API, queries can be crafted to reveal more information than intended by the system design, which is indeed a privacy violation. We leave a more thorough analysis of safeguards for the malicious model to future work.
3.1 Privacy from Snoopers
Consider the most naïve system for contact tracing, which no reasonable privacy-conscious society would ever use, where the app simply broadcasts the name and phone number of the phone’s owner, and nearby phones log this information. Then, upon diagnosis of COVID-19, the government publishes a public list of those infected, which the app then checks against its list of known recent contacts. This is clearly problematic as a nefarious passive actor (a ‘snooper’) could track the identities of people walking past them on the street.
A slightly more reasonable system would assign a unique user ID to each individual, which is broadcast instead. This does not have quite as many immediate security implications, but a nefarious actor need only link each ID to a user before running into the same problem; this is known as a ‘linkage attack.’ Given how easy and common linkage attacks are, this approach also provides insufficient privacy for users [26; 40].
The Singaporean app TraceTogether does better, in that it instead broadcasts random time-varying tokens as temporary IDs. Because these tokens are random and change over time, someone scanning the tokens while walking down the street will not be able to track specific users across different time points, as their tokens are constantly refreshed. Note that the length of time before refreshing a token is an important parameter of the system (too infrequent, and users can still be tracked; too frequent, and the number of tokens the server must store could be huge), but with a reasonable refresh rate, users are largely protected against attacks by snoopers in public spaces.
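As a concrete sketch, time-varying tokens can be generated as fresh random strings, one per refresh interval. The interval length and token size below are illustrative assumptions, not values from any deployed app:

```python
# Sketch of time-varying token generation. The refresh interval and token
# length here are illustrative assumptions, not values from a real app.
import secrets

REFRESH_MINUTES = 15  # assumed refresh interval; a real system must tune this

def token_for_interval():
    """Generate a fresh 128-bit random token for one refresh window."""
    return secrets.token_hex(16)

# Tokens a user would broadcast over four consecutive refresh windows:
tokens = [token_for_interval() for _ in range(4)]

# Because each window's token is independent and random, a snooper cannot
# link broadcasts across windows.
assert len(set(tokens)) == 4  # distinct with overwhelming probability
```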
3.2 Privacy from Contacts
Here, the term contact is defined as any individual with whom a user has exchanged tokens in the contact tracing app based on some notion of physical proximity. Privacy from contacts is harder to achieve, because the information that needs to be passed along is whether one of the individual’s contacts has been diagnosed with COVID-19, so some information has to be revealed.
The TraceTogether app gives privacy from contacts by instead putting trust in government authorities. When TraceTogether alerts a contact that they have been exposed to COVID-19, the information comes directly from the Singaporean Ministry of Health, and no additional information is shared (to our knowledge) that could identify the individual that was diagnosed. Thus, TraceTogether does protect users’ privacy from each other, except for what can be inferred from a user’s full list of contacts: the only information revealed to a user is a binary exposure indicator, which is arguably the minimum possible information release for the system to be useful.
3.3 Privacy from the Authorities
Protecting the privacy of the users from the authorities, i.e. whoever is administering the app, whether that is a government agency or a large tech company, is also a challenging task. Clearly, in the absence of a fully decentralized peer-to-peer system, any information sharing among phones with the app installed will have to be mediated by some coordinating servers. Without any protective measures (e.g. based on cryptography), the coordinating servers are given an inordinate amount of knowledge.
TraceTogether does not privilege this type of privacy, instead making use of relatively high trust in the government in its design. While it does not deliberately gather more information than necessary to build a contact map—for example, it does not use GPS location information, as Bluetooth is sufficient for finding contacts—it also does not try to hide anything from the Singaporean government. When a user is diagnosed with COVID-19 and gives their list of tokens to the Ministry of Health, the government can retrieve the mobile numbers of all individuals that user has been in contact with. Thus, neither the diagnosed user, nor the exposed contacts, have any privacy from the government.
Furthermore, because the government maintains a database linking together time-varying tokens with mobile numbers, they can also, in theory, track people’s activities without GPS simply by placing Bluetooth receivers in public places. There is no reason to disbelieve the TraceTogether team when they state that they do not attempt to track people’s movements directly; however, the data they have could be employed to do so. Citizens of countries such as the U.S. trust authorities much less than Singaporeans, so the privacy trade-offs that Singaporeans are willing to make may not be the same ones that Americans will accept.
| System | Privacy from snoopers | Privacy from contacts (exposed user) | Privacy from contacts (diagnosed user) | Privacy from authorities (exposed user) | Privacy from authorities (diagnosed user) | Infrastructure requirements |
|---|---|---|---|---|---|---|
| TraceTogether | Yes | Yes | Yes | No. Exposure status and all tokens revealed. | No. Infection status, all tokens, and all contact tokens revealed. | Minimal |
| Polling-based¹ (§4.1) | Yes | Yes | Yes² | Partial. Susceptible to linkage attacks. | Partial. Susceptible to linkage attacks. | Low. Single server. |
| Polling-based with mixing (§4.3) | Yes | Yes | Yes² | Almost private. Mixing tokens from different users protects against linkage attacks. | Almost private. Mixing tokens from different users protects against linkage attacks. | Medium. Multiple servers for mixing. |
| Public database (§4.4) | Yes | Yes | Partial. Info leaked at time of token exchange. | Yes | Partial. Susceptible to linkage attacks. | High communication cost to phones. |
| Private messaging system (§5) | Yes | Yes | Partial. Info leaked at time of token exchange.³ | Yes | Yes | High. Multiple servers performing cryptography. |

¹ Augmenting with random tokens does not improve privacy.

² However, if contacts are malicious and send malformed queries (e.g. a query that includes only a single token), the diagnosed individual has only the same privacy level as in the public database solution. Namely, there is only partial privacy, because information is leaked through knowing the time of token exchange.

³ This information leakage might be fixable using data aggregation based on multi-key homomorphic encryption, but we do not do so here.
4 Privacy-Enhancing Augmentations to the TraceTogether System
Here, we discuss potential approaches to build upon the TraceTogether model to obtain a contact tracing system with differing privacy characteristics for the users. Though important and highly nontrivial, the various technical and engineering challenges behind the exchange of Bluetooth tokens are outside the scope of this document. Our abstraction is that there exists some mechanism for nearby phones to exchange short tokens if the devices come within 6 feet of each other, the estimated radius within which viral transmission poses a considerable risk. We are primarily concerned with the construction of those tokens, and how those tokens can be used to perform contact tracing in a privacy-preserving manner.
First, we formally describe the TraceTogether system. Let Alice and Bob be users of the app, and let Grace be the government server (or other central authority). Alice generates a series of random tokens a_1, a_2, …, one for each time interval, and Bob generates a similar series of tokens b_1, b_2, …, all drawn randomly from some token space T. They also both report their lists of tokens, as well as their phone numbers, to Grace. At a time t, Alice and Bob encounter each other, exchanging a_t and b_t. Alice and Bob keep lists of contact tokens C_A and C_B respectively. These consist of tokens from every person they were exposed to; i.e. b_t ∈ C_A and a_t ∈ C_B because Alice and Bob exchanged tokens at time t. Five days later, Bob is diagnosed with COVID-19, and sends his list of contact tokens C_B, which includes a_t, to Grace. Grace then matches each token in C_B to a phone number, reaches out to those individuals, including Alice, and advises them to quarantine themselves because they may have been exposed to the virus.
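This flow can be simulated in a short sketch. The class names, phone numbers, and token counts below are all illustrative; this is not Singapore’s implementation:

```python
# Toy simulation of the TraceTogether flow: Grace keeps a token-to-phone
# database and resolves a diagnosed user's contact tokens. All names and
# numbers are invented for illustration.
import secrets

class Grace:
    """Central server: maps every registered token to a phone number."""
    def __init__(self):
        self.token_owner = {}

    def register(self, phone, tokens):
        for t in tokens:
            self.token_owner[t] = phone

    def report_diagnosis(self, contact_tokens):
        # Resolve each contact token to a phone number to notify.
        return {self.token_owner[t] for t in contact_tokens
                if t in self.token_owner}

def make_tokens(n):
    return [secrets.token_hex(16) for _ in range(n)]

grace = Grace()
alice_tokens, bob_tokens = make_tokens(5), make_tokens(5)
grace.register("+1-555-0100", alice_tokens)  # Alice
grace.register("+1-555-0199", bob_tokens)    # Bob

# At some time interval, Alice and Bob exchange their current tokens.
alice_contacts, bob_contacts = {bob_tokens[2]}, {alice_tokens[2]}

# Bob is diagnosed and uploads his contact tokens; Grace resolves them.
exposed = grace.report_diagnosis(bob_contacts)
assert exposed == {"+1-555-0100"}  # Grace learns Alice's number directly
```

Note how the final step exposes Alice’s identity to Grace; the augmentations in the following subsections aim to remove exactly this linkage.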
4.1 Partially Anonymizing via Polling
Instead of having Grace reach out to Alice when Bob reports that he has been diagnosed, a more privacy-conscious alternative is for Alice to “poll” Grace on a regular basis. In this setting, Grace maintains the full database, and Alice asks Grace if she has been exposed. This alternative does not require Alice and Bob to send their phone numbers to Grace. In this setting, there are two reporting choices for when Bob wishes to declare his diagnosis of COVID-19. Bob can send his own tokens to Grace, or he can send his contact tokens to Grace. In the former case, Alice needs to send Grace her contact tokens to see if any belong to someone diagnosed with COVID-19. In the latter case, Alice needs to send Grace her own tokens to ask if any of them have been published. Either way, Grace is able to inform Alice that she has been exposed, without revealing Bob’s identity. This presupposes that Alice is honest-but-curious (semi-honest); if Alice is malicious and crafts a malformed query containing only the token she exchanged with Bob, she may be able to reveal Bob’s identity.
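A minimal sketch of the polling variant, in the version where Bob uploads his own tokens on diagnosis and Alice polls with her contact tokens (the server API below is our own illustration):

```python
# Sketch of the polling variant: Bob uploads his *own* tokens on diagnosis,
# and Alice polls with her *contact* tokens; no phone numbers are stored.
import secrets

class PollingServer:
    def __init__(self):
        self.diagnosed_tokens = set()

    def report(self, own_tokens):
        self.diagnosed_tokens.update(own_tokens)

    def poll(self, contact_tokens):
        # Returns only a binary exposure indicator.
        return any(t in self.diagnosed_tokens for t in contact_tokens)

server = PollingServer()
bob_tokens = [secrets.token_hex(16) for _ in range(5)]
alice_contacts = {bob_tokens[2], secrets.token_hex(16)}

server.report(bob_tokens)
assert server.poll(alice_contacts) is True            # Alice learns she was exposed
assert server.poll({secrets.token_hex(16)}) is False  # other users learn nothing
```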
Note that in either version of this system, individuals still have privacy from snoopers and from contacts. However, they additionally gain some amount of privacy from the authorities, as Grace does not have their mobile numbers. Of course, Grace does have some ability to perform linkage attacks. If Bob publishes his own tokens to Grace upon being diagnosed, and Alice queries Grace with all her contact tokens, then Grace can attempt to link those sets of tokens to individuals or geographic areas; further, Grace can also monitor the source of Alice and Bob’s queries (i.e. IP addresses of phones). For example, if Grace has Bluetooth sensors set up in public places, she can then trace Alice and Bob’s geographic movements. That kind of location trace is often sufficient to deanonymize personal identities. Alternatively, the same is true if Bob publishes his contact tokens to Grace and Alice queries Grace with her own tokens. Thus, there is not perfect privacy from the authorities, but still better than in the original TraceTogether system, at the cost of potentially lower privacy for Bob in the malicious model.
4.2 Ineffectiveness of Adding Spurious Tokens for Further Anonymization
To increase privacy from the authorities in the polling-based system, there are a number of techniques that can be used to further hide Alice’s and Bob’s identities. We begin with a simple approach that does not actually work, to build intuition before moving on to more effective approaches. Consider injecting random noise by augmenting the data with artificial tokens. Whenever Alice and Bob send information to Grace (either in the form of a diagnosis report or a query), they can augment their tokens with random ones. Note that some care must be taken in deciding which distribution to draw the random tokens from: not only should the system keep the probability of spurious matches low, but the distributions should also be designed to make inferences by Grace difficult.
For example, assume that Alice and Bob sample their tokens uniformly at random from a space of size N, where N is chosen to be sufficiently large that accidental collisions between individuals’ tokens are unlikely. Suppose Bob sends Grace his own tokens upon being diagnosed, and Alice queries Grace with all her contact tokens. In theory, Bob could augment his own tokens with a set of random tokens drawn uniformly from the same space, and send those to Grace as well. Unfortunately, N was chosen precisely to prevent accidental collisions; this means that the probability that the additional random tokens correspond to tokens broadcast by any real individual is vanishingly small. But then there is actually little to no privacy gained: Grace can simply assume that the entire augmented set of tokens corresponds to Bob, and perform the same linkage analysis that she would with only the correct set of tokens. This does nothing but pollute Grace’s database with extra data, without affording any real privacy gains for Bob. Similarly, Alice cannot obfuscate her exposure through Bob from Grace, because any extra tokens she sends will not change the fact that she has Bob’s token as one of her contacts.
The root of the problem is that Grace has access to the universe of all tokens through user queries, and so can simply filter out all of the random tokens generated. Thus, random noise is ineffective for hiding information from Grace.
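A quick back-of-the-envelope calculation makes the collision argument concrete; the token-space size and counts below are hypothetical but representative:

```python
# Why random padding fails: with tokens drawn uniformly from a space of
# size N, the chance that any of Bob's padding tokens matches a token a
# real user ever broadcast is negligible, so Grace can safely attribute
# the entire augmented upload to Bob. Parameters here are assumptions.
N = 2**128            # assumed token-space size (128-bit random tokens)
real_tokens = 10**9   # generous upper bound on tokens ever broadcast
padding = 1000        # spurious tokens Bob adds to his upload

# Union bound on the probability that any padding token collides with
# a real token:
p_collision = padding * real_tokens / N
assert p_collision < 1e-20  # effectively zero; the padding fools no one
```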
4.3 Enhancing Anonymity by Mixing Different Users’ Tokens
Although introducing spurious random tokens into the system achieves little in terms of privacy, as discussed in the previous subsection, a slight modification of this idea leads to meaningful privacy guarantees. The issue is that Grace has access to the entire universe of tokens, as well as both of the sets of tokens corresponding to Alice and Bob, possibly augmented with random noise. Instead of hiding true tokens with random noise, suppose the system includes a set of honest-but-curious non-colluding “mixing” servers not controlled by Grace that aggregate data before forwarding it on to Grace.
When Bob is diagnosed with COVID-19, he partitions the tokens he wishes to send (depending on the setup of the system, either his own tokens, or those of his contacts) into several groups, and sends each group to one of the mixing servers. The mixing servers then combine Bob’s data with that of other users diagnosed with COVID-19 before forwarding it on to Grace. Similarly, Alice does the same thing for querying, except she also needs to wait on a response from the mixing server for each of the tokens she sends. The linkage problem then becomes much more difficult for Grace, because the valid tokens for each individual have been split up. Similarly, each mixing server only has access to a subset of the tokens corresponding to each individual, making the linkage analysis more difficult for them. Of course, if the mixing servers collude, then the privacy reduces to that of the standard polling-based approach.
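The mixing layer can be sketched as follows; the partitioning scheme and server interface are our own illustration:

```python
# Sketch of the mixing layer: each user splits their tokens across
# non-colluding mixers, and each mixer shuffles a batch of many users'
# tokens together before forwarding it to Grace.
import random
import secrets

class Mixer:
    def __init__(self):
        self.batch = []

    def submit(self, tokens):
        self.batch.extend(tokens)

    def flush_to_grace(self):
        random.shuffle(self.batch)  # shuffle so arrival order reveals nothing
        out, self.batch = self.batch, []
        return out

def partition(tokens, n_groups):
    """Split one user's tokens round-robin into n_groups groups."""
    groups = [[] for _ in range(n_groups)]
    for i, t in enumerate(tokens):
        groups[i % n_groups].append(t)
    return groups

mixers = [Mixer(), Mixer(), Mixer()]
bob = [secrets.token_hex(16) for _ in range(6)]
carol = [secrets.token_hex(16) for _ in range(6)]
for user_tokens in (bob, carol):
    for mixer, group in zip(mixers, partition(user_tokens, 3)):
        mixer.submit(group)

# Grace sees the union of shuffled batches, but each mixer held only a
# third of any one user's tokens, and Grace cannot attribute tokens to
# a single uploader.
grace_view = [t for m in mixers for t in m.flush_to_grace()]
assert sorted(grace_view) == sorted(bob + carol)
```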
Note that this approach can also be simulated without the mixing servers by either Alice or Bob if they have access to a large number of distinct IP addresses. They can simply send their queries and tokens with some time delay from the different IP addresses, preventing Grace from linking all of them together. However, this approach may not be feasible for most users.
4.4 Public Database of Infected Users’ Tokens is Efficient but Less Private
Alternatively, Grace can simply publish the entire database of tokens she receives from infected individuals, including the ones from Bob. If Alice simply downloads the entire database, and locally queries against it, then no information about Alice’s identity is leaked to Grace.
This approach may seem computationally infeasible, especially on mobile devices. In circumstances where the total number of people infected is not very high, this approach works, as evidenced by the South Korean model, though it may fail as the epidemic reaches a peak. However, the computational and transmission cost can be partially ameliorated by batching Grace’s database, so that Alice does not download the entire thing. For example, in the version where Bob sends his own tokens to Grace, Alice can download batches corresponding to her contact tokens. If each batch has e.g. 50 tokens, then Grace does not know which of those 50 tokens Alice came into contact with.
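One simple way to batch the database is by token prefix; the bucket scheme below is our own illustration, not a specific proposal:

```python
# Sketch of batched downloads: Grace buckets published tokens by a short
# prefix; Alice fetches only the bucket matching a contact token and
# matches locally, so Grace learns a bucket, not the exact token.
import secrets
from collections import defaultdict

PREFIX_LEN = 2  # hex chars; controls the bucket-size/privacy trade-off

def publish(diagnosed_tokens):
    """Grace's published database, keyed by token prefix."""
    buckets = defaultdict(set)
    for t in diagnosed_tokens:
        buckets[t[:PREFIX_LEN]].add(t)
    return buckets

bob_tokens = [secrets.token_hex(16) for _ in range(5)]
buckets = publish(bob_tokens)

# Alice checks one contact token by downloading its bucket and matching
# locally on her phone.
contact = bob_tokens[0]
batch = buckets.get(contact[:PREFIX_LEN], set())
assert contact in batch  # Alice learns of her exposure entirely locally
```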
It is worth noting, however, that this approach decreases Bob’s privacy from Alice, because Alice knows when she encountered the token Bob sent. If the token she exchanged with Bob is present in the database, she can narrow down the disease status of the individuals she was in contact with at the time of the token exchange.
5 Privacy from Authorities based on Private Messaging Systems
None of the easy-to-implement augmentation ideas given in Section 4 guarantee full privacy from the authorities. At a cost of more computation, however, we believe that a solution for secure contact tracing can be built using modern cryptographic protocols. In particular, private messaging systems [46; 45; 5] and private set intersection (cardinality) [13; 22; 8; 7] protocols seem especially relevant. The sketch we provide below is based on private messaging systems, though we do not claim this to be an optimal implementation.
We will give the intuition here before going into technical details necessary for an effective implementation. First, we replace the random tokens exchanged by Alice and Bob with random public keys from asymmetric encryption schemes. The matching secret keys are stored locally on each of Alice’s and Bob’s phones. Then, imagine that Grace has established a collection of mailboxes, one for each public key that Alice and Bob exchange. Additionally, we introduce Frank and Fred. Frank forwards messages to/from Fred. Fred forwards messages to/from Grace. They do not tell each other the source of the messages. At fixed time points after Bob’s contact with Alice (up to some number of days), Bob addresses a message to Alice encrypted using the public key Alice gave Bob. Bob gives the message to Frank, who then forwards it on to Grace (through Fred), who puts it in Alice’s mailbox. The content of the message is Bob’s current infection status, and the reason he sends messages at fixed time points is to prevent Frank from figuring out Bob’s infection status from the fact that he is sending messages. Alice checks all of the mailboxes corresponding to her last several days’ worth of broadcasted public keys. In one of the mailboxes, she then receives and decrypts Bob’s message, and learns whether she has been exposed to the virus. Grace cannot decrypt the message Bob sends to Alice because it is protected by asymmetric encryption. Furthermore, to protect Alice’s privacy, she can also access her mailboxes through Frank and Fred, who deliver the messages in Alice’s mailboxes to her without revealing which mailboxes she owns.
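The mailbox flow can be sketched as follows. To keep this stdlib-only and runnable, we substitute a symmetric key that Alice hands Bob at contact time for the public key described above, and use a toy hash-based stream cipher; a real deployment would use an actual asymmetric scheme (e.g. NaCl-style boxes), and all names and message formats here are invented:

```python
# Toy mailbox server (playing Grace). Grace stores ciphertexts she cannot
# read; only the key holder can decrypt. The XOR keystream below is a toy
# stand-in for real encryption, NOT a secure cipher.
import hashlib
import secrets

def keystream_xor(key, data):
    """Toy stream cipher: XOR data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

class MailboxServer:
    def __init__(self):
        self.boxes = {}

    def deposit(self, box_id, ciphertext):
        self.boxes.setdefault(box_id, []).append(ciphertext)

    def fetch(self, box_id):
        return self.boxes.get(box_id, [])

# At contact time Alice hands Bob a mailbox id and (in this toy) a key;
# in the real scheme this would be a public key.
box_id, key = secrets.token_hex(8), secrets.token_bytes(32)
grace = MailboxServer()

# At a fixed interval, Bob deposits his encrypted status via Frank/Fred.
grace.deposit(box_id, keystream_xor(key, b"infected: yes"))

# Alice fetches and decrypts locally; Grace saw only random-looking bytes.
msgs = [keystream_xor(key, ct) for ct in grace.fetch(box_id)]
assert msgs == [b"infected: yes"]
```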
Contact tracing can be viewed as a problem of secure communication between pairs of users who came into contact in the physical world. The communication patterns of who is sending messages to whom can reveal each individual’s contact history to the service provider (Grace). This notion is known as metadata privacy leakage in computer security, where the metadata associated with a message (e.g. sender/recipient and time) is considered sensitive, in addition to the actual message contents. In the contact tracing case, such metadata could reveal who has been in contact with whom, potentially revealing the users’ sensitive activities. We believe that recent technical advances [23; 46; 5] for designing scalable private messaging systems with metadata privacy present a promising path for developing a similar platform for secure contact tracing.
Following recent works, our idea is to leverage a ‘mix network’, which is a routing protocol that uses a chain of proxy servers (Frank/Fred) that individually shuffle the incoming messages before passing them on to the next server, thereby decoupling the sender of each message from its destination; these types of mix networks are perhaps best known for being the basis of the Onion Router/Tor anonymity network. This is a more sophisticated use of mixing servers than described in Section 4.3 for the polling-based solution. When Bob wishes to send his encrypted message to Alice, he first encrypts it multiple times with public keys corresponding to each of the servers in the mix network. Because the messages are encrypted in multiple layers, and each server peels only the outermost layer, the final destination (Alice’s mailbox) is revealed only to the last server, and only Alice can read the content of the message (i.e. infection status). To prevent Grace from learning the identity associated with each mailbox, Alice can also access her mailboxes through the mix network, which shuffles the traffic to decouple the mailboxes from their owners. As long as one of the servers is neither breached nor controlled by the adversary, the final message cannot be linked to a specific sender even if the adversary has full control of the rest of the network. Such a system for private communication could allow the users (Bob) to share their infection status with their recent contacts (Alice) while hiding the metadata of their contact patterns from the service providers. The involvement of non-government entities, such as an academic institution or a hospital, in the mix network may help increase users’ trust in the system and lower the bar for adoption.
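The layered (‘onion’) encryption through Frank and Fred can be sketched as follows. For a runnable stdlib-only toy we again use a hash-based XOR keystream in place of real public-key encryption, and the route/message formats are invented; a deployment would use an authenticated hybrid scheme:

```python
# Toy onion routing: each hop peels one encryption layer, learning only
# the next hop. The XOR keystream is a toy stand-in for real encryption.
import hashlib
import json
import secrets

def xor(key, data):
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def wrap(payload, route):
    """Wrap payload in one layer per hop; route = [(hop_key, next_hop)]."""
    onion = payload
    for hop_key, next_hop in reversed(route):
        layer = {"next": next_hop, "body": onion.hex()}
        onion = xor(hop_key, json.dumps(layer).encode())
    return onion

def peel(hop_key, onion):
    """A hop decrypts its layer, learning only the next hop and a blob."""
    layer = json.loads(xor(hop_key, onion))
    return layer["next"], bytes.fromhex(layer["body"])

frank_key, fred_key = secrets.token_bytes(32), secrets.token_bytes(32)
route = [(frank_key, "fred"), (fred_key, "mailbox:alice")]
onion = wrap(b"infected: yes", route)

hop1, onion = peel(frank_key, onion)   # Frank learns only "send to Fred"
assert hop1 == "fred"
hop2, payload = peel(fred_key, onion)  # Fred learns only the mailbox id
assert (hop2, payload) == ("mailbox:alice", b"infected: yes")
```

Neither Frank nor Fred alone can link Bob to Alice’s mailbox: Frank sees the sender but not the destination, and Fred sees the destination but not the sender.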
There are several remaining issues that will need to be addressed for this system to be widely adopted. First, if time-varying IDs are used, then the user receiving a token from a nearby person could infer the identity of the sender based on their travel history; i.e. Alice might be able to infer who Bob is based on the time they exchanged the tokens, as described in Section 4.4 in the case where the database is made public. This loss of privacy from contacts can be partially alleviated by choosing a less frequent token refresh, so that with high likelihood, Alice cannot completely identify Bob by the time interval. Actual implementations must decide on the right trade-offs between Alice’s and Bob’s privacy from each other and from the authorities, as well as contact tracing effectiveness. Another possible way to mitigate this problem would be to aggregate the messages for Alice on the server before making the results available to her. The messages are encrypted under different public keys, but it may be possible to use multi-key homomorphic encryption schemes [25; 4], which allow computation over ciphertexts encrypted with different public keys, to sum up the count of ‘infected’ messages. We defer the details of this approach to future work.
One other issue is that the volume of messages delivered to each user may reveal how socially active each user has been, which could be considered sensitive by some users. Flattening the distribution with dummy messages could alleviate this concern, though doing so may lead to scalability challenges for existing private messaging systems. Though many techniques [23; 46; 5] have been proposed to address this challenge, further discussion among the stakeholders is needed to determine the suitable trade-off between the level of latency that can be tolerated and the level of privacy guarantees desired by the users. Ultimately, though, private messaging systems enable provable privacy from the authorities while still maintaining the usefulness of contact tracing.
6 Strategies for Encouraging Widespread Adoption
Contact tracing apps depend on network effects and critical mass to work. Having the app go ‘viral’ requires that people trust the app enough to install it and are enthusiastic enough to convince their friends to do the same. After all, app adoption must have a higher ‘transmission rate’ than the virus itself for the app to be effective. Providing strong privacy guarantees would likely encourage voluntary adoption. Any app needs to clearly explain its privacy guarantees in ways understandable by the average user; this was our motivation for describing here the different types of privacy (from snoopers, contacts, and the authorities) that an app should provide to users in order to earn their trust.
On that note, we believe it is imperative for any app to be open source and audited by both security professionals and privacy advocates. This is not yet true for TraceTogether, but the app’s creators do claim that they will release the source code soon. Furthermore, open sourcing allows different countries to customize such apps for their particular use cases and cultural preferences.
Also, while in some countries it may be difficult to enforce a government mandate that all residents install an app, it is possible to have this as a requirement for entering certain public places. Such a practice has precedent in so-called implied consent laws, such as agreeing to field sobriety tests when getting a driver’s license. One could imagine grocery stores, schools, and universities requiring installation of a contact tracing app as a precondition for entry. This does not stop users from uninstalling or turning off the app off-premises, but it would at least be useful in getting people over the initial activation barrier of installation.
Finally, some amount of social pressure may also assist in reaching widespread adoption. Contact tracing apps, by design, know how many other people close by have the app installed. An app could display that number. Given this knowledge, a user may be incentivized to attempt to persuade others nearby to install the app, in the interest of public health.
7 Discussion

In this document, we discuss ways to build an app for contact tracing, based upon the premise that phones can broadcast tokens to all nearby phones. Notably, we do not address the engineering behind applying Bluetooth to enable such a feature. Nor do we address the possibility of location data collection for assisting epidemiologists in forecasting disease spread. We also do not discuss appropriate selection of the token refresh interval and the frequency at which phones should poll for nearby ones, which are important factors for balancing privacy and efficiency—stale IDs have been seen to permit linkage attacks in other similar contexts. Lastly, we also do not build a full model for privacy of contact tracing, which is a delicate and easy-to-get-wrong task that requires much more careful research. Instead, we focus only on the privacy implications of a dedicated contact tracing app, in the hopes that providing sufficiently strong privacy guarantees would assist an app in gaining the critical mass needed to be effective.
Note that here we only discuss direct contact tracing using Bluetooth proximity networks, without using any location data. Some indirect proposals for contact tracing instead securely log the user's location history, which is then given to the authorities if the user is diagnosed with COVID-19. This approach has the benefit of not requiring network effects, because individuals can track their own locations without needing their contacts to have the app. Logging location history is inherently less private than direct contact tracing, though this may be mitigated with appropriate safeguards and redactions. Furthermore, hybrid approaches involving both GPS data and Bluetooth proximity networks may prove useful to public health officials in modeling disease spread beyond contact tracing alone.
We first discussed how, with only minor modifications, a polling-based direct contact tracing solution allows for some anonymity from the authorities, which is lacking in TraceTogether, the Singaporean Ministry of Health's app. We believe this may help an app succeed in countries such as the U.S., where many citizens are loath to give too much data to the government.
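The key property of the polling-based design can be sketched in a few lines: the client downloads the published list of tokens reported by diagnosed users and intersects it locally with the tokens it has overheard, so the server never learns which (if any) tokens this client matched. The function name and the byte-string tokens below are illustrative.

```python
def check_exposure(observed_tokens, published_infected_tokens):
    """Return the overheard tokens that appear on the published infected list.

    The match happens entirely on the user's device: the server only serves
    the public list and never learns which tokens this client has seen.
    """
    return set(observed_tokens) & set(published_infected_tokens)

# Usage with illustrative tokens:
seen = [b"tok-a", b"tok-b", b"tok-c"]
infected = [b"tok-b", b"tok-z"]
matches = check_exposure(seen, infected)  # {b"tok-b"} -> warn the user
```

The residual leakage is on the download side: the authorities still see who polls the list and when, which is what motivates the mixing and private-messaging approaches discussed next.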
Even the polling-based solution still reveals quite a bit of information to the authorities, who could use linkage analysis to track individual users. However, adding mixing servers is relatively practical and does provide additional protection. Alternatively, a system can follow the South Korean model of openly publishing data about patients diagnosed with COVID-19, trading off some of their privacy to enhance the privacy of individuals trying to determine whether they have been exposed.
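The core operation of a mix server is simple enough to sketch. This is a deliberately minimal illustration, not a complete mixnet: a real deployment would also wait for a minimum batch size and cryptographically re-encrypt each report so that inputs and outputs cannot be matched by content.

```python
import random

def mix_batch(reports, rng=random):
    """Shuffle a batch of uploaded reports before forwarding them onward.

    Randomizing the output order severs the link between a report's
    position in the output and its arrival order (and hence its sender).
    """
    batch = list(reports)
    rng.shuffle(batch)
    return batch
```

Chaining several independently operated mix servers means that every operator would have to collude for the authorities to trace a report back to the phone that uploaded it.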
However, if we are willing to invest additional computational resources, it is possible to achieve increased privacy from snoopers, contacts, and the authorities, and we propose the beginnings of one approach using private messaging systems, which we hope will be expanded upon in future work. This approach is more computationally expensive, but would assure users that they need not give up their privacy in order to take part in public contact tracing efforts. Indeed, the chief selling point is that users would learn about their exposure without having to trust any individual third party with their private location or medical information. We believe such a guarantee would go a long way toward mass adoption of a contact tracing app in the United States.
Future work remains to actually build such an app, of course, and additional engineering, security, and policy considerations are sure to arise. For example, the scalability of the server-side data structures may become a major issue as the number of infected individuals rises. One additional concern we have not addressed is that of nefarious actors seeking to spread panic by falsely claiming to be infected. This could be prevented by allowing only hospital workers to trigger the broadcast of infection status, as in Singapore's system, where the Ministry of Health directly contacts those exposed, though that of course trades away some of the privacy of diagnosed patients. Alternatively, others have proposed cryptographic verification of contact events, which could perhaps be extended to infection-event broadcast without giving the authorities direct access to tokens. However, given that some cities are already rationing testing kits and doctors' visits to only the most serious cases [9; 20], restricting self-reporting might cause many instances of virus spread to be missed. Alternatively, the system can separate self-reports from confirmed reports by simply keeping two databases.
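The two-database separation at the end of the paragraph above is straightforward to sketch. All names here are hypothetical; the point is only that the server publishes the two lists separately, letting clients weigh a self-reported match more cautiously than a provider-verified one.

```python
# Hypothetical server-side state: provider-verified reports are kept
# apart from unverified self-reports.
confirmed = set()
self_reported = set()

def submit_report(token, verified_by_provider):
    """File an infection report under the appropriate database."""
    (confirmed if verified_by_provider else self_reported).add(token)

def published_lists():
    """Return both token lists; clients may, e.g., warn strongly only on
    confirmed matches and show a softer notice for self-reported ones."""
    return {
        "confirmed": sorted(confirmed),
        "self_reported": sorted(self_reported),
    }
```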
Our goal in writing this document is to start a conversation on (1) what kinds of privacy trade-offs people are willing to endure for the sake of public health, and (2) the fact that with sufficient computational resources and use of cryptographic protocols, app-based contact tracing can be accomplished without completely sacrificing privacy. Because bad early design choices can persist long after roll-out, we hope that developers and policy-makers will give privacy considerations careful thought when designing new contact tracing apps.
We would like to thank David Rolnick, Adam Sealfon, Noah Daniels, and Michael Wirth for helpful comments.
- (2019) January 20, 2019.
- (2020) Coronavirus: Sacramento County gives up on automatic 14-day quarantines. https://www.npr.org/sections/health-shots/2020/03/10/813990993/coronavirus-sacramento-county-gives-up-on-automatic-14-day-quarantines. Accessed 2020-03-23.
- (1981) Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM 24 (2), pp. 84–90.
- (2019) Efficient multi-key homomorphic encryption with packed ciphertexts with application to oblivious neural network inference. In Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, pp. 395–412.
- (2015) Riposte: an anonymous messaging system handling millions of users. In 2015 IEEE Symposium on Security and Privacy, pp. 321–338.
- (2020) COVID Watch. https://covid-watch.org/.
- (2012) Fast and private computation of cardinality of set intersection and union. In International Conference on Cryptology and Network Security, pp. 218–231.
- (2010) Practical private set intersection protocols with linear complexity. In International Conference on Financial Cryptography and Data Security, pp. 143–159.
- (2020-03) L.A. County gives up on containing coronavirus, tells doctors to skip testing of some patients. Los Angeles Times.
- (2006) Calibrating noise to sensitivity in private data analysis. In Theory of Cryptography Conference, pp. 265–284.
- (2014) The algorithmic foundations of differential privacy. Foundations and Trends in Theoretical Computer Science 9 (3–4), pp. 211–407.
- (2003) Contact tracing and disease control. Proceedings of the Royal Society of London. Series B: Biological Sciences 270 (1533), pp. 2565–2571.
- (2004) Efficient private matching and set intersection. In International Conference on the Theory and Applications of Cryptographic Techniques, pp. 1–19.
- (1987) How to solve any protocol problem. In Proceedings of STOC.
- (2012) The devil is in the metadata—new privacy challenges in decentralised online social networks. In 2012 IEEE International Conference on Pervasive Computing and Communications Workshops, pp. 333–339.
- (2020-03) Help speed up contact tracing with TraceTogether. Singapore Government Blog.
- (2000-12) HIPAA Privacy Rule.
- (2020-03) How COVID-19 spreads. Centers for Disease Control and Prevention.
- (2011) Is state power to protect health compatible with substantive due process rights. Annals Health L. 20, pp. 113.
- (2020-03) Health officials in New York, California restrict coronavirus testing to health care workers and people who are hospitalized. The Philadelphia Inquirer.
- (2020-03) A 'travel log' of the times in South Korea: mapping the movements of coronavirus carriers. The Washington Post.
- (2005) Privacy-preserving set operations. In Annual International Cryptology Conference, pp. 241–257.
- (2020) XRD: scalable messaging system with cryptographic privacy. In 17th USENIX Symposium on Networked Systems Design and Implementation (NSDI 20), pp. 759–776.
- (2020-03) Taiwan's new 'electronic fence' for quarantines leads wave of virus monitoring.
- (2012) On-the-fly multiparty computation on the cloud via multikey fully homomorphic encryption. In Proceedings of the Forty-Fourth Annual ACM Symposium on Theory of Computing, pp. 1219–1234.
- (2012) Theoretical results on de-anonymization via linkage attacks. Transactions on Data Privacy 5 (2), pp. 377–402.
- (2020) Coronavirus cases have dropped sharply in South Korea. What's the secret to its success?. https://www.sciencemag.org/news/2020/03/coronavirus-cases-have-dropped-sharply-south-korea-whats-secret-its-success. Accessed 2020-03-23.
- (2020-03) Notification of enforcement discretion for telehealth remote communications during the COVID-19 nationwide public health emergency.
- (2020-03) Novel Coronavirus Map from HealthMap.
- (2018) Forecasting the spatial transmission of influenza in the United States. Proceedings of the National Academy of Sciences 115 (11), pp. 2752–2757.
- (2020-03) Virus hits Europe harder than China. Is that the price of an open society?. New York Times.
- (2020-03) Cryptographically secure contact tracing.
- (2020) Private Kit: Safe Paths - can we slow the spread without giving up individual privacy?. http://safepaths.mit.edu/. Accessed 2020-03-23.
- (2020) Apps gone rogue: maintaining personal privacy in an epidemic. arXiv preprint arXiv:2003.08567.
- (1998) Anonymous connections and onion routing. IEEE Journal on Selected Areas in Communications 16 (4), pp. 482–494.
- (2018) Carpenter v. United States. Supreme Court of the United States (16-402).
- (2002) RFID systems and security and privacy implications. In International Workshop on Cryptographic Hardware and Embedded Systems, pp. 454–469.
- (1949) Communication theory of secrecy systems. Bell System Technical Journal 28 (4), pp. 656–715.
- (1979) Symmetric and asymmetric encryption. ACM Computing Surveys (CSUR) 11 (4), pp. 305–330.
- (2012) Deanonymizing mobility traces: using social network as a side-channel. In Proceedings of the 2012 ACM Conference on Computer and Communications Security, pp. 628–637.
- (2002) k-anonymity: a model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10 (05), pp. 557–570.
- (2020-03) Coronavirus: Israel enables emergency spy powers. BBC News.
- (2020) Can I say no to uploading my TraceTogether data when contacted by the Ministry of Health?. https://tracetogether.zendesk.com/hc/en-sg/articles/360044860414-Can-I-say-no-to-uploading-my-TraceTogether-data-when-contacted-by-the-Ministry-of-Health-. Accessed 2020-03-23.
- (2020) How does TraceTogether work?. https://tracetogether.zendesk.com/hc/en-sg/articles/360043543473-How-does-TraceTogether-work-. Accessed 2020-03-23.
- (2017) Stadium: a distributed metadata-private messaging system. In Proceedings of the 26th Symposium on Operating Systems Principles, pp. 423–440.
- (2015) Vuvuzela: scalable private messaging resistant to traffic analysis. In Proceedings of the 25th Symposium on Operating Systems Principles, pp. 137–152.
- (1995) Methods used in studies of drink-drive control efforts: a meta-analysis of the literature from 1960 to 1991. Accident Analysis & Prevention 27 (3), pp. 307–316.
- (2020) Response to COVID-19 in Taiwan: big data analytics, new technology, and proactive testing. JAMA.
- (2020-03) 620,000 people installed TraceTogether in 3 days, S'pore's open source contact tracing app. Mothership.