As the world looks for a way to manage the spread of the coronavirus while getting back to a more familiar way of living, how can apps provide a solution that is both effective and respects the privacy of all citizens?
No app has attracted as much attention in recent weeks as the coronavirus app, described as a powerful tool to curb the spread of COVID-19 so we can return to normal life. Not every country is ready for it yet. A try-out by the Dutch government made clear that essential elements were missing: a clear set of problems to be solved and a thorough explanation of why technology is the right answer, developed through a process of checks and balances.
After a tender squeezed into four days and public demos by the short-listed contenders over a single weekend, the Netherlands decided to press the reset button in the face of strong criticism from technical, privacy and security experts and from the privacy authority.
Not only in the Netherlands but all over the world, governments, tech giants, scientists, healthcare organizations, privacy and security experts, the European Commission and regulators stumble in their efforts to launch a coronavirus app that can unlock society as soon as possible. The palpable sense of urgency and the desire to act quickly may work for emergency lockdowns that prevent new outbreaks; carrying that mindset into the development of new technology, however, means entering dangerous territory.
Can we even begin to imagine the cost of ignoring the democratic process, of failing to uphold fundamental human rights, or of harmful side effects that could override Europe’s strong appetite for privacy?
So far, most European countries have indicated they want to protect public health by monitoring the spread of the coronavirus via an app. But details are essential: citizens need to visualize the next steps in this journey and see where the boundaries lie in return for their trust in technology.
The GDPR and national health care legislation applicable in the corona crisis can draw these lines.
Trust and confidence are key triggers for the fast and effective adoption of a coronavirus app.
To achieve this, we need answers to important questions first:
Can we still enter public spaces and travel to other countries without using the app at all?
Should we immediately call in sick and remain absent from work after receiving a tracing alert?
Is there enough capacity to test people who show mild symptoms?
How do we prevent the app from turning into a mass surveillance tool?
Before moving to the next phase of development – and well in advance of downloading the app onto our phones and welcoming it into our lives – we first need to know:
How are social benefits – such as public health protection – designed into the app functionality?
Which technology and embedded controls underpin this vision while protecting constitutional principles such as privacy?
The coronavirus app could be a tool to achieve meaningful purposes and deliver public benefits where humans lack the knowledge or capacity to do so themselves. Looking at the different coronavirus apps around the globe, we can see they offer a range of features:
Self-diagnosis and symptom reporting by citizens with follow-up from the healthcare system (further diagnosis, treatment advice, self-isolation).
Contact tracing between individuals who have tested positive and those who are potentially infected.
Monitoring the effectiveness of social distancing and quarantine measures.
The most intense debates focus on contact tracing functionality, as this can easily slide into ‘big brother is watching you’ scenarios. Such setups could involve authorities extensively tracking and profiling citizens, and even penalizing them for violating quarantine rules, with the protection of society used as justification.
Asian countries have already demonstrated extensive governmental interference in people’s daily lives. Examples include measuring the body temperature of every passenger at stations, checking someone’s health score before allowing entry to public buildings and publishing online warnings that detail the personal activities of individual COVID-19 patients during their incubation period: taking the subway, visits to vet clinics and restaurants, even their cinema seat numbers.
To me, it is obvious that Europe can’t simply copy this approach of technology-driven control of the coronavirus. Instead, it must formulate its own plan, adjusted to its own legal standards.
There is a strong public tendency at the moment towards exchanging privacy for more freedom, economic values and fewer confinement measures. But privacy, in the sense of protecting data, is not a currency we can simply exchange for the reopening of society: it is a fundamental human right.
This does not necessarily mean that someone’s individual data should remain private and hidden at all times. However, using personal data to fight the spread of the coronavirus should not lead to outcomes that violate other constitutional rights or harm citizens through discrimination or bias. There is no conflict between privacy and public health protection; they are both essential ingredients.
Cassandra Moons, Sr. Privacy Legal Counsel and Data Protection Officer, TomTom
Contact tracing technology that claims to preserve privacy avoids tracking people by not including sensitive (GPS) location data that reveals someone’s exact location. Instead, phones can transmit Bluetooth signals to ‘recognize’ each other within a certain distance.
Let’s assume someone has tested positive for the coronavirus. This event triggers a process that notifies potentially infected contacts without revealing the patient’s identity. Bluetooth doesn’t seem to be the ideal solution though, as the technology can create a false sense of security: someone could miss notifications due to signal interference or incompatible devices, or receive too many alerts, since Bluetooth signals travel through protective barriers like walls and windows.
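The decentralized flow described above can be sketched in a few lines. Everything here is illustrative: the identifier format, the 14-day retention window and all function and class names are assumptions for this article, not the API of any real exposure-notification framework.

```python
import os

# Assumed retention window: contact events older than the incubation
# period are deleted, so data never outlives its purpose.
INCUBATION_DAYS = 14


def new_ephemeral_id():
    """A random, regularly rotated identifier broadcast over Bluetooth.
    Rotation prevents long-term tracking of any single phone."""
    return os.urandom(16)


class ContactLog:
    """Phone-local store of observed identifiers (decentralized model)."""

    def __init__(self):
        self._seen = {}  # ephemeral ID -> day number when last observed

    def record(self, eid, day):
        """Store an identifier 'recognized' by a nearby phone."""
        self._seen[eid] = day

    def prune(self, today):
        """Drop contact events older than the incubation period."""
        self._seen = {e: d for e, d in self._seen.items()
                      if today - d <= INCUBATION_DAYS}

    def exposed(self, published_infected_ids):
        """Match the local log against identifiers voluntarily published
        by confirmed patients; the matching happens on the device only."""
        return any(e in published_infected_ids for e in self._seen)
```

The key privacy property of this design is that raw contact data never leaves the phone; only patients who test positive publish their own identifiers, and every other device checks for matches locally.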
What about the data model?
A decentralized model where data is only stored on a device and removed after the incubation period minimizes data processing, is more secure and, of course, more privacy-friendly. However, is this the most useful solution?
Organizations that can’t access data stored on a central server can’t create valuable aggregated insights for case studies. Nor can they proactively reach out to warn people who haven’t used their phone for a long time.
Another question looks at the level at which technology should work. In the near future, technology might be able to support seamless decentralized contact tracing via Bluetooth between iOS and Android devices.
However, if the second step is to implement contact tracing in the phone’s operating system, this could mean that your phone can always find contacts, whether or not a coronavirus app is installed. Is tracing technology still voluntary when you uninstall the app but forget to disable contact tracing in the operating system?
When mobility starts to increase again, people risk confusion when they receive a warning that they have been in the vicinity of a COVID-19 patient. One might wonder what happened exactly: were they sitting too close to their neighbor’s phone while watching TV, or to the driver in the next lane at a traffic light?
Introducing more context in alerts – such as #station, #school or #supermarket – can increase the effectiveness of the app, though with the possible downside of making it easier to identify the COVID-19 patient.
On closer inspection, it could also mean reintroducing GPS technology to pinpoint the exact location of a contact event. This raises privacy sensitivity and must be mitigated by appropriate safeguards that prevent the reidentification of individuals as much as possible.
GPS data is also more accurate than having people scan QR codes displayed in crowded areas such as restaurants, trains and concert venues. The latter cannot tell people whether they actually came too close to a COVID-19 patient. Another factor to consider is that QR codes might not work in open spaces like parks or shopping streets.
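To make the accuracy trade-off concrete, here is a minimal sketch of a GPS-based proximity check using the haversine great-circle formula. The two-meter threshold and the function names are assumptions for illustration, not taken from any real contact-tracing system.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    p1, p2 = radians(lat1), radians(lat2)
    a = (sin((p2 - p1) / 2) ** 2
         + cos(p1) * cos(p2) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))


def within_contact_distance(fix_a, fix_b, threshold_m=2.0):
    """Hypothetical proximity rule: flag a contact event if two fixes
    fall within ~2 meters of each other. Note that consumer GPS is
    typically accurate only to within several meters, so a rule like
    this is noisy in practice."""
    return haversine_m(*fix_a, *fix_b) <= threshold_m
```

That inherent GPS noise, on top of the privacy sensitivity of location data, is part of why the debate keeps returning to Bluetooth-based proximity instead.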
As an alternative to GPS, Bluetooth contact tracing could also work like a parking sensor. To help prevent infections, this approach would allow people to immediately tag and save a contact event, or dismiss it when it comes from, say, a person standing outside their window.
Whichever option we choose in the end, we will not find adequate technology without evaluating use cases that provide context, and without using data ethics as a balancing factor to find the right approach.
Another complex dilemma is whether or not we want to execute ‘smart’ lockdowns with less impact in the near future. Medical experts are very clear that the coronavirus will come in waves until we have a vaccine, and data may provide fruitful insights for authorities to take more differentiated confinement measures. For example, they could close only those areas and places that have been validated as infection hotbeds.
To map these hotbeds, we could deploy geofencing technology to measure whether or not a contact event took place within a predefined geographical area such as a park, shopping mall or station.
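Such a geofence check can be sketched minimally as follows, assuming a circular area for simplicity; the class and parameter names are illustrative, not from any real geofencing SDK.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    p1, p2 = radians(lat1), radians(lat2)
    a = (sin((p2 - p1) / 2) ** 2
         + cos(p1) * cos(p2) * sin(radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))


class CircularGeofence:
    """A predefined area (park, shopping mall, station) modeled as a
    circle around a center point."""

    def __init__(self, center_lat, center_lon, radius_m):
        self.center = (center_lat, center_lon)
        self.radius_m = radius_m

    def contains(self, lat, lon):
        """True if a contact event's coordinates fall inside the area."""
        return haversine_m(self.center[0], self.center[1],
                           lat, lon) <= self.radius_m
```

Counting contact events per fence, rather than storing individual tracks, would let authorities see which areas are hotbeds while keeping the data aggregated.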
Dispatching test capacity and equipment to regions with high numbers of contact events to prepare for new outbreaks is another relevant use case where data can play a pivotal role.
In these unprecedented times of fighting a pandemic, we can steer the debate in society onto the right track by taking a closer look at the problems inevitably linked to protecting public health after the lockdowns.
We should also consider what data citizens want to control and how public health protection can start to dance with legal principles and human rights such as privacy.
Assessing use cases and digital ethics remains critical – like an orchestra playing the music to dance to – while statutory buffers such as the GDPR lay out the dance floor.
Another relevant topic to think about is how we want to manage ‘smart’ lockdowns in future outbreak rounds with less harmful effects on the economy and society as a whole.
Most importantly, it should be clear to every citizen why a coronavirus app can make this all work.
Want to learn more about privacy in the context of location technology?
Get in touch.