Magistrates, lawyers, data protection specialists and digital law experts warn of the risks that an application for tracing patients and their contacts poses to individual freedoms. These specialists propose a list of specific guarantees.
Prime Minister Edouard Philippe announced on Tuesday, April 28, his strategy for progressive deconfinement, the implementation of which is to begin on May 11, after almost two months of strict confinement intended to stem, as far as possible, the coronavirus epidemic, which has left more than 20,000 dead in France. He postponed the debate and the vote on an application for tracing patients and their contacts, which has not yet been finalized.
While the government is considering the implementation of the StopCovid mobile application, magistrates, lawyers, data protection officers and recognized experts in labor law and data protection warn of the risks involved and set out the guarantees that such an application – or any similar device likely to be implemented in the medium term – will necessarily have to provide in terms of data protection.
They express themselves here freely.
In view of the upcoming deconfinement of the French population, many technological projects are being studied with the aim of stemming the spread of the pandemic. Whether their stated objective is to map the spread of the virus or to list and report cases of infection, the systems envisaged are based on tracking people, collectively or individually, in particular through the use of data from mobile devices (mobile phones, tablets) and questionnaires relating to the health of the persons concerned.
The use of these devices, on such a scale and in such a context, is unprecedented. Their relevance gives rise to a lively debate between, on the one hand, the defenders of personal data protection and, on the other, the promoters of an innovative technical response to the epidemic risk. These two objectives are not mutually exclusive. Certainly, neither the right to the protection of personal data nor the right to privacy is absolute. However, taking them into account is a necessity in a democratic society.
The implementation of tracing devices cannot therefore be envisaged without strong guarantees, given the risks involved.
We would like to draw attention in this regard to two major risks:
• Risk of unreliability: technology is never a miracle solution. As with any innovative technology, the risk of a “false positive” must be scrupulously assessed, that is to say the risk that the machine is mistaken (this has already been demonstrated, for example, with regard to facial recognition), which could lead – as the Cnil noted in its opinion of April 24 – to people wrongly identified as at risk being subjected to unjustified restrictions on their individual freedoms. Similarly, increased vigilance is required when an application relies on a technique known as data anonymization. Many such devices have shown in the past that there was a risk of the persons concerned being re-identified without their knowledge, as was the case in 2015 with a project to quantitatively estimate pedestrian flows on the La Défense esplanade.
Regarding the StopCovid application, the aforementioned opinion of the Cnil confirms that the data collected will not be anonymized, but only “pseudonymized”, that is to say that the re-identification of individuals will remain possible. A prior assessment of the reliability of the system and of the risk of re-identification must therefore guide technological choices.
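The distinction the Cnil draws can be illustrated with a minimal sketch (the identifier, the secret salt and the figures below are purely hypothetical, and this is in no way the actual StopCovid protocol): pseudonymization replaces an identifier with a code, but whoever holds the secret can still link every record about the same person together, whereas anonymization keeps only aggregates from which no individual can be singled out.

```python
import hashlib

# Hypothetical server-side secret; anyone holding it can re-link
# pseudonyms across records, so the data is pseudonymized, not anonymous.
SECRET_SALT = b"hypothetical-server-secret"

def pseudonymize(identifier: str) -> str:
    """Replace an identifier with a stable pseudonym (salted SHA-256)."""
    return hashlib.sha256(SECRET_SALT + identifier.encode()).hexdigest()

# The same person always yields the same pseudonym, so all records
# about that person remain linkable -- the basis of re-identification risk.
assert pseudonymize("user-A") == pseudonymize("user-A")
assert pseudonymize("user-A") != pseudonymize("user-B")

# Anonymized data, by contrast, would retain only aggregates
# (hypothetical counts) with no per-person record at all:
contacts_per_day = {"2020-04-28": 1432, "2020-04-29": 1519}
```

The design point is that pseudonymization is reversible in practice for whoever controls the mapping, which is why the GDPR still treats pseudonymized data as personal data.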
• Risk of misuse of purpose: as the National Pilot Digital Ethics Committee noted in a note dated April 7, “the collection and processing of data in order to ensure monitoring could present a significant risk of arbitrariness, in particular of misuse, extension of access or extension of purposes, whether by public authorities or private actors”. The Cnil has always recalled its great vigilance with regard to very high-volume files, known as “population files”, which can be misused, leading to a risk of stigmatization, or even exclusion, of the persons concerned. What, for example, would be the consequences of the computer exploitation of lists of names of people likely to have been infected by the virus?
The precise identification of the objectives pursued by these tracking devices and the implementation of guarantees, including legal ones, aimed at preventing the risk of unintended uses are also essential.
To this end, we call for the objectives pursued by any tracking technology to be anchored precisely and clearly in a specific legal basis aimed at guaranteeing respect for the following rights and principles:
• No discrimination. The principle of voluntary use must be established and accompanied by a ban on discriminatory measures against people who choose not to submit to tracking devices or who cannot do so for technical reasons.
• Integrity and confidentiality. The highest security standards are applied to guarantee the integrity and confidentiality of data, in particular concerning the encryption algorithms used, as well as the conditions for hosting centralized data.
• Right to erasure. Tracing technologies are only implemented for a period limited to what is strictly necessary, while respecting the right to be forgotten.
• Data protection by design and by default. The measures intended to ensure that data protection is taken into account from the design stage and by default are identified beforehand and incorporated into the specifications used to develop the tracking technologies.
• Transparency. Measures to ensure respect for the rights of individuals, in particular prior information on the exact conditions of use of the data, the right of access and the right to erasure, are identified and explained beforehand.
• “Accountability”. The operators authorized to process the data are clearly identified, the division of responsibilities between them is determined, and a data protection impact assessment (DPIA) is drawn up and made public before any deployment of tracking technologies, with periodic reassessment of their effectiveness in achieving the objective pursued.
• Independent control. The authorities responsible for ensuring the defense of rights and freedoms (Cnil, but also Defender of rights, Anssi, etc.) are systematically consulted in advance, and associated throughout the duration of the implementation of the tracking solutions. Their opinions must be made public.
• Right to an effective remedy. The system must be subject to independent judicial review.
As the Cnil requests in its opinion, it is important, in our view, that the regulator be consulted again so that it can deliver a precise and detailed opinion on the exact contours – which are not currently known – of the StopCovid application.
In the fierce struggle waged by the public authorities, doctors and researchers to overcome the disease, it is our collective responsibility not to give up the fundamental right to the protection of personal data enshrined in the Charter of Fundamental Rights of the European Union and guaranteed by the European data protection regulation (GDPR) as well as by the French Data Protection Act. Faced with the Covid-19 crisis, it is the fragile balance between security, freedom and the right to be forgotten that it is up to us to preserve. The protection of our health and of our economy should not, in this context, come at the expense of our privacy.
The signatories: Guillaume DESGENS-PASANAU, magistrate, associate professor at Cnam, former jurist of the Cnil; Nathalie METALLINOS, lawyer, former lawyer of the Cnil; Jeanne BOSSI-MALAFOSSE, lawyer, former lawyer of the Cnil; Elise LATIFY, legal director and consultant, former lawyer of the Cnil; Alexandra GUÉRIN-FRANÇOIS, personal data protection consultant, former lawyer of the Cnil; Stéphane PETITCOLAS, GDPR consultant, external DPO and former expert engineer of the Cnil; Xavier LEMARTELEUR, legal information technology manager, former lawyer of the Cnil; Odile JAMI-CASTON, director of personal data protection, former lawyer of the Cnil; Myriam QUEMENER, magistrate; Michel MINÉ, professor at Cnam; Dominique ROUX-ROSSI, emeritus university professor; Jérôme HUET, university professor; Fabrice NAFTALSKI, lawyer; Sophie REVOL, lawyer; Anne SENDRA, lawyer; Guillaume FLAMBARD, lawyer; Matthieu BERGUIG, lawyer; François COUPEZ, lawyer; Nathalie LAMBERT, former lawyer, secretary general, specialized in information technology and data protection; Sylvie ROZENFELD, editor-in-chief of the review Expertises.