Doubts about the usefulness of a Corona App remain; even decentralised variants involve considerable risks – FIfF presents DPIA update in English at https://www.fiff.de/…

The debate about the data-protection-compliant design of a corona app has intensified in recent days. The app digitally supports so-called "contact tracing", which aims to break COVID-19 infection chains by warning people who have been exposed to someone who tested positive. Initially, the German government's only goal was to introduce an app with a warning function for potentially infected persons; in the meantime, however, further purposes beyond tracing are being discussed, which would entail additional infringements of fundamental rights. However, there are still general doubts about the effectiveness of digital contact tracing for containing the pandemic, as the discussion about false positives caused, for example, by walls, masks or varying Bluetooth signal strengths shows. The accusation that pushing such a corona app project primarily signals political actionism, and the concern that the project might accustom the general population to future tracing or tracking projects by government bodies, have not yet been dispelled.
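
How such false positives and false negatives arise can be illustrated with a small sketch. The following Python snippet is purely illustrative and does not reproduce the code of any real corona app; the path-loss model and all parameter values are our own assumptions. It shows how a distance estimate derived from Bluetooth signal strength (RSSI) is thrown off by obstacles: a neighbour behind a thin wall may be registered as a close contact although there is no risk of infection, while a real close contact may be missed because a phone is carried in a pocket.

import math

# Purely illustrative sketch (not the code of any real contact tracing app):
# proximity is typically inferred from Bluetooth signal strength (RSSI) via a
# log-distance path-loss model. Walls, masks, pockets or bodies attenuate the
# signal and distort the estimate. All numeric values are assumptions.

TX_POWER_DBM = -59.0        # assumed RSSI at 1 m distance (device dependent)
PATH_LOSS_EXPONENT = 2.0    # assumed free-space propagation

def measured_rssi(distance_m, extra_attenuation_db=0.0):
    """RSSI predicted by the model, plus attenuation from obstacles."""
    return TX_POWER_DBM - 10 * PATH_LOSS_EXPONENT * math.log10(distance_m) - extra_attenuation_db

def estimated_distance(rssi_dbm):
    """What the app infers: it only sees the RSSI, not the true geometry."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

# Neighbours 1 m apart, but separated by a wall (~5 dB attenuation assumed):
# the app still registers a 'close contact' although the wall blocks any exposure.
print(estimated_distance(measured_rssi(1.0, extra_attenuation_db=5.0)))   # ~1.8 m -> false positive

# A real contact at 1.5 m, but the phone sits in a pocket (~15 dB assumed):
print(estimated_distance(measured_rssi(1.5, extra_attenuation_db=15.0)))  # ~8.4 m -> missed contact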

In the course of the current discussion about exit strategies from the 'stay at home' orders, the use of a corona app has come to be regarded as strategic in other countries and is now also being considered by the German government. The German Minister of Health, Jens Spahn, has recently switched his preference from a centralised architecture, which is riskier from a data protection point of view, to a decentralised model. Austria and Switzerland have already adopted the decentralised DP-3T implementation. With the publication of a DPIA, we pursue the goal of informing the discussion about the far-reaching consequences of these decisions and of contributing to making this app as data-protection-friendly as possible.

Materials regarding the DPIA (Creative Commons license: Attribution, CC BY 4.0 International):

One of the central questions relevant to data protection is: How is the purpose limitation of the overall system secured and enforced? How can misuse, especially by the operators, be prevented by technical, organisational and legal means? It will be decisive for the success of a data-protection-friendly corona app that its purpose is restricted solely to informing potentially infected persons. In our view, adding other purposes such as epidemiological studies, an immunity pass function or detailed quarantine monitoring poses disproportionate risks and infringements of fundamental rights and is therefore not justifiable.

The question of centralisation vs. decentralisation is of crucial importance for data protection for the following reason: In a centralised architecture, an almost 'omniscient' server coordinates all procedural activities; it collects all contact events from infected users and notifies persons at risk. In a decentralised architecture, by contrast, the server has no access to the contact events of users. It stores only non-identifying data indicating an infection. The apps themselves detect possible exposure events; the necessary calculations are performed on the devices of the respective users. If a government agency were given blanket access to the contact events of infected and non-infected persons, this would not only be a considerable violation of data protection, but also a collection of data that is simply not necessary for the purpose, i.e. a violation of the principle of data minimisation. "So far, the European Parliament, Germany, Austria, Ireland and Switzerland have spoken out in favour of a decentralised variant, whereas France still favours the centralised one. The FIfF urgently points out the danger that a centralised system will be followed by extensive possibilities for subsequent use, which generates considerable potential for abuse," warns Kirsten Bock from the FIfF.
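
The difference can be made concrete with a strongly simplified sketch. The following Python snippet is inspired by the decentralised idea but does not reproduce the actual DP-3T specification; the key derivation, data formats and names are illustrative assumptions. The point it demonstrates is that the server only ever republishes the secrets of users who tested positive, while the matching against locally recorded contacts happens entirely on each user's own device, so no central party learns who met whom.

import hashlib
import hmac
import os

# Strongly simplified sketch of the decentralised approach (inspired by DP-3T,
# not the actual protocol specification). Key derivation, ID lengths and the
# upload format are illustrative assumptions.

def ephemeral_ids(daily_secret, per_day=96):
    """Derive the rotating, non-identifying IDs a phone broadcasts during one day."""
    return [hmac.new(daily_secret, i.to_bytes(2, "big"), hashlib.sha256).digest()[:16]
            for i in range(per_day)]

# On Alice's phone: the secret never leaves the device unless she tests positive.
alice_secret = os.urandom(32)
alice_broadcasts = ephemeral_ids(alice_secret)

# On Bob's phone: the app locally stores the ephemeral IDs it overheard via Bluetooth.
bob_observed = {alice_broadcasts[42], os.urandom(16), os.urandom(16)}

# Alice tests positive and uploads only her daily secret; the server merely
# republishes it to all apps and learns nothing about her contacts.
published_secrets = [alice_secret]

# Decentralised matching, again on Bob's phone: the calculation that decides
# whether Bob is at risk runs locally, so the server never sees this encounter.
bob_at_risk = any(eid in bob_observed
                  for secret in published_secrets
                  for eid in ephemeral_ids(secret))
print(bob_at_risk)   # True -> Bob's app warns him locally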

A decentralised model is clearly preferable to a centralised one, but it is not free of serious data protection risks either. Therefore, the FIfF now presents a model data protection impact assessment (DPIA) for decentralised architectures. In doing so, we refer to a requirement under Art. 35 of the General Data Protection Regulation (GDPR), which is directed at the future controller of such data processing. The purpose of this model DPIA is to demonstrate, in a publicly accessible way, the risks for data subjects. "It needs to be underlined that the data protection risks also affect persons who do not use the app themselves," says Rainer Mühlhoff of FIfF e.V. Furthermore, with this document we present recommendations for the (re)design of the app and of the processing procedure, as well as protective measures addressing a whole list of possible weaknesses and attacks.

"With this DPIA, we have set a new standard that others whose data processing creates high risks for fundamental rights and freedoms have to meet from now on." comments Rainer Rehak from FIfF. "And we are also showing that DPIAs must be published as a matter of principle so that society can discuss these risks in an informed manner and exert pressure on those responsible to protect our basic rights when processing data," adds Jörg Pohle, also from FIfF.

With this DPIA, now completely available in English, we intend to enrich the pan-European discussion on data protection. Data protection, not privacy, is the guarantor for the protection of all fundamental rights in the digital age.

About the Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung e.V.

The Forum Computer Scientists for Peace and Societal Responsibility (German: Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung – FIfF) is a Germany-wide association of about 700 people who critically examine the effects of the use of computers and information technology in the digital society. Our members work predominantly in IT-related professions, from IT systems electronics engineers to professors of theoretical computer science. Since 1984, the FIfF has been working towards a reflective use of information technology for the benefit of society, in technical and non-technical areas alike. We pursue our goals through, for example, public discourse, policy consulting and technical studies. In addition, the FIfF publishes the quarterly journal "FIfF-Kommunikation – Journal for Computer Science and Society" and cooperates with other civil rights organisations and the peace movement.

Company contact and publisher of this press release:

Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung e.V.
Goetheplatz 4
28203 Bremen
Phone: +49 (421) 336592-55
Fax: +49 (421) 336592-56
http://www.fiff.de

Contact person:
Ingrid Schlagheck
FIfF Office
Phone: +49 (421) 336592-55
Fax: +49 (421) 336592-56
E-Mail: fiff@fiff.de
