Technology in the fight against coronavirus – 7 pillars of trust

Together with representatives of the digital industry, we have developed recommendations on how to build technology that inspires trust rather than fear of surveillance.

Technological innovations are a fascinating subject of study for us. Their development and implementation increase labor productivity and make better living conditions possible for more and more of us.

At the same time, “technology is not neutral” – it is created through a complex socio-technical process, and questions about what purposes a technology serves or what data it collects are legitimate. The coronavirus pandemic is prompting innovators to create solutions that help track the infected and protect the healthy, loosening the home quarantine regime of an entire society in favor of flexible digital social distancing. Poland’s Ministry of Digitization is also building such an application and making its code openly available.

Collecting data about society is not necessarily Big Brother surveillance or the biopolitics of a capitalist repressive apparatus. We believe the future belongs to the economy of the common intellect – the collective knowledge of society that enables rational decisions at the system level, for instance in health care. However, the privacy and dignity of the individual must not be abandoned in the name of security and development.

The following 7 pillars of trust are an expert appeal prepared by representatives of organizations in the digital industry, already signed by dozens of prominent experts.

Technology in the fight against coronavirus – 7 pillars of trust

Technology can help us in the fight against the coronavirus pandemic, but the fast-track implementation of new technological solutions must not become a gateway to violations of citizens’ rights and freedoms. Monitoring social interactions with technological tools will only work with strong support and voluntary cooperation from the people. Such cooperation requires a high level of citizens’ trust in the tool that the state will offer them.

To meet the need of the moment, we recall the most important principles for designing tools in a way that complies with the GDPR (RODO) and inspires public trust:

1. Minimization and correctness of data

To counter the pandemic, only the data necessary to fulfill the purpose set for a specific tool may be processed (e.g., if the purpose is to record a social interaction that poses a risk of contagion, it is sufficient to record the pair of devices that “met”, without their actual location). Furthermore, access to the data should be limited to the individuals and institutions that pursue this goal (e.g., a citizen using an app to track social interactions does not need to know who exposed him or her to infection; only the health services should receive such information).
Only accurate data about people should be used. This is especially important in location-based approaches: processing unverified data would create confusion, as users would receive false notifications of possible infection.
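
As a minimal sketch of this minimization principle, the record below stores only the pseudonymous identifiers of a pair of devices that “met”, plus the time and length of the contact – and deliberately no location. The field names are illustrative assumptions, not the schema of any particular application:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ContactEvent:
    """A minimized record of one risky social interaction.

    Contains no GPS coordinates, names, or phone numbers - only
    the rotating pseudonymous IDs of the two devices that met,
    and when and for how long the contact lasted.
    """
    own_ephemeral_id: bytes    # this device's rotating identifier
    peer_ephemeral_id: bytes   # the other device's rotating identifier
    observed_at: datetime      # time of contact (bounds the retention window)
    duration_seconds: int      # exposure length, an input to risk scoring
```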

2. Limitation of data retention time

Personal data should not be kept longer than necessary for the specific purpose it serves. For example, a record of a person’s social contacts is only needed for the standard period in which symptoms of infection appear (up to 14 days), and a quarantined person’s biometric data sent in an app report is only needed until the report has been verified as correct. Doctors, epidemiologists, and technical experts should be involved in setting maximum time limits.
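
A sketch of what such a retention limit could look like in code, reusing the ContactEvent record from the sketch under pillar 1; the 14-day ceiling is taken from the text above, and the final value is for doctors, epidemiologists, and technical experts to set:

```python
from datetime import datetime, timedelta

# Illustrative ceiling, matching the standard symptom window above.
MAX_RETENTION = timedelta(days=14)

def purge_expired(events: list, now: datetime) -> list:
    """Keep only ContactEvent records younger than the retention window."""
    cutoff = now - MAX_RETENTION
    return [e for e in events if e.observed_at >= cutoff]
```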

3. Data stored on the citizen’s device

Coronavirus tools should not lead to the creation of new databases stored on government servers. The principle should be to collect and process personal data on the device of the data subject. The transfer of data to an external server (e.g. one managed by the health services) should only take place in specific and justified cases (e.g. an identified infection incident), and the citizen should be informed of this necessity in advance.
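
A hedged sketch of that architecture: by default, records are written only to local storage on the citizen’s device, and any transfer to an external server is gated on a justified case and the citizen’s informed confirmation. The path and function names are illustrative assumptions:

```python
import json
from pathlib import Path
from typing import Callable

# Default rule: records stay in the app's local storage on the
# data subject's own device (the path is purely illustrative).
LOCAL_STORE = Path.home() / ".contact_app" / "contacts.json"

def save_locally(events: list[dict]) -> None:
    LOCAL_STORE.parent.mkdir(parents=True, exist_ok=True)
    LOCAL_STORE.write_text(json.dumps(events, default=str))

def report_infection(events: list[dict],
                     citizen_consented: bool,
                     upload: Callable[[list[dict]], None]) -> None:
    """Data leaves the device only in a justified case (e.g. an identified
    infection incident) and only after the citizen has been informed and
    has agreed."""
    if not citizen_consented:
        return  # no silent transfers to government servers
    upload(events)  # transport to the health services, supplied by the app
```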

4. Security, encryption and anonymization of data

Personal data should be encrypted and anonymized (or, where that is not possible, pseudonymized), and applications using it should maintain the highest security standards and be subject to appropriate audits. In the fight against the epidemic, the data of millions of citizens risks being processed by systems built under time pressure. This is a high-risk situation also in terms of cybersecurity: any leak would have disastrous consequences for public confidence in the technologies implemented by the state.
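
One common pseudonymization technique this pillar could rely on is a keyed hash with a rotating key – a sketch under that assumption, not a description of any deployed app:

```python
import hashlib
import hmac
import secrets

def pseudonymize(device_id: str, period_key: bytes) -> str:
    """Keyed hash: stored records never contain the raw identifier,
    and rotating the key each period limits long-term linkability."""
    return hmac.new(period_key, device_id.encode(), hashlib.sha256).hexdigest()

# A fresh random key per rotation period; without the key, the
# pseudonym cannot be linked back to the device.
period_key = secrets.token_bytes(32)
print(pseudonymize("device-1234", period_key))
```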

5. Clear information for citizens

The basis for public confidence in technological tools to fight a pandemic is clear and understandable information for the people whose data will be used. Citizens should understand how the proposed tools will work, what data will be collected about them, and who will use the data and how.

6. Open code and transparency of algorithms

The standard for pandemic tools should be full openness of the code and of the algorithms used for data analysis (e.g. the parameters taken into account in determining the risk of infection from a social contact must be open). Only such a standard will enable public control over technological tools and at the same time translate into their safety. Open code allows errors and potential privacy threats to be identified more quickly, which further strengthens trust in a given application.
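
To make the demand concrete: in an open implementation, the scoring parameters themselves sit in the published code, where anyone can inspect and question them. The values below are invented purely for illustration:

```python
# Openly published parameters of a hypothetical risk model.
MIN_DURATION_S = 15 * 60   # contacts shorter than 15 minutes score zero
MAX_DISTANCE_M = 2.0       # contacts farther apart than 2 m score zero
RISK_PER_MINUTE = 0.01     # risk added per minute of close contact

def infection_risk(duration_s: int, distance_m: float) -> float:
    """Score a single social contact on a 0..1 scale."""
    if duration_s < MIN_DURATION_S or distance_m > MAX_DISTANCE_M:
        return 0.0
    return min(1.0, (duration_s / 60) * RISK_PER_MINUTE)
```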

7. Public control over tools

Every tool that the state intends to use in the fight against a pandemic should first be verified against cybersecurity and personal data protection standards. For solutions whose code is not available for public review, it is up to the state to demonstrate that data security and privacy standards are met. When such a solution is to be used to collect citizens’ personal data, the administration must also have full control over the system.

Signed (as of publication date 6.04.2020; the full list is on the Panoptykon website at this address – signatures are encouraged at fundacja@panoptykon.org):

Dominik Batorski, Uniwersytet Warszawski

Edwin Bendyk

Piotr Beńke

dr hab. Przemysław Biecek, Politechnika Warszawska

Grzegorz Borowski, Infoshare

Maciej Broniarz, Centrum Nauk Sądowych UW

Maciek Budzich, Mediafun

Dominika Bychawska-Siniarska, Helsińska Fundacja Praw Człowieka

Jakub Chabik, Politechnika Gdańska

Łukasz Chmielniak

Mateusz Chrobok

Szymon Ciupa, smartcity-expert.eu

dr Aleksandra Gliszczyńska-Grabias, INP PAN

Krzysztof Głomb, Miasta w Internecie

dr hab. Agnieszka Grzelak, ALK

Adam Haertle, ZaufanaTrzeciaStrona

Natalia Hatalska, infuture.institute

dr inż. Wacław Iszkowski

Krzysztof Izdebski, Fundacja ePaństwo

Łukasz Jachowicz, Internet Society Poland

prof. Dariusz Jemielniak, MINDS, Akademia Leona Koźmińskiego

dr Maciej Kawecki, Instytut Polska Przyszłości im. Stanisława Lema

Filip Kłębczyk, Prezes Polskiej Grupy Użytkowników Linuxa

Wojciech Klicki, Fundacja Panoptykon

Rafał Kłoczko, CD Projekt Red

Michał Kluska

Piotr Konieczny, Niebezpiecznik

dr Michał Kosiński, Stanford University

Nikodem Krajewski

Artur Krawczyk, Stowarzyszenie „Miasta w Internecie”

Robert Kroplewski, Law Office

Eliza Kruczkowska

Artur Kurasiński

Jakub Lipiński, przedsiębiorca

dr Paweł Litwiński, Instytut Prawa Nowych Technologii i Ochrony Danych Osobowych Uczelni Łazarskiego

Artur Marek Maciąg, Inicjatywa Kultury Bezpieczeństwa

Daniel Macyszyn, Fundacja ePaństwo

Mirosław Maj, Fundacja Bezpieczna Cyberprzestrzeń

Lidka Makowska, Kongres Ruchów Miejskich Gdańsk

Marcin Maruta

dr Arwid Mednis, Wydział Prawa i Administracji Uniwersytetu Warszawskiego

Łukasz Mikuła

Natalia Mileszyk, Fundacja Centrum Cyfrowe

Michał Olczak

Igor Ostrowski

dr hab. Marcin Paprzycki, IEEE Computer Society

Bartosz Paszcza, Klub Jagielloński

dr hab. Aleksandra Przegalińska, Akademia Leona Koźmińskiego

Danuta Przywara, Helsińska Fundacja Praw Człowieka

dr Agnieszka Pugacewicz, DELab UW

Małgorzata Ratajska-Grandin, OVHcloud

prof. Wojciech Sadurski, University of Sydney

dr Marlena Sakowska-Baryła, Stowarzyszenie Praktyków Ochrony Danych

Wojciech Sańko, Koduj dla Polski

Wiktor Schmidt, Netguru

Aneta Sieradzka, Sieradzka & Partners Kancelaria Prawna

Andrzej Skworz, Press

dr hab. Katarzyna Śledziewska, DELab, Wydział Nauk Ekonomicznych Uniwersytetu Warszawskiego

dr Anna Śledzińska-Simon, WPAiE Uniwersytetu Wrocławskiego

Kamil Śliwowski, Centralny Dom Technologii

prof. Arkadiusz Sobczyk, Uniwersytet Jagielloński

dr hab. Dariusz Szostek

dr Sebastian Szymański

Katarzyna Szymielewicz, Fundacja Panoptykon

dr Alek Tarkowski, Fundacja Centrum Cyfrowe

dr Krzysztof Wygoda, Wydział Prawa, Administracji i Ekonomii, Uniwersytet Wrocławski

prof. Mirosław Wyrzykowski

Arkadiusz Złotnicki, Stowarzyszenie „Miasta w Internecie”

Jan Zygmuntowski, Instrat
