B. Technology in service of democracy and fundamental rights

Privacy and the protection of personal data are essential for our freedom and security. If governments or companies infringe too deeply upon our privacy, we are prevented from thinking freely, speaking freely, and exchanging ideas freely. This leads to conformism. An all-seeing government stifles diversity and creativity in society. If companies know too much about us, we are exposed to the risk of having our opinions and preferences manipulated. Privacy is not only an individual right, but also a common good.

Smart cities are only really smart when they handle personal data carefully. They need to have a good reason to collect and process personal data and they must be able to explain this. This follows from the European Union's General Data Protection Regulation (GDPR) and its underlying principles: lawfulness, fairness, transparency, purpose limitation, data minimisation,[1] accuracy, storage limitation, integrity, confidentiality, and accountability. Cities must demand the same from the companies they work with. Contracts with companies partnering with the smart city must be public, especially in connection with tasks in which personal information is collected and processed. The transparency principle of the GDPR is at stake here.

The supervisory role of municipalities does not have to be limited to their own organisations and the companies they contract. They can make agreements about privacy and data protection with all companies and institutions operating within the municipal boundaries.[2] Rules that apply to everyone can be laid down in local regulations, for example rules concerning the use of sensors in public space.

Some companies treat personal data as merchandise. However, rewarding people for their data forces on them an improper choice between economic gain and the preservation of their privacy. Trade in personal data undermines privacy as a common good and leads to a society in which the rich have more privacy than the poor. Municipalities should not provide support to companies that purchase or resell personal data, whether they are start-ups or tech giants.

Even when governments legally collect and process personal information in the performance of official tasks, they should seek opportunities to give citizens as much control as possible over their personal data, for example by offering a privacy-friendly alternative in situations where showing a passport, identity card, or driving licence is currently required.

The open source app IRMA (I Reveal My Attributes) enables citizens to reveal properties (attributes) of themselves without disclosing personal information that is not relevant in the situation at hand. Thus, citizens can fill out municipal web forms without having to enter their official digital identity code; IRMA allows them to prove that they are residents of the municipality. At the door of a nightclub, ‘over 18’ and a digital passport photo are the only personal attributes needed to get in. These are the only data the bouncer gets to see upon scanning the QR code on the mobile phone of youngsters who have the IRMA app.
The more companies and governments facilitate the use of IRMA, the less often people need to hand over their name, address, passport number, or national identification number. That enhances their privacy and reduces the risk of identity fraud.[3]
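
The selective-disclosure idea behind IRMA can be illustrated with a small sketch. The Python snippet below is a simplified illustration under assumed names (Credential, disclose, the wallet contents); it is not IRMA's actual API, and in the real system the hiding of attributes is enforced cryptographically, not by convention:

# Simplified sketch of attribute-based selective disclosure.
# This is not the real IRMA protocol; all names here are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    """Attributes certified by an issuer, e.g. a municipality."""
    attributes: dict

def disclose(credential, requested):
    """Reveal only the requested attributes; all others stay hidden."""
    missing = requested - credential.attributes.keys()
    if missing:
        raise ValueError(f"credential lacks attributes: {missing}")
    return {name: credential.attributes[name] for name in requested}

# A citizen's wallet holds far more than any single verifier needs.
wallet = Credential(attributes={
    "over18": True,
    "photo": "<digital passport photo>",
    "resident_of": "Utrecht",
    "national_id": "*********",  # never revealed unless explicitly requested
})

# The bouncer's scanner requests only age and photo ...
print(disclose(wallet, {"over18", "photo"}))
# ... while a municipal web form requests only proof of residency.
print(disclose(wallet, {"resident_of"}))

In IRMA itself this guarantee does not depend on the goodwill of the verifier: the verifier can check the issuer's signature on the disclosed attributes without learning any of the hidden ones.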

Initiatives such as IRMA show that governments can use technology to give citizens more control over their data. However, governments also use technology to gain more control over citizens. When it comes to combating benefit fraud, the principle of purpose limitation – personal data may be used only for the purpose for which it was provided or for a compatible purpose – has become virtually meaningless. Governments feed algorithms with a wide range of personal data, from dog ownership to holiday destinations, in order to assign secret risk profiles to benefit recipients.[4] Those who are profiled as high-risk become prima facie suspects, to be subjected to investigation.

A society in which your socio-economic status determines the extent to which you are entitled to privacy and data protection is guilty of class injustice. Discrimination lurks every time people are given a risk profile that is based not on their individual behaviour but on group characteristics. Moreover, if every contact you have with a government produces data that might be repurposed to assign a risk profile to you, public trust in government erodes. Support for useful technological innovations may also crumble if citizens find out that their data is being used improperly: "Your waste card tells us that you produce a lot of waste. We are here to check whether you are in fact entitled to a single person's allowance." That is why national and local politicians must prevent the data dragnet for profiling from being cast too widely. Select before you collect: governments need to demonstrate the necessity and proportionality of the use of each category of personal data, especially when it concerns special categories of personal data, such as data about health.

Footnotes

Further viewing

Presentation: Lina Dencik (Data Justice Lab), Citizen scoring in the United Kingdom (YouTube)
Presentation: Ronald Huissen (Bij Voorbaat Verdacht), Citizen scoring - the SyRI court case (YouTube)
Video: Radboud University, IRMA: back in control of your personal data (YouTube; for up-to-date information, see the Privacy by Design Foundation)

Further reading

Dossier: United Nations Human Rights, Landmark ruling by Dutch court stops government attempts to spy on the poor


This project is organised by the Green European Foundation with the support of Wetenschappelijk Bureau GroenLinks (NL), Green Economics Institute (UK), Institute for Active Citizenship (CZ), Etopia (BE), Cooperation and Development Network Eastern Europe and with the financial support of the European Parliament to the Green European Foundation.
