Pandemic for digital rights in South East Europe

Pandemic for Digital Rights – Report

The global public health crisis brought on by the Covid-19 pandemic confirmed that the decades-long discussion on striking a better balance between the interests of safety and privacy still hasn’t provided the world with a better framework. Concentration of information, censorship, fake news, security breaches and government officials’ responses to these violations were some of the most notable takeaways from the report.

Since 2014, SHARE Foundation has been running the Digital Rights Monitoring Project in an effort to sample violations and assess overall conditions in the online sphere of Serbia. Last year, the project was expanded in cooperation with the Balkan Investigative Reporting Network (BIRN) to include monitoring of incidents in Bosnia and Herzegovina, Croatia, Hungary, Kosovo, Montenegro, North Macedonia and Romania. Given the current global situation, the first joint report coincided with the Covid-19 pandemic, and this led to the uncovering of some worrisome events and trends in the region.

The report presents an overview of the main violations of citizens’ digital rights in each country in the period from 31 January to 30 September 2020. Following the analysis, a list of recommendations for authorities is proposed in an attempt to curb gross digital rights violations in future situations of social crisis.



Read more:


SHARE: Complaints against 16 global tech companies

On Thursday, October 1, SHARE Foundation’s legal team filed misdemeanor complaints against 16 global tech corporations following their failure to appoint representatives in Serbia for more than a year, as is mandated by the Law on Personal Data Protection.

Companies that own platforms providing various services, such as Facebook, Twitter, Amazon, Netflix or Airbnb, process huge amounts of Serbian citizens’ personal data, yet citizens are unable to directly exercise their rights and are instead left to automated communication with bots.

Last fall, when the new data protection law came into force, SHARE Foundation launched a campaign to inform citizens about their rights and companies about their obligations. The first misdemeanor charges were filed against Google and Facebook. After a long correspondence with the Serbian Data Protection Commissioner, Google LLC became the first, and so far the only, tech giant to appoint a representative in Serbia. Mark Zuckerberg’s corporation never answered the Commissioner.

Thanks to Google’s representative in Serbia, we were recently able to record the first domestic case of a successfully exercised right to be forgotten. Among the “smaller” global corporations, the owner of the commercial flight search service eSky appointed a representative in Serbia, while the Dutch owner of the Serbian-language platform KupujemProdajem already had one.

The misdemeanor complaints were filed with the Commissioner for Personal Data Protection, who is authorised to initiate an inspection procedure and, in case of a violation of the law, impose fines of 100,000 Serbian dinars (RSD) on a company and 20,000 RSD on its director.

Unlike under the European General Data Protection Regulation, on which the Serbian law was modelled, the fines in Serbia are symbolic, especially for global companies that make unimaginable profits off the data of citizens around the world. However, we believe that imposing them would show that the competent authorities of the Republic of Serbia apply the law protecting citizens when companies do not operate in accordance with domestic regulations.

Misdemeanor complaint (.pdf)



Read more:

SEE Digital Rights Network established

Facing a rise in digital rights violations, more than a dozen rights organisations have agreed to work together to protect individuals and societies in Southeast Europe.

Nineteen organisations from Southeast Europe have joined forces in a newly-established network that aims to advance the protection of digital rights and address the growing challenges posed by the widespread use of advanced technologies in society.

Initiated by the Balkan Investigative Reporting Network (BIRN) and SHARE Foundation, the SEE Digital Rights Network is the first network of its kind focused on the digital environment and the challenges to digital rights in Southeast Europe.

The network brings together 19 member organisations – from Bosnia and Herzegovina, Croatia, Greece, Kosovo, Montenegro, North Macedonia and Serbia – dedicated to the protection and promotion of human rights, both online and offline.

Each is committed to advancing their work on issues of digital rights abuses, lack of transparency, expanded use of invasive tech solutions and breaches of privacy.

Since the onset of the COVID-19 pandemic, Central and Southeast Europe has seen a dramatic rise in the rate of digital rights violations, in countries where democratic values are already imperiled.

“This endeavour comes at a moment when we are seeing greater interference by state and commercial actors that contribute to the already shrinking space for debate while the exercise of basic human rights is continuously being limited,” said BIRN regional director Marija Ristic.

“The Internet has strong potential to serve the needs of the people and internet access has proved to be indispensable in times of crisis such as the COVID-19 pandemic. Our societies are becoming more digital, which presents a powerful incentive to increase the capacity of organisations dealing with digital developments and regulations in our region.”

During their first joint meeting, the members of the network agreed that the challenges posed by the fast-evolving tech solutions used by states have led to infringements of basic rights and freedoms, while false and unverified information flourishes online and shapes the lives of people around the region. The online sphere has already become a hostile environment for outspoken individuals and especially for marginalised groups such as minorities, the LGBTIQ+ community, refugees and women.

“Digital technology is profoundly changing our societies as it becomes an important part of all spheres of our lives, so we see the diversity of organisations that joined this network as one of its biggest strengths,” said Danilo Krivokapic, director of the SHARE Foundation.

“We can learn so much from each other’s experience, as we have similar problems with governments using technology to exert control over society, especially in times of crisis such as the COVID-19 pandemic,” he said. “It is also important that we act together when we are trying to restore the balance between our citizens and big companies (Facebook, Google etc) that hold enormous amounts of our personal data and through this exert significant power over us.”

The network’s aim is to build on the skills, knowledge and experience of its members to achieve common goals such as strengthening democracy in the region and protecting individuals in the digital environment.

While cherishing the values of safety, equality and freedom, the work of the SEE Digital Rights Network will be directed at achieving the following goals: protecting digital rights and internet freedoms; enabling people to access accurate information; making the internet a safer place; detecting and reporting hate speech and verbal violence online, especially against women and other vulnerable groups; identifying online recruitment that can lead to exploitation; helping people take control of their personal data; working to prevent the implementation of intrusive surveillance systems; holding governments accountable for the use and abuse of technology; and improving digital literacy in order to prevent violence and exploitation.

The network will aim to increase the level of understanding of complex and worrying trends and practices, trying to bring them closer to the general public in a language it can understand. By creating a common space for discussion and exchange, organisations and the media will be able to increase the impact of their individual efforts directed towards legislative, political and social changes.

The organisations that have joined the network are as follows (updated in January 2024):


Read more:

Pandemic politics in the Western Balkans

In response to the COVID-19 pandemic, countries around the world have introduced various legal measures and technological solutions that raise particular concerns for the respect of human rights during this global public health crisis. In such circumstances, privacy and personal data protection were among the first victims, and other rights, such as freedom of expression and information, soon followed. The Western Balkans are no exception: during the pandemic, there were many cases of violations of digital rights and freedoms, which threatened to further erode the overall human rights situation, riding on public fear of a major health crisis.

The policy paper “State of pandemonium: Digital rights in the Western Balkans and COVID-19”, authored by Danilo Krivokapić, Bojan Perkov and Davor Marko, aims to show how major crises affect basic democratic achievements and to stress that the pandemic must not be used, under any circumstances, to irreversibly lower human rights standards, especially not through intrusive technologies. The authors find that the Western Balkans already face many problems regarding digital rights and freedoms, especially when it comes to the privacy and security of personal data, disinformation and attacks on journalists, and that these only worsened during the COVID-19 pandemic.

The authors point out that adequate implementation of policies and regulations in the fields of data protection and information security, the development of a favourable environment for the unhindered work of journalists and the media, and improved digital literacy and digital competencies among citizens are the steps of key importance for the future.

Read more:

Covid-19 apps: Opening the new Pandora’s Box

The Covid-19 epidemic was not only a test of public health systems around the world; it was also a test of policy makers in the field of the information society and personal data. The test was simple: “Did we learn anything from our previous mistakes, when we tried to choose between personal safety and personal privacy, or are we going to make the same mistake once again?”

The short answer: “Here we go again!”

The Covid-19 apps were one of the key features of the coronavirus first wave. “There’s an app for that!” exclaimed the industry as different countries rushed to implement mobile apps as the last line of defence against the globally spreading virus. But despite the absence of any proof that the apps actually stop the coronavirus from spreading, or that they are privacy-aware, and with mounting evidence to the contrary, several governments went ahead with mobile apps that claimed to do the impossible: protect the people, limit exposure to the virus and stop it from spreading.

Slovenia was not one of those countries. In the first wave of the coronavirus, we were too busy installing a right-wing government and enabling it to pull off a multi-million euro heist of public money, through a strange chain of business transactions related to the public procurement of surgical masks, to focus on enabling a digital surveillance state. We reserved that for the second wave.

Despite multiple calls from the Information Commissioner, legal scholars, privacy NGOs and concerned individuals, the government bundled the legal framework for an obligatory coronavirus app into a massive law that dealt with everything from unemployment subsidies to government compensation for businesses affected by the coronavirus. The law made it virtually impossible to reject the obligatory coronavirus tracking app without also cancelling the government’s coronavirus aid for the economy and citizens.

The debate in parliament showed the lengths to which the right-wing coalition was willing to go to justify the coronavirus tracking app, which was made obligatory for every infected person and for every person ordered into quarantine by the government.

All the arguments were cast aside: that coronavirus apps had proved ineffective, based on reports from Germany, Iceland, Israel and other countries that had already implemented such technocratic solutions to a social issue; that the digital divide would hamper the effort, since not everybody has the necessary equipment or the digital skills to use the app; and that the app would create a false sense of security.

At the same time, the coalition used several fallacious arguments to convince the public, and themselves, that we absolutely need an app for coronavirus tracking. Among them was the evergreen yet totally fallacious claim that our personal data is already being used by Facebook, Google and other internet giants, so there is absolutely no reason for the government not to have the same privilege. The proponents of the coronavirus law also claimed the app would save thousands of lives (another classic yet fallacious argument, treating privacy as the opposite of security) and even went so far as to invent a completely bogus claim that the Slovenian police force had surveilled individual parliamentarians, in some cases right before midnight mass (a claim debunked as a complete lie AFTER the vote had already passed), to show that privacy is dead, everybody is spying on everybody, and privacy-aware activists and the public have no right to oppose the legalisation of the app on those principles.

The reaction of the general public, already dissatisfied with the right-wing government that had used the first wave of the coronavirus to misappropriate public funds for the purchase of protective equipment, was severe. The day after the vote, the Minister of Public Administration explained that the provisions of the law making the app obligatory would not be used in practice; the coronavirus tracking app would therefore be completely voluntary. The government, however, the Minister went on, reserved the right to use the obligatory provision “if deemed necessary”.

One question remained: which app would the government use? The public call for app development offers came on a Sunday, with a deadline of only three days to submit applications, spurring assumptions that the government already had an app developed and needed the law to implement it. Ultimately, it decided to base the Slovenian corona tracing app on the well-established German open source Corona-Warn-App, translated into Slovenian and adjusted for use by the Slovenian authorities. The cost of the project? 4,000 euros.

But if the app was completely voluntary, why bother legalising it in the first place? So far, we don’t know. All we know is that the government possesses a very strong tool to financially punish, and even imprison, people who are infected with the coronavirus or who violate quarantine and do not use the prescribed mobile app.

The sad part is that that is not the biggest problem. The problem that apparently does not want to go away is that we are being led by public representatives convinced that digital technology is the answer to social problems, that the surveillance state is justified with lies and fearmongering, and that politicians ignore epidemiologists, privacy experts and the general public in order to push a repressive legal framework that is not even needed to achieve the stated goal. Alas, there is no app for that.



Domen Savič is the director of the Citizen D NGO, based in Ljubljana. His work focuses on the protection of human rights in the information society, analysis of the media landscape, development of media literacy programmes and the encouragement of active citizenship.

Read more:

SHARE Interview: Ella Jakubowska on biometric mass surveillance

Ella Jakubowska from European Digital Rights (EDRi) spoke to SHARE Foundation’s Filip Milošević and shared her thoughts on the dangers of biometric mass surveillance for human rights and freedoms.


Filip: So you can just tell us who you are and what you do.

Ella: My name is Ella Jakubowska and I am a policy and campaigns officer at European Digital Rights, EDRi.

Filip: Thanks for being with us, Ella. Recently in Serbia we have started facing the prospect of mass surveillance: lots of cameras are being installed around the city. In your paper, you’ve done a good job of defining some of the core problems, and there are several of them. So maybe you could say a few sentences about them and explain the core problems people should understand when it comes to mass surveillance and facial recognition technology.

Why biometric mass surveillance should be banned


Ella: We see so many risks and threats posed by this growing and uncontrolled use of facial recognition and other biometric processing that it’s almost hard to know where to start. Because, when we think about the type of society that we want to create and the world that we want to live in, it feels like biometric mass surveillance is the complete antithesis to a world that we want if we’re thinking about that in terms of our rights and our freedoms as citizens.

Citizens’ rights and freedoms


Ella: When we start having systems installed throughout our public spaces which we have rights to enjoy, to use as places to express ourselves, to protest, to hold power to account suddenly the whole environment changes if there are cameras trained on us all of the time. It’s being done across Europe and across the world in currently such an unaccountable untransparent way. We know that different cameras are being equipped to process our sensitive data in a really wide range of ways. For example, if that data is combined with other data the power relations that structure our society are changing. What I mean by that is – if we’ve got these cameras being installed throughout our cities and there is a blurring of who is responsible for developing them, for setting the parameters that they might pick up people based upon, and then storing, processing that data and matching it with other data sources – suddenly we no longer know who is capturing that data about us we don’t know which private actors might be involved in that and what sort of influence they might have over our governments and the people whose role it is to keep us safe. And, really, throughout the biometric industry you see a lot of private actors whose motivation is to make money and so when they’re taking sensitive things like our faces and our bodies there are just so many ways that that can be used against us. If we don’t know what’s happening, if we’ve not got clear evidence that these systems have been introduced in ways that are safeguarded and with protections for us as citizens and individuals which right now we’re really not seeing then it really opens the door for a lot of different shady actors to be watching us and building up patterns of our movements. 
Ella: When we start having systems installed throughout our public spaces – spaces we have the right to enjoy, to use as places to express ourselves, to protest, to hold power to account – suddenly the whole environment changes if there are cameras trained on us all of the time. It’s being done across Europe and across the world in such an unaccountable, untransparent way. We know that different cameras are being equipped to process our sensitive data in a really wide range of ways; for example, if that data is combined with other data, the power relations that structure our society change. What I mean by that is: if we’ve got these cameras being installed throughout our cities, and there is a blurring of who is responsible for developing them, for setting the parameters they might pick people up on, and then for storing and processing that data and matching it with other data sources, suddenly we no longer know who is capturing that data about us, we don’t know which private actors might be involved, and what sort of influence they might have over our governments and the people whose role it is to keep us safe. And, really, throughout the biometric industry you see a lot of private actors whose motivation is to make money, so when they’re taking sensitive things like our faces and our bodies, there are just so many ways that can be used against us. If we don’t know what’s happening, if we’ve not got clear evidence that these systems have been introduced with safeguards and protections for us as citizens and individuals – which right now we’re really not seeing – then it really opens the door for a lot of different shady actors to be watching us and building up patterns of our movements.
If you belong to a community that already gets overpoliced, watched and surveilled to a high degree – that could be people of colour, people from certain religious groups, and also human rights defenders and political dissidents – the idea that both public and private actors can suddenly build up this picture of where you go and who you meet is actually very dangerous, because it can be used to target you even more. So the real problem for us is that those who already have disproportionate amounts of power stand to gain more and more, while those already in positions of powerlessness will be made even more powerless by this dynamic of who gets to watch and who is watched. And this is really frightening, because it means these private actors, who want to make money from our faces and our bodies and from watching us, will have a really high level of control and influence over our governments, and may have more technical knowledge than our governments… It’s a really complex web of different actors who are gaining power, but the ones who lose power are the citizens. We no longer have control and freedom in our public spaces; we lose our ability to be anonymous in public, which is really fundamental to our ability to be involved in democracy and to express ourselves freely.

Public space


Ella: If we have less diversity in the people who represent us and in the voices heard in our communities, and all we hear are the rich, the powerful and the highly educated who have knowledge of these systems, that’s really not the kind of world we want to create or the kind of society that would benefit the vast majority of people. The changes in people’s behaviour when they’re constantly watched are well substantiated, and if you extrapolate that to a societal level and think about how we might all change our behaviour if we know we’re being watched… It doesn’t mean we were doing anything wrong, but it means we’ll suddenly become very aware of what we’re doing. That’s where these things start having a chilling effect: if we’re all suddenly aware that there are cameras trained on us all the time, and that things could be used against us, we no longer feel so comfortable expressing how we feel, and we might stop meeting with certain people because of how it looks… We will change how we go about our lives.

Freedom of expression


Ella: There’s a real sense of empowerment in being able to express yourself differently, and suddenly, if you’re forced to conform, this poses a real threat to your identity. It really challenges your sense of dignity, of who you are as a person and who you’re allowed to be in your society, in a way that’s very dangerous. What we’ve concluded in our paper, as EDRi, is that this creates a slippery slope towards authoritarian control. A mass surveillance society that wants to put us all in boxes will dangerously disincentivise people from being individuals, and will instead create societies of suspicion and fear, and a sense that everybody is a potential suspect. Function creep is one of the really big problems we see with these mass surveillance infrastructures, especially when they use our face data or other sensitive data about our bodies, our health and who we are as people. We know for a fact that once these systems and structures are in place, they will be reused in ways they were not initially intended for, and that means safeguards will not have been put in place for these new uses.

Function creep


Ella: So we know that even from an economic point of view it looks good for a government to say: “We already got these systems, we can now do all these great shiny technological things with them. Why would we waste it? Why would we not do more and more?” And from a human rights point of view that’s absolutely terrible, because it’s driven by the technology rather than by the sort of societies we want to create, by thoughts of how we protect people and create vibrant democratic communities that everyone can take part in. And these techno-solutionist ideas can often be pushed by private actors. Again, that speaks to who’s really in charge, who’s got the power over our public spaces. And if we don’t know who’s got the power, how can we hold them to account? Linked to that, normalisation is another really big problem we see, because even a use that might be less dangerous from a fundamental rights perspective, like unlocking your personal phone, where you control the data and nothing leaves your device, still creates this sense that our faces are something to be commodified, something we can use in lieu of a password, and actually that’s not the case. Our faces have a really special quality because they’re so linked to our identity; if we start seeing them as interchangeable with a password, we start really undermining that value, and that poses a lot of questions about our dignity and autonomy as human beings. And as we see more and more private actors coming in to try to find ways to monetise the data being collected on us, it means that by becoming comfortable letting our faces be used in all sorts of everyday applications, we’re giving carte blanche to private companies to commodify and objectify our faces and use them to sell us things, to infer things about us, and to predict and make judgements about us which could then be used to control us.

Anonymity


Ella: Once we have all these different systems in place that can track us across time and place, we also have the fact that multiple databases can be introduced that all fit together. So, suddenly, there’s this intimate picture of who we are, not just showing where we go and who we interact with, but linked to our faces and our bodies in a way that means we can never be anonymous, because the moment our face is detected, this picture of us – maybe including how we walk, and therefore what health problems we might have – will be linked up. It could be linked with our criminal records. It could be linked with our personal data, our health data, our online browsing… There is really massive potential for these different databases to be layered on top of each other. And suddenly we’ll have systems, and in some cases we already do, that know vast amounts about us and can so easily be used against us. Once you’re being tracked across time and place, and especially once various databases are brought in, in very opaque ways, there is suddenly the possibility of very authoritarian methods of social control. China is a very good example. It has been quite widely reported over the last few years that it has introduced a social credit score linked to people’s identity. People are then either rewarded or punished for doing things like buying alcohol, or for interacting with family as opposed to interacting with known political dissidents. And their scores are used to control access to their fundamental rights: are they allowed to leave the country? Can they get car insurance? So, suddenly, it’s not just that your life is being watched; it’s that your life is being analysed, and someone far away is making a judgement about whether what you’re doing is in line with their vision of control.
And often it’s not a “someone”; it’s an algorithm, which adds a whole other layer of opaqueness and a whole lot of danger in how biases can be embedded in technology. Any society that looks to stratify people based on how they look, their health, their data and things about them is an incredibly authoritarian and sinister society. The societies throughout history that have tried to separate and stratify people based on data about them are exactly the sort of authoritarian societies we want to stay as far away from as possible. We think that if governments are really going to listen, it needs pressure from all parts of society. It needs people holding power to account, calling out surveillance when they see it, and contributing to the civil society organisations and activists who are trying to reveal these secretive rollouts and to make sure this is a matter for public debate and for all of us to decide – not for private companies who want to make money out of us, not for police forces who want to save money and cut corners. This is for our societies and communities, and it needs to be something we all work on together.

SHARE has brought Google to Serbia

Any requests and objections that citizens of Serbia may have regarding their personal data processed by Google can now be resolved through the company’s representative in Serbia. Google, one of the first tech giants to comply with the new Serbian law, wrote a letter to the Commissioner for Information of Public Importance and Personal Data Protection, i.e. Serbia’s Data Protection Authority, on 21 May 2020, stating that its representative would be “BDK Advokati” from Belgrade.

YouTube, Chrome, Android, Gmail, Maps and many other digital products without which the internet is unimaginable are an important segment of an industry that relies entirely on the processing of personal data. With significant delay and numerous difficulties, states have begun to bring some order to this field, which directly affects basic human rights. The European Union set the standard by adopting the General Data Protection Regulation (GDPR), and the new Law on Personal Data Protection in Serbia, in application since August 2019, followed this model.

Although they have been operating in Serbia for a long time, global tech corporations treat most developing countries as territories for the unregulated exploitation of citizens’ data. At the end of May 2019, three months before the new Law on Personal Data Protection began to apply, SHARE Foundation informed the 20 biggest tech companies from around the world about their obligations towards the citizens of Serbia whose data they process.

Twitter responded by saying that they were working on it. A global platform for booking airline tickets, eSky, contacted us and appointed its representative in Serbia. In December 2019, we filed misdemeanor charges with the Commissioner.



Read more:

hiljade.kamera.rs: community strikes back against mass surveillance

Serbian citizens have launched the website hiljade.kamera.rs as a response to the deployment of state-of-the-art facial recognition surveillance technology in the streets of Belgrade. Information regarding these new cameras has been shrouded in secrecy, as the public was kept in the dark on all the most important aspects of this state-led project.

War, especially in the past hundred years, has propelled the development of exceptional technology. After the Great War came the radio, decades after the Second World War brought us McLuhan’s “global village” and Moore’s law on historic trends. Warfare itself has changed too – from muddy trenches and mustard gas to drone strikes and malware. Some countries, more than others, have frequently been used as testing grounds for different kinds of battle.

Well into the 21st century, Serbia still does not have a strong privacy culture, which has been left in the shadows of past regimes and widespread surveillance. Even today, the direct access of police and security agencies to communications metadata stored by mobile and internet operators makes mass surveillance possible.

As appearances matter most, control over the flow of information is a key component of power in the age of populism. We have recently seen various developments in this context, such as Twitter shutting down around 8,500 troll accounts pumping out support for the ruling Serbian Progressive Party and its leader, the country’s President Aleksandar Vucic. These trolls are also frequently used to attack political opponents and journalists who expose the shady dealings of high-ranking public officials. Reporters Without Borders and Freedom House have both noted a deterioration in press freedom and democracy in the Balkan country.

However, a new threat to human rights and freedoms in Serbia has emerged. In early 2019, the Minister of Interior and the Police Director announced that Belgrade would receive “a thousand” smart surveillance cameras with face and license plate recognition capabilities, supplied by the Chinese tech giant Huawei. The governments of Serbia and China have been working on “technical and economic cooperation” since 2009, when they signed their first bilateral agreement. Several years later, a strategic partnership was forged between Serbia’s Ministry of Interior and Huawei, paving the way for the implementation of the project “Safe Society in Serbia”. Over the past several months, new cameras have been widely installed throughout Belgrade.

This highly intrusive system has raised questions among citizens and human rights organisations, who point to Serbia’s troubled history with surveillance cameras. Sometimes these devices have conveniently worked and their footage has somehow leaked to the public; in other cases they have not worked, or recordings of key situations have gone missing, just as conveniently. Even though the Ministry was obliged by law to conduct a Data Protection Impact Assessment (DPIA) of the new smart surveillance system, it failed to fulfil the legal requirements, as civil society organisations and the Commissioner for Personal Data Protection have warned.

The use of such technology to constantly surveil the movements of all citizens, treating everyone as a potential criminal, runs counter to the fundamental principles of necessity and proportionality required by domestic and international data protection standards. In circumstances with no public debate or transparency whatsoever, the only remaining option is a social response, as reflected in a newly launched website.

“Hiljade kamera” (“Thousands of Cameras”) is a platform started by a community of individuals and organisations who advocate for the responsible use of surveillance technology. Their goals are citizen-led transparency and to hold officials accountable for their actions, by mapping cameras and speaking out about this topic to the public. The community has recently started tweeting out photos of cameras in Belgrade alongside the hashtag #hiljadekamera and encouraged others to do so as well.

The Interior Ministry has yet to publish a reworked and compliant DPIA, but the installation of cameras continues under dubious legal circumstances.



Bojan Perkov is a Policy Researcher at the SHARE Foundation. His interests and areas of work include freedom of expression and online media, as well as all other issues related to online expression such as hate speech, net neutrality, censorship, etc. Twitter: @Bojan_Perkov.

Read more:

The right to privacy in the time of coronavirus: freedom’s last line of defence?

Dr Mihajlo Popesku, Head of Research, Auspex International
Catalina Bodrug, Research Scientist, Auspex International


Earlier this month, Auspex conducted two large-scale online surveys1 in the UK and Italy, focusing on residents’ behavioural and emotional responses to the Covid-19 pandemic and resulting lockdown, with samples of 2,001 respondents in each country – representative by age, gender, region and socioeconomic class. 

As part of our analysis, we were able to identify various groups or segments in each country, with tendencies to engage in either constructive or destructive behaviours: those who panic and despair, those who remain calm and optimistic, and those who thrive and flourish in isolation. A full infographic report of our findings is provided here. One of the most interesting insights, for SHARE Foundation readers, was that people in both countries strongly reject the idea of data monitoring as a means of tackling the spread of the coronavirus.

We asked both British and Italian respondents to rate the acceptability of eleven actual and potential government interventions. An overwhelming majority of British and Italian residents were prepared to countenance certain measures to contain the epidemic, including closing pubs/restaurants (82% UK, 78% Italy), washing their hands for 20 seconds (84% UK, 73% Italy) and enforced staying at home (both 71%). For Britons, however, the monitoring of personal data ranked as the least acceptable measure, with an approval rating of just 16.7%. In Italy, the situation was not dissimilar, with the measure ranking second to last with an approval rate of 23.2%. What is more, residents of both countries are more likely to favour curfews and remaining in lockdown over having their personal data tracked.

These insights suggest that both Italians and Britons are aware of the importance and sensitivity of data protection rights, and that any attempt to infringe their privacy is generally regarded as the ultimate loss of freedom.

Our next exercise focused on understanding the differences between those people who accept and those who reject the monitoring of personal data. For this purpose, we merged the two samples, to try to find regularities and patterns, regardless of the respondents’ nationality. 

People who accept data monitoring (20%) are members of a more mature segment, with 1 in 3 aged over 65. Their emotional response to the crisis is ambivalent, with a mix of increased anxiety and happiness being most often reported. This is a very alarmed and anxious segment, with 2 in 3 reporting that their country is in a state of emergency. Compared to a month ago, this segment now feels more positive about government authority, wants to spend more time with family, feels more positive toward homeschooling, strongly supports closing borders to foreign visitors, is increasingly interested in social justice and “woke” activism, and feels more patriotic, creative and self-reliant. They believe that the best thing is for the country to remain united, and to get behind the Prime Minister, government and institutions – even if it means taking drastic actions to help tackle the spread of the disease. They see Covid-19 as a very serious situation, in which everyone is at risk. Fighting the coronavirus is a team effort, requiring mass compliance and the use of all means necessary – even if this involves the monitoring of personal data. On average, people in this group are better informed about Covid-19, are more compliant, and have engaged more frequently in constructive behaviours. In the event of institutional collapse/meltdown these people would help others and attempt to repair the damage. Their values are duty and tradition. Their lifestyle centres around travelling, exploration and education.

People who reject data monitoring (80%) make up a younger segment. These individuals report that the pandemic has inspired mostly the worst in them. They are feeling increasingly deflated, bored, stressed or tired as a result of the crisis. They are significantly more concerned with isolation and loneliness. This group is more likely to have experienced negative feelings such as anger or resentment over the government’s interventions, and in the event of civil unrest due to Covid-19, they are more likely to engage in mass protests or to leave the country. They also show significantly less support for and trust in institutions, and are more likely to be lax in complying with official instructions. They seem to be in a vulnerable position, as they were somewhat more likely to be professionally affected by the pandemic. This segment scored significantly higher in neuroticism, which indicates their fragility, sensitivity and irritability. They need to feel safe and secure. People in their social circle are mostly scared of not having enough to live with dignity. This group likes music, history, computer games and lifestyle content. They care about animal and workers’ rights.

Key takeaways:

  • The monitoring of personal data by government or state institutions as a means of tackling the spread of coronavirus is very unpopular, with the vast majority of both Italians and Britons finding it unacceptable.
  • Those who are supportive of their respective government, men, and members of the older generation are more likely to accept the monitoring of their personal data.
  • Those who are most fearful of Covid-19 are also the most likely to accept the monitoring of their personal data.
  • Data privacy appears to be the ‘final line of defence’ of personal freedom, as people are more willing to accept curfews and to be confined to their homes than they are to lose their privacy and have their personal information monitored or tracked – whatever the reason behind it.


1 The study was fully anonymous and no personally identifiable information was collected. Our respondents came from a mix of 70 different access panels in the UK, and 73 panels in Italy, adjusting for the sample coverage and sampling frame bias.


Dr Mihajlo Popesku, Head of Research at Auspex International in London, is a marketing scientist working on applied social research and statistical modeling of consumer/voter behaviour.

Catalina Bodrug, Research Scientist at Auspex International in London, works on research design and statistical data analysis. She graduated in Economics and Business at UCL.

Read more:

A Password Pandemic. How did a COVID-19 password end up online?

The username and password to access the Covid-19 Information System were publicly available on a health institution’s web page for eight days – long enough for the page to be indexed by Google, so that, although invisible on the web page itself, the credentials were accessible through a simple search. After discovering the matter on the 17th of April, we immediately informed the competent authorities.

Screenshot of the webpage with login credentials for the Covid-19 Information System

The Covid-19 Information System is centralised software for collecting, analysing and storing data on all persons monitored for the purpose of controlling and suppressing the pandemic in Serbia.


How did we get this data?


Along with the state of emergency, the Government of Serbia introduced numerous measures to tackle the pandemic, including the collection and processing of personal data under unprecedented circumstances. The Government informed citizens about these measures through unclear and vague conclusions, none of which specified who was supposed to process citizens’ data and how.

In an effort to understand the data flow and its implications for citizens’ rights, we explored the new normative framework through publicly available sources. While searching keywords on Google, we accidentally discovered the page containing access information for the Covid-19 Information System. The data had been published on the 9th of April.
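How a page can leak credentials that are “invisible” to visitors is easy to illustrate: search crawlers index the raw HTML source, not the rendered page, so text hidden with CSS is still discoverable through search. The following is a minimal, hypothetical Python sketch – the sample page and the `find_credential_leaks` helper are our own constructions for illustration, not the actual institution’s page or any real scanning tool:

```python
import re

# Hypothetical sample page: the credentials are hidden from view
# (e.g. via CSS "display:none"), but still present in the raw HTML
# source that search engine crawlers fetch and index.
SAMPLE_HTML = """
<html><body>
  <h1>Institution notices</h1>
  <div style="display:none">
    korisnik: covid19user
    lozinka: Tajna123
  </div>
</body></html>
"""

# Illustrative patterns for credential-like labels (Serbian/English);
# purely a sketch, not an exhaustive leak scanner.
CREDENTIAL_RE = re.compile(
    r"(?:korisnik|username)\s*[:=]\s*(\S+)"
    r"|(?:lozinka|password)\s*[:=]\s*(\S+)",
    re.IGNORECASE,
)

def find_credential_leaks(html: str) -> list[str]:
    """Return credential-like tokens found anywhere in the HTML source,
    visible or not -- crawlers do not apply CSS before indexing."""
    return [m.group(1) or m.group(2) for m in CREDENTIAL_RE.finditer(html)]

print(find_credential_leaks(SAMPLE_HTML))  # -> ['covid19user', 'Tajna123']
```

The point of the sketch is that hiding text from human visitors offers no protection at all once a crawler has stored the page’s source.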

In addition, we also managed to obtain manuals with instructions for navigating the centralised system webpage.


Which data was at risk?


Under the Government’s Conclusion establishing the Covid-19 Information System, a significant number of health institutions are required to use this software to keep records on cured, deceased and tested persons (whether positive or negative), as well as on persons currently being treated, in self-isolation or placed in temporary hospitals, including their location data. The system also contains data on persons who are possible disease-carriers due to their contact with infected persons. The institutions are required to update the data daily, as it forms the basis of the official report read out at 3 pm each day.

While attempting to clarify how our data is stored, we could not have imagined that we would discover the access password and thus be able to enter the system – just like anyone else who may have found this webpage. It was immediately clear to us that citizens’ most sensitive data were endangered and that the integrity of a system crucial to the fight against the pandemic could not be guaranteed.

We did not log into the system, which would in any case record such an attempt. Instead, we reported the case to the competent authorities: the Commissioner for Information of Public Importance and Personal Data Protection, the National CERT and the Ministry of Trade, Tourism and Telecommunications. Aware of the risk of misuse arising from the accessibility of citizens’ sensitive data, we decided to notify the public of the incident only after making sure that the authorities had prevented unauthorised access to the system.

Report of the breach sent to competent authorities by email


How did the competent bodies react?


Less than an hour after our report, we were informed that initial steps had been taken in response to the incident, making sure that the web page containing the username and password was no longer publicly available.

Given the scope of the case, we may expect further action from the competent bodies. The Commissioner has the authority to initiate monitoring in line with the Law on Personal Data Protection, the competent ministry is in charge of the inspection monitoring in line with the Law on Information Security, whereas the National CERT has the obligation to provide advice and recommendations in case of an incident.


Who’s to blame?


Aware of the pressure placed on health services at the peak of the pandemic, we agreed that, for now, it would be appropriate not to name the specific health institution in which the incident took place. On the other hand, there is no doubt that the scale of this incident demands that responsibility for its occurrence be properly determined.

The national legislative framework provides various mechanisms to prevent these kinds of situations, but practice often falls far short of the prescribed standards. Although they handle particularly sensitive data, health workers are often unaware of the risks present in the digital era. Health institutions are required to appoint a data protection officer, but due to limited resources this position is usually filled by a person with insufficient expertise whose primary job lies elsewhere. In this specific case, the data protection officer may well have been someone who cares for Covid-19 patients on a daily basis.

As today’s data protection demands the involvement of an IT expert, this requirement places an additional burden on public health institutions’ budgets. Sometimes this means that a single person deals with all technical issues within an institution, while being paid far less than their private sector counterparts and without the opportunity to build further information security expertise.

The Covid-19 Information System established by the Government is the key point in a complex architecture for collecting and processing all of the defined data. Data collection occurs through different channels, and a single health institution is only one of the system’s entry points. In such a system it is rather difficult to implement protection measures at the level of each entry point; they should instead be defined centrally, which would significantly lower the risk of incidents. Based on this case, we have concluded that only one user account was created for each health institution, which makes it impossible to determine individual responsibility for misuse of the system.


What should have been done?


Without doubt, this is an ICT system of special importance in which special categories of personal data are processed. As such, all measures stipulated by the Law on Information Security and the Law on Personal Data Protection must be undertaken in the phases of its development and implementation. SHARE Foundation explored these measures in great detail in its Guidebook on Personal Data Protection and Guidebook on ICT Systems of Special Importance.

In any event, it is necessary to fully implement the privacy by design and security by design principles, which entail the following with regard to system access:

  • Every system user has their own access account
  • Every system user has the authorisation to process only the data necessary for their line of work
  • Access passwords are not published via an open network
  • A standard on password complexity is put in place
  • The number of incorrect password entries is limited
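The access principles listed above can be sketched in code. The following is a minimal, hypothetical Python illustration – the names `UserStore` and `password_meets_policy` are our own, not part of any real system. A production deployment should rely on a vetted authentication framework and a memory-hard password hash such as Argon2; the standard library’s PBKDF2 is used here only for the sake of a self-contained example:

```python
import hashlib
import hmac
import os
import re

MAX_FAILED_ATTEMPTS = 5  # lock the account after repeated bad passwords

def password_meets_policy(password: str) -> bool:
    """A minimal complexity standard: at least 10 characters,
    with an upper-case letter, a lower-case letter and a digit."""
    return (
        len(password) >= 10
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
    )

class UserStore:
    """Each user gets an individual account with a salted password hash,
    so that actions in the system can be attributed to a person -- unlike
    a single shared account per institution."""

    def __init__(self) -> None:
        self._users = {}  # username -> {"salt", "hash", "failed"}

    @staticmethod
    def _digest(password: str, salt: bytes) -> bytes:
        # Slow, salted hash: passwords are never stored in clear text.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    def register(self, username: str, password: str) -> None:
        if not password_meets_policy(password):
            raise ValueError("password does not meet the complexity policy")
        salt = os.urandom(16)
        self._users[username] = {
            "salt": salt,
            "hash": self._digest(password, salt),
            "failed": 0,
        }

    def login(self, username: str, password: str) -> bool:
        user = self._users.get(username)
        if user is None:
            return False
        if user["failed"] >= MAX_FAILED_ATTEMPTS:
            return False  # locked: too many incorrect password entries
        if hmac.compare_digest(self._digest(password, user["salt"]), user["hash"]):
            user["failed"] = 0
            return True
        user["failed"] += 1
        return False
```

The sketch covers individual accounts, a password complexity standard, credentials that never leave the server in clear text, and a limit on incorrect password entries; per-user authorisation over specific data categories would sit on top of such a store.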

Our accidental discovery on Google revealed a breach of security and data protection standards within the health system. The state of emergency instituted due to the pandemic cannot serve as an excuse for a job poorly done, nor as an obstacle to conducting an immediate, detailed analysis of the Covid-19 Information System’s compliance with security standards.


Read more:


Attachment: Data flow in the Covid-19 Information System