A Password Pandemic: How did a COVID-19 password end up online?

The username and password for accessing the COVID-19 Information System were publicly available on a health institution’s web page for eight days. This was long enough for the page to be indexed by Google, so although the credentials were not visible on the page itself, they could be found through a simple search. After discovering the matter on the 17th of April, we immediately informed the competent authorities.

Screenshot of the webpage with login credentials for the COVID-19 Information System

The COVID-19 Information System is a centralized software system for collecting, analyzing and storing data on all persons monitored for the purpose of controlling and suppressing the pandemic in Serbia.


How did we get this data?


Along with the state of emergency, the Government of Serbia introduced numerous measures to tackle the pandemic, which included collecting and processing personal data under unprecedented circumstances. The Government informed citizens about these measures through vague and sparsely detailed official conclusions, none of which specified who was supposed to process citizens’ data and how.

In an effort to understand the data flow and its implications for citizens’ rights, we explored the new normative framework through publicly available sources. While searching keywords on Google, we accidentally discovered the page containing access credentials for the COVID-19 Information System. The data had been published on the 9th of April.

We also managed to obtain manuals with instructions for navigating the centralised system’s webpage.


Which data was at risk?


As per the Government’s Conclusion on establishing the COVID-19 Information System, a significant number of health institutions are required to use this software to keep records on cured, deceased and tested persons (whether positive or negative), as well as on persons currently being treated, in self-isolation or placed in temporary hospitals, including their location data. The system also contains data on persons who are possible disease carriers due to their contact with infected persons. The institutions are required to update the data daily, as these updates form the basis of the daily report read at 3 p.m.

While attempting to clarify how our data is being stored, we could not have imagined that we would discover the access password and thus be able to enter the system – just like anyone else who may have found this webpage. It was immediately clear to us that citizens’ most sensitive data were endangered and that the integrity of the system, crucial in the fight against the pandemic, could not be guaranteed.

We did not log into the system, which would in any case have recorded such an attempt. Instead, we reported the case to the competent authorities: the Commissioner for Information of Public Importance and Personal Data Protection, the National CERT and the Ministry of Trade, Tourism and Telecommunications. Aware of the risk of misuse arising from the accessibility of citizens’ sensitive data, we decided to notify the public of the incident only after making sure that the authorities had prevented unauthorized access to the system.

Report of the breach sent to competent authorities by email


How did the competent bodies react?


Less than an hour after our report, we were informed that initial steps had been taken in response to the incident, making sure that the web page containing the username and password was no longer publicly available.

Given the scope of the case, we may expect further action from the competent bodies. The Commissioner has the authority to initiate monitoring in line with the Law on Personal Data Protection, the competent ministry is in charge of the inspection monitoring in line with the Law on Information Security, whereas the National CERT has the obligation to provide advice and recommendations in case of an incident.


Who’s to blame?


Aware of the pressure on health services at the peak of the pandemic, we agreed that, for now, it would be appropriate not to publish the name of the specific health institution in which the incident took place. At the same time, there is no doubt that the scale of this incident demands that responsibility for its occurrence be properly determined.

The national legislative framework provides various mechanisms to prevent these kinds of situations, but practice often falls far short of the prescribed standards. Although they handle particularly sensitive data, health workers are often unaware of all the risks present in the digital era. Health institutions are required to appoint a data protection officer, but due to limited resources, this position is usually filled by persons with insufficient expertise whose primary duties lie elsewhere. In this specific case, the data protection officer may well be someone who cares for corona-infected patients on a daily basis.

Since modern data protection demands the involvement of an IT expert, this requirement places an additional burden on public health institutions’ budgets. In practice it often means that a single person deals with all technical issues within an institution, while being paid far less than their private sector counterparts and without the opportunity to build further information security expertise.

The COVID-19 Information System established by the Government is the central point of a complex architecture for collecting and processing all the defined data. Data collection occurs through different channels, and each health institution is only one entry point into the system. In such a system, it is difficult to implement protection measures at the level of individual entry points; they should instead be defined centrally, which would significantly lower the risk of incidents. Based on this case, we have concluded that only one user account was created for each health institution, which makes it impossible to determine individual responsibility for misuse of the system.


What should have been done?


Without doubt, this is an ICT system of special importance within which special categories of personal data are processed. As such, it requires undertaking all measures stipulated by the Law on Information Security and the Law on Personal Data Protection during its development and implementation. SHARE Foundation explored these measures in great detail in its Guidebook on Personal Data Protection and Guidebook on ICT Systems of Special Importance.

In any case, it is necessary to fully implement the privacy by design and security by design principles, which entail the following regarding access to a system (illustrated in the sketch after this list):

  • Every system user has their own access account
  • Every system user has the authorisation to process only the data necessary for their line of work
  • Access passwords are not published via an open network
  • A standard on password complexity is put in place
  • The number of incorrect password entries is limited
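
To make these access requirements concrete, here is a minimal sketch in Python of how per-user accounts, a password complexity standard and a limit on incorrect password entries could be enforced. It is an illustrative assumption of how such controls might look, not a description of the actual COVID-19 Information System; names such as MIN_LENGTH and MAX_FAILED_ATTEMPTS are hypothetical.

```python
import hashlib
import re
import secrets

MIN_LENGTH = 12          # assumed password complexity standard
MAX_FAILED_ATTEMPTS = 5  # assumed limit on incorrect password entries


def password_meets_policy(password: str) -> bool:
    """Require minimum length plus upper-case, lower-case, digit and special character."""
    return bool(
        len(password) >= MIN_LENGTH
        and re.search(r"[A-Z]", password)
        and re.search(r"[a-z]", password)
        and re.search(r"\d", password)
        and re.search(r"[^\w\s]", password)
    )


class UserStore:
    """Every user gets an individual account; no shared institution-wide logins."""

    def __init__(self):
        self._users = {}  # username -> {"salt", "hash", "failed", "locked"}

    def add_user(self, username: str, password: str) -> None:
        if not password_meets_policy(password):
            raise ValueError("Password does not meet the complexity standard")
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
        self._users[username] = {"salt": salt, "hash": digest, "failed": 0, "locked": False}

    def authenticate(self, username: str, password: str) -> bool:
        user = self._users.get(username)
        if user is None or user["locked"]:
            return False  # unknown or locked account
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), user["salt"], 200_000)
        if secrets.compare_digest(digest, user["hash"]):
            user["failed"] = 0
            return True
        user["failed"] += 1
        if user["failed"] >= MAX_FAILED_ATTEMPTS:
            user["locked"] = True  # lock the account after too many incorrect entries
        return False
```

Authorisation to process only the data necessary for a given line of work would be layered on top of this, typically as role-based access control, and credentials would of course never be published on an open network.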

Our accidental discovery on Google revealed a breach of security and data protection standards within the health system. The state of emergency introduced due to the pandemic cannot serve as an excuse for a job poorly done, nor as an obstacle to conducting an immediate, detailed analysis of the COVID-19 Information System’s compliance with security standards.


Read more:


Attachment: Data flow in the COVID-19 Information System

Facebook starts monitoring electoral and political advertising in the Balkans

Facebook has announced that from mid-March it will expand its transparency system and authenticity confirmation for ads about elections and politics to 32 additional countries, including Serbia and North Macedonia, where elections are due to take place very soon.

This turn of events follows the efforts of SHARE Foundation and its international partners to point out to Facebook representatives that the Western Balkans countries had been excluded from the list of countries where Facebook actively monitors political advertising. The issue is particularly important in light of the election campaigns in Serbia and North Macedonia, given the potential for manipulation, the lack of transparency in the funding of ads, and the use of non-political pages to advertise for political purposes.

Facebook, Inc. will in this manner expand the transparency of political advertising on its main social networking platform and on Instagram in these countries. Until now, such policies were implemented mainly because of suspected foreign interference in election processes during the 2016 US presidential elections and the Brexit referendum. The Cambridge Analytica scandal, in which data of tens of millions of users leaked, and the state pressure that followed also pushed Facebook to improve the transparency of its platform.

The Facebook Ad Library will provide information on total advertising spend and the number of ads, as well as data about specific ads – the demographic target group, the geographic reach of the ad, etc. To support analysis of political advertising, Facebook will give researchers, journalists and the public access to the Ad Library API. In addition, by the end of April it will be possible to download a report with aggregated data on ads about elections and politics for the 32 new countries.
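
For illustration, a hedged sketch of how a researcher might query political ads for Serbia through the Ad Library API is given below. It assumes the Graph API’s ads_archive endpoint and an access token obtained after identity verification; the API version, field names and parameters are assumptions based on Facebook’s public documentation, not details confirmed in the announcement.

```python
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # issued to a verified Facebook account

params = {
    "ad_type": "POLITICAL_AND_ISSUE_ADS",
    "ad_reached_countries": "['RS']",   # Serbia
    "search_terms": "elections",
    "fields": "page_name,spend,impressions,demographic_distribution,region_distribution",
    "access_token": ACCESS_TOKEN,
}

# Assumed endpoint: the ads_archive edge of the Graph API backs the Ad Library.
response = requests.get("https://graph.facebook.com/v6.0/ads_archive", params=params)
response.raise_for_status()

for ad in response.json().get("data", []):
    print(ad.get("page_name"), ad.get("spend"), ad.get("impressions"))
```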

All actors, including political parties, candidates and other organisations wishing to post ads about elections or politics on Facebook and Instagram, will be required to register as advertisers, so that it can be seen who paid for the advertisements. Advertisers will also have to confirm their identity with official documents issued by the state where they wish to publish ads, and provide additional information such as a local address, telephone number, email and website if they wish to use the name of a Facebook page or organisation in the disclaimer. If they do not register, Facebook may restrict their posting of ads about politics and elections during the verification process.

A dialogue on the Facebook algorithm

During the past few years, Facebook has managed to connect virtually the whole world in a single social app, free of charge and available to anyone on the planet. It has become an inevitable part of social interactions, advertising and political campaigns. SHARE Lab investigated the Facebook Algorithm and uncovered the vast human profiling project in the background.

Featuring: Vladan Joler and Andrej Petrovski
Directed and edited by: Nemanja Babić and Andrija Kovač
Producers: Danilo Krivokapić and Andrej Petrovski
Cinematographers: Andrija Kovač, Danilo Krivokapić and Vladimir Miladinović
Audio: Filip Milošević and Nikola Cvijanović
Production assistant: Milica Čubrilović



Read more:

SHARE files complaints against Facebook and Google

SHARE Foundation filed complaints with the Commissioner for Information of Public Importance and Personal Data Protection of Serbia against Facebook and Google for their failure to comply with the obligation to appoint representatives in Serbia for data protection issues. In May this year, before the new Serbian Law on Personal Data Protection started to apply, SHARE Foundation sent letters to 20 international companies calling on them to appoint representatives in Serbia, in accordance with the new legal obligations.

Appointing representatives of these companies is not a formality – it is essential for exercising the rights of Serbian citizens prescribed by the Law. In the current circumstances, companies like Google and Facebook treat Serbia, like many other developing countries, as a territory for the unregulated exploitation of citizens’ private data, even though Serbia harmonized its rules with the EU Digital Single Market by adopting the new Law on Personal Data Protection. These companies recognise Serbia as a relevant market, offer their services to citizens of the Republic of Serbia and monitor their activities. In the course of doing business, they process large amounts of Serbian citizens’ data and make huge profits. The new law guarantees citizens numerous rights in relation to such data processing, but at the moment it seems that exercising these rights would face many difficulties.

Among other things, these companies do not provide clear contact points that our citizens can turn to – they mostly offer application forms in a foreign language. Our experience has shown that such forms are inadequate, both because they require Serbian citizens to have an advanced knowledge of a foreign language and because this type of communication is mostly handled by programs that send generic automated responses.

Although the fine the Commissioner may impose under the domestic Law on Personal Data Protection – in this case 100,000 Serbian dinars (around $940 or €850) – would not have a major impact on the budgets of these gigantic companies, we believe it would show that the competent authorities of the Republic of Serbia intend to protect our citizens, and make clear that these companies are not operating in accordance with domestic regulations.

Complaint against Google

Complaint against Facebook

Unlawful video surveillance with face recognition in Belgrade

The impact assessment of video surveillance on human rights, conducted by the Ministry of Interior of Serbia, did not meet the legal requirements, and the installation of the system lacks basic transparency. The process should therefore be suspended immediately, and the authorities should engage in an inclusive public debate on the necessity, implications and conditions of such a system.

The installation of smart video surveillance in Belgrade, with thousands of cameras and face recognition software, has raised public concern. Three civil society organisations (CSOs) – SHARE Foundation, Partners for Democratic Change Serbia (Partners Serbia) and Belgrade Center for Security Policy (BCSP) – published a detailed analysis of the MoI’s Data Protection Impact Assessment (DPIA) on the use of smart video surveillance and concluded that the document does not meet the formal or material conditions required by the Law on Personal Data Protection in Serbia.

The Commissioner for Personal Data Protection of Serbia also published his opinion on the DPIA, confirming the findings of the aforementioned organisations. According to the Commissioner, the DPIA was not conducted in line with the requirements of the Law on Personal Data Protection.

The MoI’s DPIA missed the opportunity to address all issues of public interest, as well as the obligation to fulfil both the formal and material requirements of the Personal Data Protection Law. The DPIA does not meet the minimum legal requirements, especially in relation to smart video surveillance, which is the main source of interest and concern among the domestic and foreign public. The methodology and structure of the DPIA do not comply with the requirements of the Personal Data Protection Law. The positive effects on crime reduction described in the DPIA are overestimated, because relevant research and comparative practices have been used selectively. It has not been established that the use of smart video surveillance is necessary for public safety, or that the use of such invasive technology is proportionate, considering the risks to citizens’ rights and freedoms.

The MoI should suspend further introduction of smart video surveillance systems. In addition, the MoI and the Commissioner should initiate an inclusive public debate on video surveillance legislation and practice that will be in line with a charter on the democratic application of video surveillance in the European Union.

Policy brief – Serbian government is implementing unlawful video surveillance with face recognition in Belgrade

Open Letter: Facebook’s End-to-End Encryption Plans

4 October 2019

Dear Mr. Zuckerberg,

The organizations below write today to encourage you, in no uncertain terms, to continue increasing the end-to-end security across Facebook’s messaging services.

We have seen requests from the United States, United Kingdom, and Australian governments asking you to suspend these plans “until [Facebook] can guarantee the added privacy does not reduce public safety”. We believe they have this entirely backwards: each day that platforms do not support strong end-to-end security is another day that this data can be breached, mishandled, or otherwise obtained by powerful entities or rogue actors to exploit it.

Given the remarkable reach of Facebook’s messaging services, ensuring default end-to-end security will provide a substantial boon to worldwide communications freedom, to public safety, and to democratic values, and we urge you to proceed with your plans to encrypt messaging through Facebook products and services. We encourage you to resist calls to create so-called “backdoors” or “exceptional access” to the content of users’ messages, which will fundamentally weaken encryption and the privacy and security of all users.

Sincerely,

AfroLeadership
Access Now
ACM US Technology Policy Committee
American Civil Liberties Union
Americans for Prosperity
ARTICLE 19
Association for Progressive Communications (APC)
Asociación por los Derechos Civiles (ADC), Argentina
Bolo Bhi
Canadian Internet Registration Authority
Centro de Ensino e Pesquisa em Inovação (CEPI), FGV Direito SP, Brasil
Center for Democracy & Technology
Center for Studies on Freedom of Expression (CELE), Universidad de Palermo
Defending Rights & Dissent
Derechos Digitales, América Latina
Digital Rights Watch
Državljan D
Electronic Frontier Foundation
Electronic Privacy Information Center
Engine
epicenter.works – for digital rights
Fight for the Future
Free Press
Freedom of the Press Foundation
Fundación Karisma, Colombia
Future of Privacy Forum
Global Forum for Media Development
Global Partners Digital
Hiperderecho, Peru
Human Rights Watch
Index on Censorship
Instituto de Referência em Internet e Sociedade (IRIS), Brazil
Instituto de Tecnologia e Sociedade do Rio de Janeiro (ITS)
International Media Support (IMS)
Internet Society
Internet Society – Bulgaria
Internet Society UK England Chapter
Internews
ISUR, Universidad del Rosario, Colombia
IT-Political Association of Denmark
Iuridicum Remedium, z.s.
LGBT Technology Partnership
National Coalition Against Censorship
New America’s Open Technology Institute
Open Rights Group
OpenMedia
Paradigm Initiative
PEN America
Prostasia Foundation
R3D: Red en Defensa de los Derechos Digitales
Ranking Digital Rights
Restore The Fourth, Inc.
Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
SHARE Foundation
SMEX
S.T.O.P. – The Surveillance Technology Oversight Project
TechFreedom
Vrijschrift

BIRN and SHARE Join Efforts to Counter Digital Freedom Violations

In Southern and Eastern Europe, where online disinformation campaigns increasingly endanger guaranteed individual freedoms and internet safety is in notable decline, BIRN Hub will partner with SHARE Foundation to monitor digital threats and trends in their occurrence, raise awareness about violations of digital freedom and issue policy recommendations.

The organisations will identify the main players involved in disinformation and propaganda by establishing a Digital Monitoring database. The database will cover the state of digital rights in targeted countries by documenting cases of violations of digital rights and freedoms, with descriptions of cases and corresponding sources.

The project, supported by Civitates, will monitor digital freedom violations in Bosnia and Herzegovina, Croatia, Hungary, North Macedonia, Romania and Serbia.

The database will be part of the broader online BIRN Investigative Resource Desk (BIRD), a new resource platform for investigative journalists expected to launch this fall. The interactive database will allow the general public to access data collected through the monitoring system.

The use of SHARE Foundation’s expertise will result in the creation of a detailed methodology and guidelines for monitoring violations of digital rights and freedoms, as well as training for monitors to successfully gather data and file them in the newly created database. A three-day training for monitors will be held in the second half of July in Perast, Montenegro.

In parallel, BIRN journalists will produce and publish five investigations related to the topic. On the basis of monitoring activities, a one-of-a-kind cross-regional report will be produced, to be presented at the closing event.

The database will provide the data for periodic reports on the state of digital rights and freedoms in the targeted countries. The cross-regional report will compile the collected data in order to introduce the public to trends in violations of digital freedoms.

Continuous monitoring and reporting on digital threats will contribute to BIRN’s wider efforts to promote accurate and unbiased information. It will strengthen the capacities and skills of the network’s journalists and help expose and counter the threats that journalists and other engaged individuals face on a regular basis.

SHARE calls on Facebook and Google to appoint their representatives in Serbia

Three months before the new Law on Personal Data Protection starts to apply, SHARE Foundation asked 20 companies from around the world – including Google and Facebook – to appoint their representatives in Serbia. Competent bodies and citizens of Serbia will thus be able to turn to these representatives with all questions concerning personal data processing.

Although the business models of these companies are already largely based on the monetization of their users’ personal data, including that of the citizens of Serbia, those citizens still have practically no way to exercise their rights when it comes to data collected by the best-known companies.

However, the new Law on Personal Data Protection, modelled after the General Data Protection Regulation (GDPR), obliges almost all big IT companies to appoint representatives in Serbia. If a company offers products and services in Serbia or monitors the behavior of its citizens, it must appoint a representative – a natural or legal person whom citizens can address regarding their rights as data subjects. This representative will also cooperate with the Commissioner for Information of Public Importance and Personal Data Protection of the Republic of Serbia. Since Google, Facebook, Amazon, Netflix and other IT giants process the data of Serbia’s citizens in order to provide their services, they are obligated to appoint a local representative.

For example, Google recognized the local market as significant years ago, and many of its services, such as Gmail, YouTube, Google Chrome and Google Search, have been adapted to our citizens and are available in Serbian. Additionally, Google targets the citizens of Serbia with advertisements and monitors their behavior through cookies, so it is certainly obligated to appoint a representative in Serbia. Facebook is also available in Serbian and has about 3 million users in Serbia on its main social networking site alone, and it also owns Instagram and WhatsApp. Facebook collects users’ data en masse so that they can be profiled and shown targeted ads, as described in detail in SHARE Lab’s Facebook algorithmic factory research.

However, the policies of these companies, most of which are headquartered in the USA, effectively do not treat Serbia as part of Europe, with the result that the citizens of Serbia have virtually no guaranteed personal data rights. If Facebook or Google appointed representatives in Serbia, it would be far more likely that citizens could exercise their rights or initiate proceedings before the competent Serbian authorities. Since the citizens of Serbia enter into agreements for these services with the US companies, while EU citizens do so with the companies’ European representatives, there are clearly parallel systems of protection.

Letters have been sent to the following companies: Google, Facebook, Amazon, Twitter, Snap Inc – Snapchat, AliExpress, Viber, Yandex, Booking, Airbnb, Ryanair, Wizzair, eSky, Yahoo, Netflix, Twitch, Kupujem prodajem, Toptal, GoDaddy, Upwork.

Letter sent to Google
Letter sent to Facebook

Huawei knows everything about cameras in Belgrade – and they are glad to share!

EDIT (30th March 2019, 9:29h): Not long after this text was published, the case study about video surveillance cameras in Belgrade was removed from the official website of Huawei. You can read the archived version of the case study at the following link: https://archive.li/pZ9HO.

New generation surveillance cameras have already been installed in Belgrade, as stated in a case study published on the official website of Huawei.

Unlike the Ministry of Interior, whose representatives gave unclear and contradictory statements and finally refused a freedom of information request from SHARE Foundation, Huawei published a case study on its company website with detailed information about the installation of video surveillance cameras in Belgrade and its cooperation with the Ministry of Interior of Serbia (MOI).

The case study provides a detailed description of the cooperation between Huawei and the MOI, which largely contradicts the information provided by the Ministry of Interior.

Huawei first states that, thanks to advanced video surveillance technology, the suspect who fled to China after causing a fatal car accident in 2015 – known to the Serbian public as the “Countryman case” – was apprehended only three days after his photo was received from the Serbian MOI. Following this rapid arrest, the Ministry of Interior initiated cooperation with Huawei through the “Safe Society” project, with the goal of installing an advanced video surveillance system in Serbia. The company also points out that it offered the MOI Intelligent Video Surveillance (IVS) systems, Intelligent Transportation Systems (ITS), eLTE broadband trunking technology, unified data centers, and converged command centers. It further states that 9 test cameras were initially installed in 5 locations, including the MOI headquarters, a sports arena, a commercial center, and a police station. According to Huawei, in the first phase the cameras successfully performed several functions, such as video retrieval, video compression, automatic license plate recognition, behavior analysis, facial recognition, and video quality diagnosis. After a successful test phase, a Strategic Partnership Agreement was concluded in 2017.

In the first phase of the project, 100 high-definition video cameras were installed in more than 60 key locations and the command and data center in Belgrade was remodeled, as pointed out in Huawei’s study. A large number of advanced technologies and products were used, including infrared license plate recognition, 4K video solutions, H.265 HD encoding, cloud-based cluster networking, SafeVideo for data security and a virtual checkpoint system.

It should be noted that Huawei stated that video materials and collected data are kept on an advanced storage device called “OceanStore”, which provides a number of capabilities such as data analysis and big data analytics, and that the retention period for collected data is limited to one year.

In the end, the case study states that thanks to the realisation of Phase 1, which had been implemented more than five months before the study was published, many criminal cases were solved, and that the police are now able to find suspects in the stored video material thanks to Huawei’s intelligent technology. As Huawei stated, the Ministry of Interior will develop a comprehensive “Safe City” solution, which will initially cover the whole Belgrade area, with the final goal of implementing such a solution across the whole territory of Serbia.

Finally, the most important question for the citizens of Serbia concerns the possible consequences for their privacy, as well as the reliability of this technology. It is important to underline that smart video surveillance technologies such as facial recognition and behaviour analysis are highly intrusive on citizens’ privacy, while at the same time not completely reliable. The technology used to store the data collected through video surveillance can also lead to serious personal data abuse. Since data is kept on the “OceanStore” device for one year, it is crucial to establish transparency about who exactly can access the data and in which cases; otherwise, huge amounts of Serbian citizens’ personal information may become the target of various abuses.

As is publicly known, the Minister of Interior announced the gradual installation of 1,000 cameras in 800 locations over the next two years, and the Police Director explained that the future locations of stationary cameras were already known and that, before choosing the locations, “significant research and analysis of events were made, foremost on the crimes on the territory of Belgrade”. However, in its reply to our FOI request, the MOI stated that “the significant research and analysis” did not actually exist. Reading the detailed Huawei case study, on the other hand, makes it possible to find information that gives a better picture of what is actually happening with the installation of cameras in Belgrade.

It is very concerning that we hear completely different information from different sides about questions that concern the constitutional rights and freedoms of the citizens of Serbia. We believe that the relevant actors must come forward with accurate and complete information, and explain to the public how a private company will be able to access their personal data, in which cases, and, most importantly, why information about the cooperation with Huawei was not available to citizens in the initial phase of the project.

New surveillance cameras in Belgrade: location and human rights impact analysis – “withheld”

Leading Serbian law enforcement officials announced a new video surveillance system in Belgrade, the nation’s capital, which would be highly intrusive for citizens. It was revealed that the main partner of the Government of Serbia is Huawei, the Chinese tech giant recently involved in several scandals. In pursuit of transparency around the deployment of such privacy-invasive technology, SHARE Foundation submitted Freedom of Information (FOI) requests to the Ministry of Interior.

However, the Ministry responded that all documents regarding the public procurement of video surveillance equipment in Belgrade were protected as ‘Confidential’. The information about the new facial and vehicle license plate recognition system was not provided either.

Earlier this year, the Serbian Minister of Interior and the Police Director announced that over the next two years, 1,000 new-generation cameras using facial and license plate recognition software will be installed in 800 locations in Belgrade. Minister of Interior Nebojša Stefanović told the Fonet news agency that “patrol cars and police officers in the street will gradually become equipped with these cameras”, and that the network of cameras will then spread to the highway and regional roads. Police Director Vladimir Rebić stated on Radio-Television of Serbia that “establishing the functionality of face recognition is in its final stage”, and that the locations intended for stationary cameras had already been determined based on “a broad examination and analysis of events, referring primarily to the criminal offences in Belgrade”.

 

Camera location – “confidential”

 

Based on the Law on Free Access to Information of Public Importance, SHARE Foundation requested information on the locations of the cameras, including the analysis on the basis of which these locations were determined, and details of the public procurement and relevant procedures.

The Ministry’s official responses stated that all documents regarding the public procurement of the video equipment are protected as “confidential”, and that the information on locations and the analysis is not contained in any document or medium, which is the legal precondition for exercising the right of access to information of public importance. SHARE Foundation also requested a copy of the data protection impact assessment (DPIA), and the Ministry responded that the new Law on Personal Data Protection does not yet apply, explaining that the registry and processing of personal data collected through video surveillance are regulated by the Law on Registry and Processing of Data in Interior Affairs.

Considering that the responses of the Ministry’s FOI officer directly contradict the statements made by the Minister and the Police Director, this ambiguity must be clarified without further delay, bearing in mind that this is a fundamental question of human rights and civil freedoms guaranteed by law and the Constitution of Serbia.

 

Huawei as the partner of police

 

SHARE Foundation also requested information on the public procurement of the equipment and software to be used for video surveillance. The Ministry responded that it began discussing possibilities for improving its information and telecommunications systems with the Chinese company Huawei in 2011, and that it drafted a decision on increasing the general security of citizens within the project titled “Safe Society”. This project falls under the Agreement on Economic and Technical Cooperation Regarding Infrastructure between the governments of Serbia and China, signed in Beijing in 2009.

Furthermore, the Ministry stated that in 2014 it signed a Memorandum of Understanding with Huawei, covering the steps necessary to implement the above-mentioned project. Based on the Agreement and the Memorandum, in 2017 the Ministry and Huawei signed the Agreement on Strategic Partnership for Introducing eLTE Technologies and Solutions for a “Safe City” in Public Security Systems. The Government of the Republic of Serbia signed this Agreement, whereby the Ministry of Interior took over the obligations it stipulates regarding the expenses and procurement of the video surveillance system under the capital project “Video surveillance in traffic – phase II”.

 

Intrusive technologies in “uncharted waters”

 

Serbia adopted a new Law on Personal Data Protection which largely follows the new standards of European regulation in this field, i.e. the GDPR, but it does not provide the instruments and mechanisms needed for proper implementation of the Law. So far, no steps have been taken towards drafting a law regulating video surveillance in public spaces.

Identification software is among the latest technological developments that can gravely violate the rights and freedoms of citizens, and it is an important topic of public debate in democratic societies. Over the past few years, the Chinese company Huawei has been repeatedly accused by the USA and some European countries of industrial and political espionage in cooperation with the Chinese authorities.

In the upcoming months, it is necessary to determine which video surveillance equipment has been purchased, where and how personal data will be processed, and whether the personal data protection impact assessment has been carried out adequately.

Relevant documents:

SHARE Foundation FOI request – statement of Minister Stefanović

SHARE Foundation FOI request – TV appearance of Police Director Rebić

Decision of the Ministry on rejection of the request 

Response of the Ministry