Journalists of the RISE Project, an investigative media outlet from Romania, were threatened with fines of up to 20 million EUR if they did not allow access to personal data. In response, 18 organizations for the protection of digital rights and freedoms, SHARE Foundation among them, sent a letter to Andrea Jelinek, Chair of the European Data Protection Board. The letter was also sent to the president of the National Supervisory Authority for Personal Data Processing of Romania, as well as to the European Commission.
On its Facebook page, the RISE Project published documents containing personal data of a well-known Romanian politician and persons connected with him, pointing to high-level corruption in the misuse of EU funds. Soon afterwards, the Project received a written request from the Romanian supervisory authority demanding information on the source of the data and threatening a fine.
The organizations warned that such a request by the Romanian personal data protection authority jeopardizes the secrecy of journalistic sources, and reminded that the General Data Protection Regulation (GDPR) is part of a legislative framework for the protection of rights established by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights.
Recitals 4 and 153 and Article 85 of the GDPR make clear that the right to protection of personal data must be considered in relation to its function in society and be reconciled with other fundamental rights, such as the right to freedom of expression and information.
Signatories of the letter asked the Board to consider whether the request of the Romanian supervisory authority is in line with the GDPR, and whether the Romanian Law on Personal Data Protection 190/2018 and its application in this case reconcile personal data protection with freedom of expression and information, as required by Article 85.
More than 70 organizations from around the world, SHARE Foundation among them, signed an open letter to Facebook Chief Executive Officer Mark Zuckerberg requesting transparency and accountability in the platform’s content removal process. The letter asks Facebook to disclose clearly and precisely how much content it removes, with or without justification, and to give its users a fair and timely way to appeal removals so that wrongly removed content can be restored as soon as possible.
The letter calls for the adoption of the “Santa Clara Principles” and points out that many high-profile figures, such as politicians, museums or celebrities, have managed to get their content restored on Facebook thanks to media attention. For most ‘regular’ users this is not the case, because Facebook allows appeals against content removal only in certain circumstances. Besides implementing a more efficient appeals mechanism, Facebook is also asked to publish transparency reports that include details such as the types of content removed, how moderation actions were initiated, and the number of erroneous moderation decisions.
“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with. Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook,” said Nate Cardozo from Electronic Frontier Foundation (EFF).
GDPR Today, launched on 25 October, is your online hub for staying tuned to the (real) life of EU data protection law. The project will monitor the implementation of the law across Europe by publishing statistics and sharing relevant news around key subjects.
GDPR Today, led by several EDRi member organisations, aims to complement our association’s past support for the data protection reform.
Katarzyna Szymielewicz, vice-president of EDRi and co-founder and president of Panoptykon Foundation
Behind GDPR Today there are several civil society organisations who work together under the umbrella of European Digital Rights – EDRi, an association which supported the EU data protection reform. The initiative will prioritise building knowledge around legal guidelines and decisions, data breaches, new codes of conduct, tools facilitating individuals’ exercise of rights, important business developments and governmental support for data protection authorities. The GDPR Today is an instrument aimed at data protection experts, activists, journalists, lawyers, and anyone interested in the protection of personal data.
Our goal with GDPR Today is to present facts to the public on the implementation of the law, so that those interested can follow how the GDPR is both shaping the EU digital market and helping people regain control over their personal data.
Estelle Massé, Senior Policy Analyst and Global Data Protection Lead at Access Now
The legal safeguard under which the police, intelligence agencies or private companies may intrude on citizens’ privacy only when prescribed by law has been deleted from the Bill of the Law on Personal Data Protection. The Government of Serbia adopted the Bill at its session on 24 September, and it is currently in parliamentary procedure.
Following Article 23 of the new European General Data Protection Regulation (GDPR), Article 40 of the Draft, which had been the subject of public debate since 1 December 2017, explicitly stipulated that citizens’ rights to access, deletion, rectification and other means of control over the processing of their data ‘may be restricted by law’ in cases such as the protection of national security, defence, public safety, the rights and freedoms of others, etc.
However, the requirement that such restrictions be prescribed by law was deleted from the Bill submitted to the Parliament for adoption. In practice, this would mean that state bodies or private companies processing citizens’ personal data could restrict citizens’ rights arbitrarily, without any explicit legal authorization.
The wording obligating everyone who processes citizens’ personal data to act in accordance with the law is not a phrase that can be omitted on the grounds that processors’ obligations are implied. On the contrary, arbitrary restriction of rights can be prevented only by an explicit legal provision strictly defining when a processor may restrict citizens’ rights.
This obligation is also a part of Article 42 of the Constitution of the Republic of Serbia, stating that collecting, keeping, processing and using personal data is regulated by law (paragraph 2), as well as that ‘everyone shall have the right to be informed about personal data collected about them, in accordance with the law, and the right to court protection in case of their abuse’ (paragraph 4).
We wish to remind that this is not the first time our legal system has included a solution contrary to constitutional provisions. On 30 May 2012, at the proposal of the Commissioner for Information of Public Importance and Personal Data Protection, the Constitutional Court ruled that parts of Articles 12, 13 and 14 of the current Law on Personal Data Protection were not compliant with the Constitution of the Republic of Serbia because they enabled restricting citizens’ rights on the basis of ‘another regulation’. Having in mind Article 42 of the Constitution, the Constitutional Court concluded that only ‘law can regulate collecting, keeping, processing and using of data’, and that any option of regulating this field through ‘another regulation’ is therefore unconstitutional.
This decision of the Constitutional Court suggests that the new wording of Article 40 of the Bill could meet the same fate.
Since the restrictions in the Bill are in fact copied from the GDPR, it is important to understand the intention of the European legislator. The new regulatory framework primarily enabled EU member states to specify, in their own regulations, restrictions on citizens’ rights; it certainly does not mean that such restrictions are mandatory. For example, an overview of the laws of EU member states implementing the GDPR shows that some of them, such as Germany, Austria, Sweden or Croatia, either have no specific articles on restrictions of citizens’ rights or define those restrictions narrowly.
The personal data protection laws of Germany and Austria contain restrictions only in the parts concerning data processing for police and defence purposes, the investigation of criminal acts, the execution of criminal sanctions and the like, whereas the Croatian Law on the Implementation of the General Data Protection Regulation contains no articles restricting citizens’ rights. The Swedish legislator left open the possibility of passing additional laws to prescribe restrictions of citizens’ rights in the context of Article 23 of the GDPR.
We have long awaited the adoption of the new Law on Personal Data Protection, because it is supposed to guarantee citizens new rights, such as the rights to rectification, addition, erasure, restriction of processing and data portability, just as in the General Data Protection Regulation (GDPR).
Since the Bill has already been submitted for adoption, SHARE Foundation urges Members of Parliament to restore the initial wording of Article 40 by way of an amendment.
Find more on citizens’ rights guaranteed by a new legislative framework in our Guidebook My data – my rights.
SHARE Foundation signed the Declaration for Europe and joined the Copyright4Creativity coalition, which advocates a new approach to copyright in Europe that benefits all, fosters innovation, incentivises and rewards creativity, and improves the availability of the products of the European creative spirit.
Signatories of the Declaration call upon the European Commission, the European Parliament and EU Member States to align copyright exceptions with the following principles:
Harmonise Exceptions Across Europe
Act as a Spur to Innovation
Support User Creativity and Wider Participation
Ensure Accessibility by all Europeans
Support Education and Research
Facilitate Preservation and Archiving
Ensure Monopoly Rights are Regulated in the Online Environment
Promote these Principles in International Discussions
“Upload filters would seriously hamper our right to freely receive and impart information on the internet and would effectively make private online platforms the arbiters of speech”, explains Danilo Krivokapić, Director of SHARE Foundation.
On 12 September, the European Parliament adopted changes to the draft Copyright Directive, a major reversal of the Parliament’s previous position on the text, which had failed to pass a vote in July. Negotiations between the Parliament and the Council of the European Union, mediated by the European Commission, are to follow; the final step is a vote in the European Parliament on the negotiated text of the Directive, expected in January next year.
Articles 11 and 13, which would introduce upload filters and a “link tax”, remain in the current draft of the Directive. If this version is adopted, Article 13 of the Directive would in practice make internet platforms, except the smallest ones, liable for copyright infringements in their users’ uploads. These companies would effectively be forced to install technical filters, which often mistake legal content (parody, satire) for copyright infringement, says Julia Reda, Member of the European Parliament for Pirate Party Germany.
“Negotiations will start between the Parliament and the EU Council: a proposal that coerces internet companies into monitoring, filtering and blocking our uploads versus one that more explicitly forces internet companies into monitoring, filtering and blocking our uploads. The result will be a cocktail of both poisons, to be put to a final vote just a few short months before the 2019 European Parliament elections”, said Diego Naranjo, Senior Policy Analyst at European Digital Rights (EDRi).
As for Article 11, its basic purpose appears to be to boost the revenue of media organisations through a “link tax”: publishers would retain exclusive rights to monetise their content, which is currently not the case. Article 11 would allow only “individual words” of media articles to be shared without a fee, even in hyperlinks. This approach raises many implementation challenges. A similar law was adopted in Germany, and it did not increase the revenue of publishers and media organisations; it only complicated the situation, as the law is hard to interpret, explained Julia Reda.
In recent years, there have been attempts to introduce internet blocking and filtering in Serbia, which were luckily unsuccessful. The Administration for Games of Chance appeared as the would-be censor, first in 2012, when it sent a letter to Serbian internet service providers (ISPs) demanding that they urgently block access to foreign gambling websites without a licence issued by the Administration. ISPs were threatened with legal sanctions, primarily criminal liability for the illegal organisation of games of chance, if they did not comply and block the websites. A similar case happened two years later, when an ISP received a letter, this time from the Tax Administration, also demanding that it block foreign gambling websites.
The most serious attempt to legally introduce filtering of internet content happened in late 2014, when the proposed amendments to the Law on Games of Chance contained a provision which would force all ISPs in Serbia to block foreign websites which did not have a licence issued by the Administration for Games of Chance. After reactions by SHARE Foundation and civil society, the controversial proposal was pulled from legislative procedure.
To see why we consider copyright important in the digital age, have a look at our video below.
The California State Assembly passed a new bill intended to guarantee net neutrality. The California Internet Consumer Protection and Net Neutrality Act of 2018 prohibits electronic communications providers from throttling, blocking or otherwise interfering with their users’ internet traffic, and from giving some apps prioritised speeds over others. A vote in the state Senate is still to come, after which the bill awaits the signature of Governor Jerry Brown.
Perhaps the key moment for net neutrality support came after complaints from the Santa Clara County fire department, which claimed that Verizon, one of the biggest ISPs in the US, throttled its internet traffic during major wildfires in California, even though the department had an “unlimited” data plan.
SHARE Foundation prepared an educational video on the importance of net neutrality for freedom of expression and other digital rights.
Today, 3 April 2018, SHARE Foundation, along with 93 civil society organisations from across the globe, sent a letter to the Secretary General of the Council of Europe, Thorbjørn Jagland. The letter requests transparency and meaningful civil society participation in the Council of Europe’s negotiations of the draft Second Additional Protocol to the Convention on Cybercrime (also known as the “Budapest Convention”) —a new international text that will deal with cross-border access to data by law enforcement authorities. According to the Terms of Reference for the negotiations, it may include ways to improve Mutual Legal Assistance Treaties (MLATs) and allow “direct cooperation” between law enforcement authorities and companies to access people’s “subscriber information”, order “preservation” of data and to make “emergency requests”.
The upcoming Second Additional Protocol is currently being discussed at the Cybercrime Convention Committee (T-CY) of the Council of Europe, a committee that gathers the States Party to the Budapest Convention on Cybercrime and other observer and “ad hoc” countries and organisations. The T-CY aims to finalise the Second Additional Protocol by December 2019. While the Council of Europe has made clear its intention for “close interaction with civil society”, civil society groups are asking to be included throughout the entire process, not just during the Council of Europe’s Octopus Conferences.
“Transparency and opportunities for input are needed continuously throughout the process. This ensures that civil society can listen to Member States, and provide targeted advice to the specific discussions taking place. Our opinions can build upon the richness of the discussion among States and experts, a discussion that civil society will miss if we are not invited to participate throughout the process” — the letter reads.
Current negotiations raise “multiple challenges for transparency, participation, inclusion and accountability,” despite the fact that the Council of Europe’s committees are traditionally very inclusive and transparent. We are requesting the T-CY to:
“develop a detailed plan for online debriefing sessions after each drafting meeting, both plenary and drafting, and to invite civil society as experts in the meetings, as is customary in all other Council of Europe Committee sessions. With a diligent approach to making all possible documents public and proactively engaging with global civil society, the Council of Europe can both build on its exemplary approach to transparency and ensure that the outcome of this process is of the highest quality and achieves the widest possible support.”
The letter was coordinated by European Digital Rights (EDRi) and the Electronic Frontier Foundation (EFF) with the help of IFEX, Asociación por los Derechos Civiles (ADC), Derechos Digitales, and Association for Progressive Communications (APC).