News

Digital rights organizations ask for transparency in content removal on Facebook

More than 70 organizations from around the world, SHARE Foundation among them, signed an open letter to Facebook CEO Mark Zuckerberg requesting transparency and accountability in the content removal process on the social network. The letter calls on Facebook to disclose clearly and precisely how much content it removes, whether justified or not, and to give its users a fair and timely way to appeal removals, so that content taken down by mistake can be restored as soon as possible.

The letter asks Facebook to adopt the “Santa Clara Principles” and points out that many high-profile figures, such as politicians, museums, and celebrities, have managed to get their content restored thanks to media attention. Most ‘regular’ users are not so fortunate, because Facebook allows appeals against removed content only in limited circumstances. Besides implementing a more effective appeals mechanism, the letter also asks Facebook to publish transparency reports that include details such as the types of content removed, how moderation actions were initiated, and the number of erroneous removal decisions.

“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with. Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook,” said Nate Cardozo of the Electronic Frontier Foundation (EFF).

Related content

Covid-19 apps: Opening the new Pandora’s Box

The Covid-19 epidemic was not only a test of public health systems around the world, it was also a test for policy makers in the fields of information society and personal data. The test was simple: “Did we learn anything from our previous mistakes where we tried to decide between personal safety and personal privacy or […]

Pandemic politics in the Western Balkans

In response to the COVID-19 pandemic, countries around the world have introduced various legal measures and technological solutions, which have raised particular concerns for the respect of human rights during this global public health crisis. In such circumstances, privacy and personal data protection were among the first victims, while other rights, such as freedom of […]

Digital Rights Summer School: Where Is AI Leading Us

The second Digital Rights Summer School was held from July 23rd to 29th in Perast, Montenegro, with more than 50 participants and lecturers coming together to exchange and acquire knowledge about current issues at the intersection of society, technology, and human rights. The Digital Rights Summer School is organised by the SHARE Foundation in cooperation with the European […]