News

Digital rights organizations ask for transparency in content removal on Facebook

More than 70 organizations from around the world, SHARE Foundation among them, have signed an open letter to Facebook CEO Mark Zuckerberg requesting transparency and accountability in the content removal process on the social network. The letter asks Facebook to disclose clearly and precisely how much content it removes, with or without justification, and to give users a fair and timely way to appeal removals so that wrongly removed content can be restored as quickly as possible.

The letter calls for the adoption of the “Santa Clara Principles” and points out that many high-profile figures, such as politicians, museums, and celebrities, have managed to get their content restored on Facebook thanks to media attention. For most ‘regular’ users this is not the case, because Facebook allows appeals against removed content only in certain circumstances. Besides implementing a more efficient appeals mechanism, Facebook is also asked to publish transparency reports that include details such as the type of content removed, what triggered the moderation actions, and the number of erroneous moderation decisions.

“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with. Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook,” said Nate Cardozo of the Electronic Frontier Foundation (EFF).

Related content

What are the provisions of the new policing draft laws

The analysis of the new Draft Law on Internal Affairs, the Draft Law on Data Processing and Records in Internal Affairs, and the new working draft of the Data Protection Impact Assessment shows that fundamental issues with the use of intrusive technology in Serbia have not been addressed. Specifically, the Draft Law on Internal Affairs enables mass, […]

Facebook starts tracking electoral and political advertising in the Balkans

Facebook has announced that starting from mid-March it will expand its transparency system and authenticity verification for ads about elections and politics. Facebook will cover 32 additional countries, including Serbia and North Macedonia, where elections are due to take place very soon. This development follows the efforts of SHARE Foundation and its international partners to […]

Cellebrite halts use of its forensic tool in Serbia

UPDATE 28 February 2025: Amnesty International’s Security Lab found one more case of abuse of Cellebrite’s tool, on the phone of a student activist who was detained on 25 December after attempting to attend the SNS rally in Sava centar. More information and technical findings are available at: https://securitylab.amnesty.org/latest/2025/02/cellebrite-zero-day-exploit-used-to-target-phone-of-serbian-student-activist/ The digital forensics tool is withdrawn from […]