Gender-based digital violence in Serbia: a review of trends

Digital violence against girls and women in Serbia is steadily on the rise, as increasingly disturbing headlines in the media attest. Among the most frequent targets of abusers are female journalists, social activists, politicians, and other prominent women in the public sphere. Additionally, well-known women often experience more extreme forms of this violence.

Weak legal protection mechanisms, coupled with a lack of systematic education and awareness about this type of violence and its consequences, remain key obstacles in the fight for a fairer digital space.

Drawing on existing analyses and initiatives, primarily from civil society organizations, the SHARE Foundation has prepared a concise overview of key facts concerning the phenomenon of gender-based digital violence. In the introductory glossary, various forms of online violence are described and defined, followed by examples from the experiences of girls and women in Serbia – from so-called revenge porn, which particularly shocked the public due to the age of some perpetrators or its widespread presence on social media, to incidents that represent direct threats to fundamental societal values such as equality and freedom of speech.

In the section on responses and challenges in combating this type of violence, existing legal protection mechanisms are described, along with how they can be used. Current initiatives for amending laws are also presented, particularly concerning the criminalization of the misuse of sexually explicit material, including content created with the help of artificial intelligence.

Finally, concrete steps and their targeted outcomes are outlined in the recommendations section, intended for legislative bodies, the police, judicial authorities, schools, and psychosocial services, as well as for civil society organizations and the wider public—from everyday internet users to influencers who often have a significant impact on them.

The full document “Gender-Based Digital Violence in Serbia: A Review of Trends” is freely available for download.




Summer School: Challenges of regulating the digital environment

The third Digital Rights Summer School was held from 25 to 31 August in Perast, Montenegro, gathering more than 40 journalists, activists and experts in the fields of AI, media, human rights, surveillance, and digital policy.

During five days of talks and workshops, numerous issues regarding the regulation of the digital environment were discussed and analysed. Particular attention was given to the new EU regulations, namely the Digital Services Act, the Digital Markets Act and the Artificial Intelligence Act, and their implications for the Western Balkans. In addition, topics such as collective redress in digital rights cases, links between social justice and technology, counter-surveillance techniques and non-consensual intimate image abuse were also discussed.

Many thanks to all participants, lecturers, and guests for the exciting week we spent together! The program was organised by SHARE Foundation, European Digital Rights (EDRi) and Digital Freedom Fund.

We are grateful to the Gieskes-Strijbis Fonds, the Open Society Foundations Western Balkans and the Balkan Trust for Democracy for their support of the Summer School. The Summer School was also supported by a core grant from the regional project SMART Balkans – Civil Society for a Connected Western Balkans, implemented by the Center for Civil Society Promotion (CPCD) in Bosnia and Herzegovina, the Center for Research and Policy Making (CRPM) in North Macedonia, and the Institute for Democracy and Mediation (IDM) in Albania, financially supported by the Ministry of Foreign Affairs of the Kingdom of Norway.



Digital ecosystem of the Western Balkans: from regulatory gap to systemic approach

Alongside the innovations, technological breakthroughs and shifts in the digital industry, the third decade of the 21st century also marks a kind of institutional milestone, primarily recognized through the establishment of the European Union’s Digital Single Market and an accompanying set of rules aimed at ensuring that all rights and freedoms guaranteed to citizens by the European legal order are also respected in that market.

Brussels aims to achieve this ambitious goal by pivoting towards a comprehensive systemic regulatory approach, which includes a series of acts of particular significance such as the Digital Services Act (DSA), the Digital Markets Act (DMA), and the Artificial Intelligence Act (AIA). These regulations address key structural challenges of the current digital ecosystem and create a significant part of the normative and institutional framework of the European Digital Single Market.

The question is whether the potential “Brussels effect” in the form of increased transparency of large online platforms, an interoperable digital market, or the mitigation of adverse effects from the use of AI systems in the EU could also be felt in its European neighborhood. How much can this milestone be an opportunity for the Western Balkans region to ride the wave of the EU’s regulatory actions and adopt a European normative direction with a systemic and comprehensive approach to regulating the digital industry? Answers to these questions, among other things, depend on the starting point, i.e., the current legislation and the overall normative and institutional framework in these countries.

As shown by a study conducted by the SHARE Foundation in cooperation with regional partners, existing legislation in Albania, Bosnia and Herzegovina, Montenegro, Kosovo, North Macedonia, and Serbia does not provide an adequate regulatory response to the challenges of the digital environment.

The comparative assessment of regulations in the Western Balkan countries relative to the EU has revealed that certain rules applicable to digital services and markets exist within national legislation, while most countries lack comprehensive regulations concerning artificial intelligence. The rules that do exist, however, generally fall short of European standards for several reasons. They often stem from previous generations of regulations and fail to adequately address current challenges. They also have limited regulatory strength because they are scattered across a broad spectrum of laws, strategies, decisions, bylaws, and other legal documents, impeding the systematic regulation of the digital environment.

Mapping and assessing existing regulations from the perspective of key digital challenges in the Western Balkan countries provides a crucial baseline for understanding the new European rules and the current circumstances in the region, one that will also support further research initiatives, monitoring, policy creation, and improvement in these areas. The study analyzed the normative base of relevant acts, mapped new rules, and examined the institutional frameworks of the DSA, DMA, and AIA. These served as benchmarks for comparing the regulatory state in individual Western Balkan countries. Methodologically, the exploratory phase of the research established the value structure of the three acts, which formed the analytical matrix for the comparative analysis and assessment of these countries’ legal frameworks in relation to European standards.

According to the study’s findings, the central value of the DSA is accountability, while the overall value structure of this act is built on reliability, transparency, safety, horizontality, and accessibility. The key value of the DMA is the democratization of the market, operationalized through transparency, accountability, interoperability, mobility, and the demonopolisation/deconcentration of the digital market. Preventing and addressing the adverse effects of AI was identified as the central value pillar of the AIA, supported by transparency, as well as harm prevention and reduction, and oversight of the use of AI systems.

The significance of these values for every citizen and society as a whole is unquestionable. However, the full implementation of the rules established by these acts is necessary to determine the achievability of the benefits and the attainability of the goals set by European policymakers. Therefore, monitoring the implementation and effects of the new regulations is particularly important. The active involvement of all stakeholders—including European institutions, national bodies, experts, researchers, the industry, and the citizens themselves—is essential for transitioning towards a democratic digital ecosystem that fully respects the rights and freedoms of all citizens.

The Western Balkan countries face the task of comprehensively improving existing regulations and paving the appropriate institutional path for systematically addressing the challenges of the digital environment, opening the digital market towards the EU while respecting the criteria of transparency, accountability, fair competition, and full protection of human rights.



Snežana Bajčeta is a SHARE Foundation Researcher in the areas of digital technology, media and journalism.


Non-consensual creation, processing, and distribution of intimate images in the Western Balkans

Legislative and institutional framework


Most of the countries covered in the study have still not directly regulated the non-consensual creation, processing and distribution of intimate images and videos, sometimes colloquially referred to as revenge pornography,1 in their Criminal Codes; the exceptions are Croatia and Slovenia (both members of the EU) and, as of the end of 2023, Montenegro. However, each country has a set of criminal offences that can be invoked in these cases. These offences broadly fall under two categories: 1) protection of bodily and sexual integrity (e.g. sexual harassment, harassment, stalking) and 2) protection against privacy intrusions through technological means (e.g. unauthorised recording/filming, disclosure of personal data). Despite limitations, these offences can provide protection from a normative perspective, though it is difficult to assess whether they offer meaningful protection in practice.

Most recently, Montenegro criminalised the unauthorised taking of photographs as well as the unauthorised publication and presentation of other people’s writings, portraits, and recordings through two articles in the Criminal Code. This change is a significant step forward in how sexually explicit non-consensual content is regarded and addressed. The criminal acts carry a potential prison sentence of between six months and two years. In Serbia, the criminal acts of sexual harassment and unauthorised recording are prosecuted on the basis of private criminal charges, meaning that there is no assistance from the police or prosecutor (the affected party is responsible for proving the crime occurred) and, more importantly, that the charges need to be filed within three months of the person finding out about the content. In all countries except Serbia, these cases are prosecuted by the General Prosecution office, and there are dedicated police units for countering cybercrime. It is unclear whether these units also investigate cases of distribution of non-consensual intimate materials or, more importantly, offer assistance in enforcement procedures (e.g. requests for removal of the content). In some countries, such as Serbia and Albania, there are special police units dedicated to countering domestic violence that have a certain level of expertise in dealing with gender-based violence (GBV) cases.

The enforcement of judgments is hampered by severe problems, as digital platforms are reluctant to cooperate with state authorities. CERTs (Computer Emergency Response Teams) and similar bodies are often involved in these procedures (e.g. requests to block or remove content from digital platforms), with varying levels of success and expertise. In general, there is little to no intervention and assistance from human rights oversight bodies such as the Gender Equality Commissions or Ombudspersons, who are still not particularly invested in this societal issue.

Main challenges in the region


  • Most countries lack institutional competence in official bodies such as the police and judiciary for dealing with gender-based online violence (GBOV) cases, which often leads to a lack of sensitivity or sense of urgency when such cases are reported.
  • Even in cases where the law tries to catch up with the technology and the non-consensual processing and distribution of intimate images can be punished, the creation of such materials is still not recognized as a crime, allowing perpetrators to escape punishment (in the case of the Telegram groups in Serbia, the administrator of the groups was the only one prosecuted and was ultimately released).
  • Knowledge of gender-based online violence is insufficient across all parts of society, from public authority figures and educational institutions to the broader public, so these issues are not problematised adequately.
  • When such cases are covered in the media, the reporting is mostly sensationalist and can retraumatise survivors through the unethical reproduction of the content, especially if the case involves celebrities or other public figures.
  • In most countries, educational institutions do not discuss issues such as sexual harassment or gender-based violence in either the offline or the online context, leaving students without adequate knowledge on the topic.
  • The nomenclature surrounding these crimes is also problematic, associating them both with revenge (some cases do not feature an element of revenge at all, and this wording might suggest the survivors have done something that warrants retaliation) and with pornography (the majority of people who have experienced image-based sexual harassment or violence did not consent to the material being made and/or distributed), thus contributing to further stigmatization.
  • Gender-based online violence is often seen as a problem endemic to the online sphere, rather than as a broader societal issue stemming from cultural attitudes towards violence against women, sexual harassment, and gender inequality.
  • Civil society organisations and women’s rights organisations working on these topics are usually underfunded, rely on short-term projects and grants to offer support to survivors, and are seldom invited to contribute to official discussions.
  • The stigmatisation of survivors leads to the underreporting of such crimes, which makes it more difficult to understand the widespread nature of such crimes as well as to come up with effective ways to curb them.

What can be done?


In Slovenia and Croatia the Criminal Code recognises the distribution of intimate materials without consent, and the Croatian Criminal Code moreover addresses deepfakes and other forms of image-based sexual abuse more specifically. Widespread distribution of non-consensual content also carries a higher penalty in these countries. In some countries (North Macedonia) policy changes are taking place that should ensure the compatibility of the Criminal Code with the Istanbul Convention, while in others specific amendments to the Criminal Code are being proposed to deal with such crimes directly (Montenegro). This is an important moment to open up discussion with legislative bodies about non-consensual image-based abuse and to ensure that it is taken into consideration when drafting future laws. Although it might not always be necessary to include a direct provision, as in the case of Slovenia or Croatia, it is important to ensure that non-consensual image-based sexual abuse is broadly covered in national legislation, and that certain problematic legal requirements are amended (e.g. in the cases of Montenegro and Serbia, private criminal charges and the extremely short three-month preclusive period). Additionally, civil society organisations, including those offering shelters, legal support, and awareness campaigns, play a vital role in providing comprehensive support and raising awareness about these critical issues.

Regardless of the policy changes, it is of utmost importance to undertake in-depth case law research to understand the previous court rulings and legal reasoning in digital GBV cases before proposing any legislative changes. This research could in fact show that Criminal Codes and case law are sufficiently equipped to offer protection, in which case the advocacy efforts should focus on strategic litigation, raising awareness, and sensitisation of judicial authorities and law enforcement. 

As for good practices, Albania, for example, has a geographically balanced spread of women-led groups and resources, while some of the women-led initiatives in Bosnia and Herzegovina, e.g. the women of Kruščice, have shown that community activism can lead to significant changes, which can motivate other communities to work on such issues too. In Greece, certain cases (e.g. the S. Panagiotopoulos case) raised social awareness of non-consensual image-based sexual abuse and represented a stepping stone for survivors to seek justice (as recently happened in a case in Thessaloniki). The penalties in these cases are too lenient to serve as adequate deterrence, but they do serve as a yardstick for similar cases. As in Serbia, survivors in Greece are encouraged to reach out to CSO support structures before engaging with the police, so that trained lawyers can provide trauma support early and throughout the process.

In Montenegro, the Women’s Rights Centre organises trainings for public servants who work on cases of the non-consensual creation and distribution of sexual materials, and there is news of upcoming legislative changes that would allow state-wide filtering and blocking of online content, including the distribution of non-consensual image-based sexual abuse materials. NGOs in North Macedonia, working under the Platform for Gender Equality, are lobbying to change not just the legal framework but also public narratives (e.g. victim blaming) around non-consensual image-based sexual abuse cases through public protests and demonstrations,2 and they also provide survivors with free legal aid and psychological support.

In March 2022, two people were convicted in North Macedonia on charges of production and distribution of child pornography. Kosovo’s 2010 Law against domestic violence is currently going through a process of amendment, with a new focus on countering violence against women online. This could create momentum to counter GBV in the country more effectively. Serbia’s Strategy for Prevention and Combating Gender-Based Violence Against Women and Domestic Violence (2021-2025) recognises ‘revenge porn’ as a form of GBV requiring more attention and awareness-raising. Existing experience and decades of learning and developing alternative systems to counter GBV and support women have the potential to bring change, raise awareness, and offer immediate help to survivors.

Joint efforts to educate the public and relevant stakeholders in the region on these issues are also a necessary advocacy step. Demystifying the concepts surrounding GBOV and approaching them adequately can open avenues for understanding and collaboration between civil society and other groups such as governments, the media, and the private sector. Arguing for improved knowledge on this topic will be beneficial for shaping future generations and for underlining the importance of strong and appropriate regulatory mechanisms and frameworks.



1 This term can be misleading and insulting, as it both invokes revenge, implying there is something to take vengeance for, and paints the materials as pornography, which is not the case in the majority of instances. A more appropriate term is non-consensual image-based sexual abuse.

2 In February 2021, hundreds of protesters gathered outside North Macedonia’s Interior Ministry to call on the government to crack down on private messaging groups sharing unauthorised and often explicit photographs and videos of women and girls. One such group, the Telegram group Public room with more than 7,000 members, shared thousands of private photos and videos of women and girls, including their private data such as addresses, phone numbers and ID details (doxxing).


Mila Bajić is the Lead Researcher at SHARE Foundation with a focus on the relationship between new media, technology and privacy.

Non-consensual creation, processing, and distribution of intimate images in the Western Balkans is partly supported by the Open Society Foundations Western Balkans.


Apply for the 2024 Digital Rights Summer School!

The applications for the 2024 Digital Rights Summer School in Perast, Montenegro are now open!

The School takes place from 25 to 31 August 2024, and this year’s program is designed for enthusiasts based in Southeast Europe who are passionate about digital rights and eager to learn more about the latest developments in this field.

During the school, we’ll explore the impact of new technologies on human rights through lectures and talks by regional and international digital rights experts. You’ll take part in workshops and discussions on digital markets and online content regulation, information warfare, online harassment, and AI-enabled surveillance and policing, while exploring their potential effects on regional affairs and strategies for advocacy. As a participant, you will also have the opportunity to connect, share experiences and collaborate on digital rights initiatives across the region. These are just some of the topics through which you will gain both theoretical and practical knowledge!

This program is organised by SHARE Foundation, European Digital Rights (EDRi) and Digital Freedom Fund – it offers a unique opportunity to learn from leading experts and network with other professionals in the field. Join us for an engaging program in the beautiful setting of Perast. Accepted applicants will be provided with travel and accommodation during their stay.

To apply, please fill out the form at the following link:
https://crm.shareconference.net/drss2024application 

The application form closes on 15 May at 17:00 CEST (Belgrade time). Acceptance decisions will be made in early June.

Please feel free to reach out at [email protected] if you have any questions.




Elections on the information margin

Preliminary analysis of media content for the most visited online media in Serbia: 1 November – 17 December 2023 

On Wednesday, November 1, the President of the Serbian Parliament, Vladimir Orlić, announced local elections in 65 cities and municipalities in Serbia, including Belgrade. A few hours after Orlić’s traditional protocol of signing the Decision on calling elections in the hall of the National Assembly, the message “Long live Serbia! Happy elections!” arrived from the digital address of the President of the Republic of Serbia. Aleksandar Vučić’s cyber address also announced snap parliamentary elections, starting the pre-election campaign that lasted until midnight on December 14. Meanwhile, on November 16, provincial elections were announced.

Research shows that the importance of online media as a source of information keeps steadily increasing, both globally and in Serbia. Although television is still the leading source of information, the role of online media, currently in third place, requires special research attention. Given the importance of free, comprehensive and balanced information for the democratic quality of the entire election process, this research focuses on digital media and the information environment they create. The aim of the research is to identify and explain the key features of the informational content published by the leading online media in Serbia, in order to determine to what extent and in what way citizens were informed about the elections and key social and political issues in the pre-election period, how the central themes were defined in the digital environment, and how specific forms of support to the ruling structures were shaped.

Elections 2023: Preliminary Report (.pdf)



Mila Bajić is the Lead Researcher at SHARE Foundation with a focus on the relationship between new media, technology and privacy.

Snežana Bajčeta is a SHARE Foundation Researcher in the fields of digital technologies, media and journalism.


EU proposal of the AI regulation adopted

Late into the night on Friday, December 8, the lengthy negotiations on the final version of the EU artificial intelligence regulation (AI Act) were concluded, with the first of a dozen technical meetings expected this week to specify the details of the law’s implementation.

According to initial reactions, the adopted solutions did not fully meet the expectations of human rights organizations and activists, nor did they satisfy industry lobbyists and security-focused politicians.

Among other things, it is mentioned that “predictive” systems will only be partially prohibited, meaning that not all applications of artificial intelligence systems in policing and “crime prediction” are classified as unacceptable risks – a significantly weaker protection than what members of the European Parliament voted for this summer. The final ban includes some predictive systems based on “personal traits and characteristics”, but not geographic crime prediction systems already used by police forces across Europe. Critics note that such a partial ban allows for the creation of additional exemptions in the future.

Of particular concern is the possibility that any application of artificial intelligence systems in the context of “national security” would be entirely exempt from the regulation’s scope, including bans on unacceptable risks and transparency requirements.

The most challenging part of the negotiations concerned the bans, that is the classification of AI systems as posing unacceptable risk. The adopted proposal, as reported by those with insight, bans real-time remote biometric identification (RBI) in publicly accessible places—except when used to search for specific suspects or victims of certain crimes, to prevent “specific, substantial, and imminent threats to the life or physical safety of natural persons or a specific, present threat of a terrorist attack,” as well as for the “targeted search for specific victims of abduction, trafficking in human beings and sexual exploitation of human beings as well as search for missing children.”

The use of RBI needs to be approved by a judicial or other independent authority and is limited in time and space; it cannot include constant comparison of all people in public spaces against full police or other databases. In urgent situations, judicial authorisation has to be obtained ex post within 24 hours.

Post-Remote Biometric Identification (not in real time, but on video footage) is not banned, but is now a high-risk category and is only permitted with prior judicial authorisation, or if it is strictly necessary in an investigation for the targeted search of a person convicted or suspected of having committed a serious criminal offence that has already taken place. Member States may introduce more restrictive laws on the use of Post-RBI systems.

Biometric categorisation systems “that categorise natural persons based on their biometric data to deduce their political opinions, trade union membership, religious or philosophical beliefs, sex or sexual orientation from this biometric data” are banned.

Restrictions have been introduced regarding emotion recognition and some applications of high-risk AI systems in the private sector, while the criteria for risk classification have been expanded. The publication of the adopted version of the regulation is expected in the coming months.




European Promotion of the SHARE Foundation’s Book on Biometric Surveillance

One of the most comprehensive studies on the use of biometric systems worldwide, the SHARE Foundation’s book “Beyond the Face: Biometrics and Society” was presented on Monday, December 4, in Berlin and on Wednesday, December 6, in Brussels.

The promotion brought together the community for the protection of digital rights and freedoms at the Tactical Tech premises in Berlin, where the authors and attendees discussed the legal and social consequences of mass biometric surveillance.

In Brussels, a discussion of the study’s key findings was organised in the European Parliament, with opening remarks given by Members of the European Parliament Viola von Cramon and Sergey Lagodinsky. As an urgent danger, the MEPs pointed to a potential precedent in Serbia, where the application of biometric surveillance technology could lead towards the creation of a dystopian surveillance society. Patrick Breyer, also a member of the European Parliament, took part in the discussion.

Coincidentally, the book promotion in Brussels took place during the final stages of the trilogue on the new European regulation on artificial intelligence (AI Act). In fact, after the event MEP Lagodinsky moved on to the negotiations on banning unacceptably risky systems, carrying his own copy of the book. The AI Act will have far-reaching consequences for the regulation and use of biometric surveillance systems around the world, hence the importance of such a detailed analysis of how biometric surveillance is applied in different countries, and of contesting problematic provisions of the future European law that could legitimise these practices.

The study on the social consequences of mass biometric surveillance provides a detailed description of technologies from various manufacturers used by public and private actors for surveillance. It includes a comparative analysis of regulations governing this field in the US, the EU, and a range of countries in Africa, Asia, and Latin America. Additionally, it explores some practical cases of biometric surveillance application, from Myanmar and the United Kingdom to New York and Belgrade, for border control, public space monitoring in major cities, or suppression of opposition activities.

Spanning over 300 pages, divided into three main segments—Technology, Law, Practice—the book acquaints readers with the current global experience of the conflict between fundamental human rights and the profit-driven biometric surveillance industry. Despite abundant evidence to the contrary, authorities worldwide still believe that these systems contribute to the security of society.

The book is freely available, currently only in English. The editors of the publication are Ella Jakubowska (EDRi) and Andrej Petrovski & Danilo Krivokapić (SHARE). The authors of the texts are Bojan Perkov (Technology), Jelena Adamović and Duje Kozomara (Law), Mila Bajić and Duje Prkut (Practice).




Spyware attack attempts on mobile devices of members of civil society discovered

SHARE Foundation warns of the disastrous impact of misuse of technology against the critical public in Serbia

On October 30, two members of civil society from Belgrade received an alert from Apple that they were potential targets of state-sponsored technical attacks. Thanks to good cooperation with civil society organisations in Serbia, they contacted the SHARE Foundation immediately after receiving the warning and asked for their devices to be checked to determine whether they had been attacked with any known spyware.

After the SHARE Foundation team, in cooperation with Internews, received confirmation from Apple representatives that the alerts were authentic, mobile devices were analysed to determine whether they had traces of spyware infection, among which the most well-known are Pegasus and Predator. For the final confirmation, the SHARE Foundation team turned to international organisations Access Now and Amnesty International, which have high expertise in the field of digital forensics.

Based on the reviewed data, these two respected organisations confirmed that traces of an attack attempt that took place on 16 August 2023 were found on both mobile devices. Both expert organisations reached the same finding: in the initial phase, the attack was attempted via a vulnerability in the iPhone’s HomeKit functionality. The Pegasus spyware has previously been linked to multiple exploits targeting HomeKit, including PWNYOURHOME.
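For illustration only: device checks of this kind are commonly performed with open-source forensic tooling such as Amnesty International’s Mobile Verification Toolkit (MVT). The sketch below is an assumption-laden example and not the workflow used in this case, which SHARE is not disclosing; the backup path and indicator file are placeholders, and the mvt-ios command and flags follow MVT’s public documentation at the time of writing and may change between releases.

# Sketch: check a decrypted local iPhone backup against published
# indicators of compromise (IOCs) using Amnesty International's
# open-source Mobile Verification Toolkit (mvt-ios).
# Paths and the IOC file below are placeholders, not real case data.
import subprocess
from pathlib import Path

BACKUP_DIR = "decrypted-backup/"   # decrypted iTunes/Finder backup (placeholder)
IOCS_FILE = "pegasus.stix2"        # e.g. publicly released Pegasus indicators (placeholder)
OUTPUT_DIR = Path("mvt-output/")
OUTPUT_DIR.mkdir(parents=True, exist_ok=True)

# Run MVT's backup check; results are written as JSON files into OUTPUT_DIR.
subprocess.run(
    ["mvt-ios", "check-backup",
     "--iocs", IOCS_FILE,
     "--output", str(OUTPUT_DIR),
     BACKUP_DIR],
    check=False,  # exit-code conventions differ between MVT releases; inspect the output instead
)

# MVT typically writes *_detected.json files when an indicator matches.
detections = sorted(OUTPUT_DIR.glob("*_detected.json"))
if detections:
    print("Possible indicators of compromise matched:")
    for f in detections:
        print(" -", f.name)
else:
    print("No known indicators matched (this alone does not rule out compromise).")

Any detection output still needs careful manual review, ideally with expert help such as the organisations named above; the absence of matches against known indicators does not prove that a device is clean.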

The SHARE Foundation warns that spyware attacks on representatives of the critical public have a disastrous impact on democracy and human rights, especially in the pre-election period. The use of spyware is illegal and incompatible with democratic values.

We remind the public that these and similar tools for technical attacks on mobile devices are used by non-democratic regimes around the world to spy on members of the opposition, civil society, independent media, dissidents and other actors working in the public interest. Such activities threaten the freedom of expression and association, as well as the right to privacy and secrecy of communication guaranteed by domestic and international law.

The SHARE Foundation invites media and civil society representatives who may have received the same message from Apple to contact the foundation to verify the warning.

NOTE: In accordance with the wishes of the members of civil society who were the target of the attack, as well as with security measures, SHARE Foundation will not provide additional details about this incident.

NOTE 2: The part of the text related to device analysis and vulnerabilities that were targeted was amended on 29 November 2023 at 12:38 for precision.


