Protecting User Rights Online: The Impact of Regulation and the Role of Non-State Actors
The following text summarizes the research into the experiences of civil society organizations, researchers, activists, and media from Bosnia and Herzegovina, Montenegro, and Serbia, with a special focus on the spread of harmful and illegal content online. The findings indicate that the existing policies of major online platforms in the region are applied inconsistently, while users are left without effective appeal and protection mechanisms. Detailed analyses of specific experiences, derived from a series of interviews with key actors, can be found at the following links: 1, 2, 3, 4, 5, 6, 7.

Photo: “Zašto ne”
Introduction
Large online platforms, especially social media networks, have become central actors in shaping the public space and one of the main sources of information for citizens. They have long assumed a primary role in the distribution and consumption of content, while media and content creators increasingly depend on platforms to reach their audiences. Although this development brings new opportunities for communication and the exchange of information, it also opens space for the spread of illegal and harmful content, including disinformation, hate speech and other forms of violence.
In the Western Balkan countries, including Bosnia and Herzegovina, large online platforms are not subject to legal obligations that would preserve the integrity of the online sphere or ensure transparency and accountability for the harmful effects of their actions.
By adopting the Digital Services Act (DSA), the European Union established a regulatory framework that seeks to respond to these challenges, emphasizing the important role of non-state actors, especially civil society organizations, in protecting fundamental rights and the most vulnerable groups.
Along with advocating for the establishment of an appropriate legal framework modeled on the DSA, it is necessary to systematically work on strengthening the capacity for its future application in the region. This implies familiarizing a much wider circle of actors with the existing framework for the regulation of online platforms, its goals, the obligations it imposes on the platforms and the rights it guarantees to users.
It is especially important to improve the understanding of the specific mechanisms that this regulatory framework provides, as well as the way in which it can respond to problems that are recognized in practice, such as the spread of disinformation, non-transparent content moderation or threats to fundamental rights. In this context, it is crucial to clearly define and strengthen the role of civil society organizations, researchers and the academic community, and prepare them in a timely manner for future responsibilities – from monitoring, research and evidence gathering, to active participation in the implementation of the regulatory framework. In this way, the regulation of online platforms in the region will not be reduced to the formal adoption of regulations, but will result in a real and sustainable improvement of the integrity of the online space.
Experiences with online platforms in the region
In the period from September to December 2025, the Citizens’ Association “Why not” (“Zašto ne”) conducted 17 interviews with various actors from Bosnia and Herzegovina, Montenegro and Serbia: representatives of civil society organizations and the academic community, as well as activists who are exposed to harmful content or who represent the interests of groups whose rights are often violated in the online space. Their experiences are presented in a series of articles published on the Association’s website www.zastone.ba. The articles point to the most common forms of harmful content, rights violations and abuses in the online space encountered by vulnerable and marginalized categories of society: children, women and members of the LGBTIQ+ community. They also address the responsibility of online platforms for enabling such abuses, as well as for spreading content that can have broader social consequences, such as disinformation. Finally, these analyses shed light on non-transparent platform practices, such as those that reduce the availability and visibility of media content or prevent access to the data needed to research the platforms themselves and their impact on society.
Digital technologies play a major role in amplifying and spreading gender-based violence. Cases recorded in the region include the sharing of women’s intimate photos or personal information without their consent, unwanted messages and sexual harassment, hate speech, and open threats of violence. Content creators who deal with feminist topics often face online abuse and harassment, mostly through misogynistic and aggressive comments. Complaints to the platforms about these violations, which also breach the platforms’ own policies and rules, mostly remain unanswered. The lack of protection mechanisms on the platforms, combined with inadequate legal protection, results in women in the Western Balkan countries feeling powerless and insecure in the online space.
The LGBTIQ+ community faces various forms of online violence, discrimination, hate speech, insults and threats on a daily basis. Malicious actors often abuse online platforms for coordinated attacks that threaten the security of members of this community or intimidate them, and that hinder the activities of activists and organizations promoting their rights. On paper, the platforms’ policies prohibit abuse and hate speech based on sexual orientation, but the experiences of members of the LGBTIQ+ community in these countries show that the platforms either do not react to reports at all, or send generic responses stating that the content does not violate their policies.
Online platforms endanger the welfare and safety of children and minors in various ways. The platforms are, however, considerably more responsive when it comes to recognizing and removing sexually explicit material. In Bosnia and Herzegovina, for example, there are direct channels of communication with law enforcement for reporting such content, as well as cases of sexual exploitation and abuse of children. On the other hand, when it comes to content without elements of a criminal offense, communication with the platforms in the countries of the region is inadequate and their reaction is absent. Children are increasingly exposed to online violence on platforms, including peer violence, as well as to various types of harmful and inappropriate content, from explicit pornography to hate speech. Experts from the region also agree that the digital literacy of parents and legal guardians is generally low, which further contributes to children remaining unprotected from the serious risks they face in the online world.
The nature and increasingly significant role of online platforms in receiving and disseminating information, as well as their design and way of functioning, make them ideal ground for spreading disinformation. As in the rest of the world, disinformation campaigns have real negative consequences for the citizens of the region, exposing them to health risks, such as reduced vaccination rates, or to financial ones, through scams that can lead to the loss of money or important data. In addition, the spread of disinformation seriously threatens democratic processes and the protection of fundamental rights; numerous hate speech and election manipulation campaigns are based precisely on disinformation. Although large online platforms declaratively apply measures against disinformation in their policies, they nevertheless contribute significantly to its spread, owing, among other things, to algorithms that favor the amplification of disinformation and to business practices such as content monetization.
The non-transparent practices of online platforms increasingly affect the visibility and availability of content published by users and the media. Reduced visibility, content removal, the inability to publish, and even the blocking of entire accounts on social media networks are frequent experiences in the region, typically without any feedback on the reasons for such moves by the platforms. In addition to this lack of transparency, a further problem is the absence of functional communication channels and mechanisms for appealing the platforms’ content moderation decisions. Media outlets, especially independent and local ones, depend to a large extent on platforms to reach audiences and readers. At the same time, they have no control over the reach of their content, since it is the platforms that, through their algorithms, determine which news or content will be visible. The experiences of media from the region point to a steady decline in the organic reach and visibility of their content.
Research conducted by the academic community and civil society is key to understanding how online platforms function, how they are misused, and how they contribute to the violation of users’ rights. However, the platforms are less and less open to providing access to data for research purposes, imposing disproportionate conditions and costs for the tools or software needed to access such data. This is a particular problem in smaller markets and language areas, such as the Western Balkans, where domestic scientific and research institutions are unable to allocate funds for such research or do not recognize its importance. This, in turn, leads to a lack of research on some aspects of the operation of large online platforms; without adequate research, evidence of their harm remains anecdotal and cannot be considered relevant for investigations or for drawing any firm conclusions about the platforms’ harmful effects.
The need for regulation of online platforms
Large online platforms generally have declarative policies and community guidelines that should protect certain categories of users and prevent the spread of illegal and harmful content. However, the experiences of the actors interviewed by “Zašto ne” clearly show that in the Western Balkans the processes and systems of content moderation are not effective. One reason is that the platforms do not invest enough in human resources: moderation relies mainly on algorithmic detection of problematic content, which is not precise enough and does not take language or context into account. At the same time, automated moderation often leads to the unjustified removal of content, such as some media articles, which negatively affects freedom of expression.
In both cases, the only protection mechanism left to users is to report violations to the platforms themselves, a mechanism that in practice has little effect: reported content usually remains on the platforms, while complaints about unfairly removed content go unanswered and unexplained. The absence of a legal obligation for online platforms to establish official channels of communication and cooperation in the countries of the region results in one-way communication, which often ends in automatic, boilerplate responses.
The consequences of this regulatory gap become greater and more significant if we take into account the global trend of abandoning or softening the existing policies of large online platforms, such as Meta’s decision at the beginning of 2025 to remove numerous restrictions, including the previous ban on claims of mental illness or “abnormality” based on gender or sexual identity. Other major platforms are starting to follow this practice.
When it comes to fighting disinformation, one of the few transparent practices in the region is Meta’s third-party fact-checking program. This program, which Meta implements in partnership with independent organizations in all three analyzed countries of the region: BiH, Montenegro and Serbia (1, 2), has a positive impact on the integrity of the information environment.
However, at the beginning of 2025 this program was abolished in the United States, and other large platforms are also showing a tendency to withdraw measures established to suppress disinformation.
Even the measures that are declaratively still in force are not applied consistently in the region, or are not applied at all. A case study conducted by “Zašto ne” in cooperation with the organization “What to Fix” showed that Meta enables the monetization of fake news sources in Bosnia and Herzegovina, in direct contradiction to the platform’s own rules. According to those rules, only content creators who adhere to Meta’s policies and community standards, which prohibit publishing disinformation, are allowed to monetize content.
The space in which the platforms, by applying their own rules, at least somewhat mitigated the harmful effects of certain content and practices has therefore begun to narrow further. A legal framework that imposes clear obligations and responsibility on platforms is a prerequisite for a safer online space in which human rights and freedoms are protected.
European approach to online platform regulation
The Digital Services Act (DSA) is primarily concerned with the dissemination of illegal content on platforms and their non-transparent, arbitrary moderation, seeking to address both failures to remove unlawful material and excessive removal of content.
When it comes to illegal content, every online platform that provides its services on the territory of the European Union is obliged to ensure:
- Cooperation with competent institutions, including a legal representative, a contact point and functional channels of communication. Stakeholders from the region often cite the lack of direct communication with the platforms as one of the main reasons for the absence of an adequate response to the most common problems citizens face in the online space.
- The possibility for every user to report illegal content in a straightforward way. Platforms are then obliged to respond quickly to such notices, to provide feedback, and to offer dissatisfied users the possibility of appeal. Currently, users in the region can only report content that violates the platforms’ own rules and community standards. This mechanism would make it possible to report cases of violence, abuse, threats or hate speech based not on platform policies but on domestic laws, to which the platforms would be obliged to respond. If a platform failed to respond, users could appeal to the competent authorities, which is not currently possible. (An illustrative sketch of the minimum elements such a notice must contain follows this list.)
- Cooperation with trusted flaggers, organizations that have the expertise and competence to detect and identify illegal content. Under the DSA, the status of trusted flagger is awarded by Digital Services Coordinators (the competent national authorities responsible for implementing and supervising the application of the DSA), which obliges the platforms to provide these organizations with special channels for reporting content and to act on their notices as a matter of priority. The existence of such a mechanism in the domestic legislation of the countries of the region would give organizations dealing with the protection of certain categories of users, e.g. organizations for the protection of children, women’s or LGBTIQ+ organizations, a direct mechanism to protect their users in the online world as well.
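For illustration, the DSA itself (Article 16) prescribes the minimum elements such a notice must contain: a sufficiently substantiated explanation of why the content is considered illegal, its exact electronic location, the notifier’s name and email address (with an exception for certain offenses involving the sexual abuse of children), and a statement that the notice is submitted in good faith. The following sketch simply models these elements as a data structure; the class, field and function names are our own illustrative assumptions, not any platform’s actual reporting interface.

```python
from dataclasses import dataclass

# Illustrative model of the minimum elements of a DSA Article 16 notice.
# All names here are hypothetical; real platforms expose their own
# reporting forms and channels.
@dataclass
class IllegalContentNotice:
    explanation: str      # substantiated reasons why the content is illegal
    content_url: str      # exact electronic location of the content
    notifier_name: str    # name of the submitting person or entity
    notifier_email: str   # contact email of the submitter
    good_faith: bool      # confirmation the notice is accurate and complete

def validate_notice(notice: IllegalContentNotice) -> list[str]:
    """Return the Article 16 elements missing from a notice; an empty
    list means all required elements are present."""
    problems = []
    if not notice.explanation.strip():
        problems.append("missing explanation of alleged illegality")
    if not notice.content_url.startswith(("http://", "https://")):
        problems.append("missing or invalid exact URL of the content")
    if not notice.notifier_name or not notice.notifier_email:
        problems.append("missing notifier name or contact email")
    if not notice.good_faith:
        problems.append("missing good-faith statement")
    return problems
```

A notice containing all of these elements is considered to give the platform actual knowledge of the content in question, which is what triggers its obligation to act.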
Furthermore, the EU legal framework imposes on platforms the obligation to be more transparent about their content moderation practices:
- Any measure that results in the removal of content, the disabling of access to it, the reduction of its visibility, or the suspension of the user’s account, must be explained through a statement of reasons provided to the affected user. These statements are publicly available in the DSA Transparency Database. (A brief sketch of how researchers might analyze these public data follows this list.)
- Platforms are also obliged to provide users with easily accessible mechanisms for appealing content moderation decisions. The introduction of these obligations would give domestic media and content creators answers they can currently only speculate about, as well as more predictability, protection and visibility than they have today. All of this would benefit media pluralism and freedom of information in the Western Balkans.
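As a hedged illustration of what this public transparency enables, the sketch below counts moderation decisions per platform in a locally downloaded dump from the DSA Transparency Database. The file name and the “platform_name” column are assumptions made for the example; the database’s own documentation should be consulted for the actual download formats and schema.

```python
import csv
from collections import Counter

# Minimal sketch: tally statements of reasons per platform in one
# locally downloaded CSV dump from the DSA Transparency Database.
# "platform_name" is an assumed column name, used for illustration only.
def decisions_per_platform(csv_path: str) -> Counter:
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["platform_name"]] += 1
    return counts

# Hypothetical usage with a locally saved dump file:
# print(decisions_per_platform("sor-daily-dump.csv").most_common(10))
```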
Platforms’ responsibility for how their own systems contribute to online risks is regulated, among other things, through their obligation to:
- Assess the risks that their design, algorithmic systems, policies, terms and conditions, or the ways in which they are used pose for the spread and amplification of illegal content, for violations of users’ fundamental rights and threats to their well-being and security, and for the spread of content that may negatively affect electoral processes or public health.
- Apply measures to mitigate those risks, for example by improving their content moderation practices in local languages, adjusting algorithmic systems that favor the amplification of harmful content, applying measures to protect children from harmful and inappropriate content, or, to discourage disinformation, applying demonetization and other measures provided for in the Code of Conduct on Disinformation.
The DSA also requires transparency from the platforms, obliging them to provide access to data for research that contributes to the detection, identification and understanding of systemic risks in the EU, and to the assessment of the adequacy and effectiveness of the measures taken to mitigate those risks. Access to publicly available data is not limited to academic or research institutions and may also be requested by non-profit organizations.
The European regulatory framework – at least in the form in which it is intended to be applied in the European Union – therefore offers systemic and proportional solutions for abuses and violations that are recognized as key and most prevalent in the countries of the region.
However, significant challenges remain in adopting and implementing similar legal frameworks locally, ranging from ensuring that regulation of online platforms does not infringe upon fundamental rights and freedoms, to addressing institutional challenges.
Moreover, creating the necessary space and capacity for meaningful participation by civil society actors and the academic/research community, whose role in platform regulation is essential, will be particularly demanding.
The role of civil society, the academic community and researchers in the regulation of online platforms
Civil society organizations play a crucial and multifaceted role in the implementation of the Digital Services Act in the European Union, including:
- representing the interests and rights of users, particularly vulnerable groups, and drawing attention to violations of their rights online;
- reporting illegal content to platforms, as trusted flaggers;
- researching and documenting harmful, non-transparent or manipulative practices on platforms;
- supporting strategic litigation and providing legal assistance to users;
- developing guidelines, codes of conduct and best practices.
Several European countries have established dedicated cooperation mechanisms between competent authorities and other stakeholders. In Germany, for example, the Digital Services Coordinator is supported in applying the DSA by an Advisory Council, an independent expert body comprising representatives of civil society, the academic community, professional associations, and industry. Regular consultations with this body give the authorities insight into current trends on platforms, research findings, and regulatory priorities in the public interest. In Slovakia, although cooperation is not as formalized as in Germany, the Digital Services Coordinator regularly consults civil society organizations, the academic community, and experts gathered around a platform for the promotion of media literacy. The coordinator has also collaborated with research organizations to collect evidence on issues such as the role of online platforms in election manipulation.
The countries of the region also recognize the need for the participation of various actors in the application of domestic frameworks for the regulation of online platforms. In this context, the role of the civil sector will be particularly significant, especially given the need for civil society organizations to be actively involved in the process of harmonization with European legal and regulatory frameworks.
However, based on the conversations that “Zašto ne” conducted with interviewees, as well as the consultative meetings held in Sarajevo, Podgorica, and Belgrade, it can be concluded that civil society in the region is still not sufficiently prepared to engage actively in these processes.
First and foremost, the level of knowledge and understanding of these issues remains very low. This is largely due to the limited capacities of the civil sector which, according to numerous organizations working in the field, are currently focused primarily on the provision of essential services—most often addressing violations of users’ rights in the offline context. There are simply not enough organizations capable of assuming the role of future trusted flaggers, and even those that possess the necessary expertise generally lack adequate capacity, despite a clear interest in participation.
Similarly, a fundamental understanding of “digital” issues remains insufficiently developed within the research community. Rigorous research and the collection of robust, scientifically grounded evidence are essential for understanding how online platforms—through mechanisms such as algorithm design—contribute to risks including the spread of disinformation.
However, universities in the region have yet to fully recognize the importance of such research. Complicated administrative procedures, underdeveloped inter-university and cross-sector cooperation, and limited technical and financial capacities have created a discouraging environment for academic researchers, who often lack both the motivation and the opportunity to engage with these issues.
Recommendations
1. Competent domestic institutions and decision-makers should:
- ensure the continuous participation of civil society organizations and the expert community in the process of harmonizing domestic legislation with the Digital Services Act (DSA);
- formalize cooperation with a broader range of relevant actors, including civil society organizations and representatives of the academic and media communities, by establishing a permanent advisory forum or a similar mechanism to provide expert support to competent institutions;
- with the technical support of the European Commission, develop a comprehensive strategy that offers a systemic approach to empowering actors involved in the implementation of the regulatory framework, strengthening their capacities, and ensuring their long-term sustainability, including organizations that could assume the role of trusted flaggers;
- initiate further discussions and develop sustainable financing models to ensure long-term, stable, and predictable support for the participation of relevant actors in the implementation of the regulatory framework; in this context, consider opportunities available to candidate countries through EU instruments, particularly within the framework of the European Democracy Shield initiative;
- through regional cooperation mechanisms, establish joint platforms for knowledge exchange, experience sharing, and capacity development, with the aim of improving the consistent and effective application of digital policies across the region.
2. Civil society should:
- actively raise awareness among the public and other civil society organizations – particularly those operating at the local level – about the importance and principles of online platform regulation and its impact on user rights, the media environment, and democratic processes;
- advocate for the harmonization of national legislative frameworks with the DSA, while insisting on transparency, the protection of fundamental rights, and the meaningful involvement of civil society in policymaking and implementation processes;
- strengthen cooperation between civil society and the academic community by developing joint projects and research initiatives;
- proactively seek and develop partnerships (including in cooperation with the academic community) with organizations, networks, and experts from the region and the European Union, with the aim of enhancing professional capacities, improving access to knowledge, and ensuring participation in relevant European initiatives and projects.
3. Academic institutions should:
- integrate topics related to the regulation of online platforms into academic curricula;
- work systematically to raise awareness within the academic community of the importance of research on online platforms, the need for cooperation, and the available mechanisms for data access;
- strengthen cooperation between the academic community and civil society organizations through joint research projects, and encourage and support interdisciplinary research aimed at analyzing systemic risks such as disinformation, algorithmic amplification, content moderation, and impacts on fundamental rights;
- strengthen regional partnerships and encourage joint research initiatives.
(Marija Ćosić and Maida Ćulahović, “Zašto ne”)
This publication was funded by the European Union. Its contents are the sole responsibility of “Zašto ne” and do not necessarily reflect the views of the European Union.
