Accountability of Major Online Platforms: Harmful Content, Weak Moderation, and the Absence of Regulation in the Western Balkans
Although they have become an indispensable part of everyday life, major online platforms in the Western Balkans often fail to respond to hate speech, disinformation, and other harmful and illegal content. A lack of adequate moderation, non-transparent algorithms, and the absence of systemic regulation leave users and communities without protection.

Photo: Zašto ne
For the purposes of this article, interviews were conducted with representatives of civil society organisations, the academic community, media outlets, and activists from Bosnia and Herzegovina, Montenegro, and Serbia. Previously published texts in this series can be found on the website of the Citizens’ Association Zašto ne (1, 2, 3, 4, 5, 6).
Spending time on major online platforms has become an unavoidable feature of everyday life for people around the world. From searching on Google to exchanging messages on Viber to sharing photos on Snapchat, major online platforms shape numerous aspects of our lives. And while platforms largely facilitate, and even enable, our daily routines, the content hosted on them can be deeply harmful, even dangerous.
Disinformation is shared, threats of violence are issued, harassment occurs, hate speech spreads, and even child sexual abuse material and violent content are circulated. The algorithms of certain online platforms not only permit harmful content but actively amplify it. The reason is clear: the owners of major online platforms have their own interests, which in many cases come down to money. Some of the world’s wealthiest individuals accumulated their capital largely through major online platforms (1, 2).
Experiences from the region show that the rules of major online platforms are frequently violated, while platform enforcement mechanisms are slow or non-existent. On these platforms, the rights of women, children, and the LGBTIQA+ community are violated and disinformation spreads unchecked, all while the digital space in the Western Balkans remains almost entirely outside any form of regulation.
Without Response and Accountability
Major online platforms generally have complex sets of policies and community standards that users are ostensibly required to respect when using their services. In reality, however, these rules are in most cases merely declarative: actual enforcement is lacking.
Reports of harmful content that violates platform rules almost always go unanswered. As our interviewees told us, if a response is received at all, it is usually a generic one stating that the content does not violate any platform rules. These are automated responses generated by a “machine”. According to the interviewees, there is a lack of moderators familiar with the local context and language. This is especially true of reports concerning hate speech: proper analysis of such content requires knowledge of the local context, language, and even slang. Automated content review will struggle to “understand” that something constitutes hate speech if it is analysed through machine translation.
Automated moderation mechanisms also appear not to function properly in the Western Balkans. In the absence of local moderators, algorithms sometimes flag or remove content that is not in fact harmful and, conversely, fail to recognise content that genuinely is.
All of this leads users increasingly to give up on reporting content to platforms, since they do not trust platforms to respond. The process of reporting and documenting rule violations and threats to one’s rights is often time-consuming and mentally demanding, draining capacities that human rights defenders and civil society actors frequently lack.
Given how difficult it is to obtain responses from platforms, users resort to improvised solutions. Within their communities, they call for coordinated reporting of content in the hope of prompting faster platform action. Through personal contacts, they try to reach someone who works for these companies, or someone who knows such a person, in an effort to have rule violations addressed.
None of this represents a systemic solution, and only a systemic solution can ensure the integrity of the online environment. Legislation obliging major online platforms to guarantee safe use of their services and to comply with clear rules is necessary.
In addition, in the Western Balkans, content that violates domestic legal frameworks cannot be reported to major online platforms on that legal basis. For example, denial of the genocide in Srebrenica, which is punishable by imprisonment under the Criminal Code of Bosnia and Herzegovina, cannot be reported to platforms as a violation of that law. Narratives denying the Srebrenica genocide are therefore widely shared on social media, as analyses show (1, 2), while platforms are under no obligation to provide reporting mechanisms for such content.
Lack of Transparency Makes It Difficult to Prove the Problem
An additional layer of the problem is the lack of transparency of major online platforms, both in the decisions they make and in the data about their operations.
As our interviewees shared with us, major online platforms, such as those owned by Meta, sometimes make decisions without providing clear explanations. Content may be removed, its reach reduced, or sharing disabled over an alleged rule violation, with no indication of which rule was breached. This significantly complicates users’ ability to navigate platforms, as they are left to guess what the problem might be and how to avoid it in the future.
Content creators do not reliably know why some of their content achieved greater reach while other content reached far fewer users. Media outlets face the same issue. The work of both groups often depends on platform decisions, as platforms represent a source of income and, for media outlets in particular, a way for their stories to reach a wider audience.
Another aspect of this problem, as noted, is the lack of platform transparency regarding operational data. Reliably demonstrating the impact of platforms on various areas of our lives requires data that often only the platforms themselves can provide. However, platforms sometimes obstruct this process in order to protect their own interests.
Researchers therefore face barriers to accessing data, including complex verification processes and requirements to use expensive access tools. Platforms simply do not want to make their data easily available, as some research could reflect negatively on them. In-depth, complex studies of platform operations and impact could reveal findings that platforms would prefer to remain unknown.
For example, in an earlier case in Serbia, dating apps refused to provide data on violations of their female users’ rights. Clearly, evidence of widespread endangerment of women’s safety facilitated by such platforms could seriously damage their image and market position. In the Western Balkans, however, there is no mechanism to compel them to release such data.
At the same time, such research would be crucial for advocating improvements in platform operations and for holding platforms accountable. Researchers from the Western Balkans, however, told us that due to a lack of financial and institutional support, they often abandon such research and redirect their work toward other, more accessible areas.
Toward Systemic Solutions
Despite all the problematic practices described above, major online platforms remain extremely important, especially for marginalised communities. The online space is often a place of socialisation for those who are vulnerable offline, such as the LGBTIQA+ community. Achieving responsible and transparent operations by major online platforms is therefore a shared goal toward which actors from many sectors of society should strive.
Regulation of online platforms, such as that introduced in the European Union through the Digital Services Act, requires a whole-of-society approach. This approach implies an active role for civil society, the academic community, media, institutions, and the private sector, alongside clear demands for accountability, both from online platforms and from decision-makers responsible for adopting and enforcing appropriate regulatory frameworks. Such a systemic approach cannot be reduced to normative solutions alone; it must also include strengthening the capacities of relevant actors and ensuring their long-term sustainability so that they can participate effectively in the implementation and oversight of future legal frameworks.
(Marija Ćosić and Maida Ćulahović, “Zašto ne”)