{"id":13334,"date":"2025-12-23T10:56:47","date_gmt":"2025-12-23T09:56:47","guid":{"rendered":"https:\/\/zastone.ba\/?p=13334"},"modified":"2025-12-23T17:39:41","modified_gmt":"2025-12-23T16:39:41","slug":"accountability-of-major-online-platforms-harmful-content-weak-moderation-and-the-absence-of-regulation-in-the-western-balkans","status":"publish","type":"post","link":"https:\/\/zastone.ba\/en\/accountability-of-major-online-platforms-harmful-content-weak-moderation-and-the-absence-of-regulation-in-the-western-balkans\/","title":{"rendered":"Accountability of Major Online Platforms: Harmful Content, Weak Moderation, and the Absence of Regulation in the Western Balkans"},"content":{"rendered":"<p><span class=\"lead\" style=\"font-weight: 400;\">Although they have become an indispensable part of everyday life, major online platforms in the Western Balkans often fail to respond to hate speech, disinformation, and other harmful and illegal content. A lack of adequate moderation, non-transparent algorithms, and the absence of systemic regulation leave users and communities without protection.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-13327\" src=\"https:\/\/zastone.ba\/app\/uploads\/2025\/12\/odgovornost-velikih-online-platformi-stetni-sadrzaji-slaba-moderacija-i-izostanak-regulacije-na-zapadnom-balkanu-1-768x432.jpg\" alt=\"\" width=\"768\" height=\"432\" srcset=\"https:\/\/zastone.ba\/app\/uploads\/2025\/12\/odgovornost-velikih-online-platformi-stetni-sadrzaji-slaba-moderacija-i-izostanak-regulacije-na-zapadnom-balkanu-1-768x432.jpg 768w, https:\/\/zastone.ba\/app\/uploads\/2025\/12\/odgovornost-velikih-online-platformi-stetni-sadrzaji-slaba-moderacija-i-izostanak-regulacije-na-zapadnom-balkanu-1-1536x864.jpg 1536w, https:\/\/zastone.ba\/app\/uploads\/2025\/12\/odgovornost-velikih-online-platformi-stetni-sadrzaji-slaba-moderacija-i-izostanak-regulacije-na-zapadnom-balkanu-1.jpg 1920w\" sizes=\"auto, (max-width: 
768px) 100vw, 768px\" \/><\/p>\n<p>Photo: Za\u0161to ne<\/p>\n<p><i><span style=\"font-weight: 400;\">For the purposes of this article, interviews were conducted with representatives of civil society organisations, the academic community, media outlets, and activists from Bosnia and Herzegovina, Montenegro, and Serbia. Previously published texts in this series can be found on the website of the Citizens\u2019 Association Za\u0161to ne (<\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/the-politics-of-profit-and-disinformation-where-is-the-responsibility-of-large-online-platforms\/\"><i><span style=\"font-weight: 400;\">1<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">, <\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/online-violence-are-major-platforms-doing-enough-to-protect-women\/\"><i><span style=\"font-weight: 400;\">2<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">, <\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/child-safety-online-are-major-platforms-doing-enough\/\"><i><span style=\"font-weight: 400;\">3<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">, <\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/data-concealment-and-lack-of-transparency-do-online-platforms-enable-research-into-their-operations\/\"><i><span style=\"font-weight: 400;\">4<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">, <\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/invisibility-on-social-media-how-dependent-are-media-and-content-creators-on-online-platforms\/\"><i><span style=\"font-weight: 400;\">5<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">, <\/span><\/i><a href=\"https:\/\/zastone.ba\/en\/declarative-protection-real-violence-what-is-the-experience-of-the-lgbtiq-community-on-major-online-platforms\/\"><i><span style=\"font-weight: 400;\">6<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">).<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">Spending time on major online platforms has become an unavoidable feature of everyday life for 
people around the world. From searching on Google, to exchanging messages on Viber, to sharing photos on Snapchat \u2013 major online platforms shape numerous aspects of our lives. And while platforms largely facilitate, and even enable, our daily routines, the content hosted on them can be incredibly harmful, even dangerous.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Disinformation is shared, threats of violence are issued, harassment occurs, hate speech spreads, and even child sexual abuse material and violent content are circulated. The algorithms of certain online platforms not only enable but actively encourage the spread of harmful content. The reason for this is clear: the owners of major online platforms have their own interests, which in many cases come down to money. Some of the world\u2019s wealthiest individuals accumulated their capital largely through major online platforms (<\/span><a href=\"https:\/\/archive.vn\/ArvIl\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.vn\/dWak4\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Experiences from the region show that the rules of major online platforms are frequently violated, while platform enforcement mechanisms are slow or non-existent. On these platforms, the rights of women, children, and the LGBTIQA+ community are violated and disinformation spreads, all while the digital space remains almost entirely outside any form of regulation in the Western Balkans.<\/span><\/p>\n<h3><b>Without Response and Accountability<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">Major online platforms generally have complex sets of policies and community standards that users are ostensibly meant to respect when using their services. However, the reality is that in most cases these rules are merely declarative. 
Their actual enforcement is lacking.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Reports of harmful content that violates platform rules are almost always met with no response. As our interviewees told us, if a response is received at all, it is usually a generic one stating that the content does not violate any platform rules. These are automated responses generated by a \u201cmachine\u201d. According to the interviewees, there is a lack of moderators who are familiar with the local context and language. This is especially the case with reports regarding hate speech. Proper analysis of such content requires knowledge of the local context, language, and even slang. Automated, <\/span><a href=\"https:\/\/zastone.ba\/en\/child-safety-online-are-major-platforms-doing-enough\/\"><span style=\"font-weight: 400;\">computer-based content review<\/span><\/a><span style=\"font-weight: 400;\"> will struggle to \u201cunderstand\u201d that content constitutes hate speech if it is analysed through computer-generated translation.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Automated moderation mechanisms also appear not to function properly in the Western Balkans. 
Due to the lack of local moderators, <\/span><a href=\"https:\/\/zastone.ba\/en\/invisibility-on-social-media-how-dependent-are-media-and-content-creators-on-online-platforms\/\"><span style=\"font-weight: 400;\">algorithms sometimes remove or flag content as harmful<\/span><\/a><span style=\"font-weight: 400;\"> even when that is not the case and <\/span><i><span style=\"font-weight: 400;\">vice versa<\/span><\/i><span style=\"font-weight: 400;\"> \u2013 failing to recognise content that genuinely is harmful.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">All of this leads to users increasingly <\/span><a href=\"https:\/\/zastone.ba\/en\/declarative-protection-real-violence-what-is-the-experience-of-the-lgbtiq-community-on-major-online-platforms\/\"><span style=\"font-weight: 400;\">giving up on reporting<\/span><\/a><span style=\"font-weight: 400;\"> content to platforms, since they do not trust platforms to respond. The process of reporting and documenting violations of platform rules and threats to one\u2019s rights is often time-consuming and mentally demanding. It drains capacities that human rights defenders and civil society actors frequently lack.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Due to the difficulty of obtaining responses from platforms, users resort to improvised solutions. Within their communities, they call for <\/span><a href=\"https:\/\/zastone.ba\/en\/online-violence-are-major-platforms-doing-enough-to-protect-women\/\"><span style=\"font-weight: 400;\">coordinated reporting of content<\/span><\/a><span style=\"font-weight: 400;\"> in the hope of prompting faster platform action. 
Through <\/span><a href=\"https:\/\/zastone.ba\/en\/declarative-protection-real-violence-what-is-the-experience-of-the-lgbtiq-community-on-major-online-platforms\/\"><span style=\"font-weight: 400;\">personal contacts<\/span><\/a><span style=\"font-weight: 400;\">, they try to reach someone who works for, or knows someone who works for, these companies, all in an effort to resolve violations of rules on platforms.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">None of this represents a systemic solution, which is the only way to ensure the integrity of the online environment. Adopting legislation that compels major online platforms to ensure safe use and comply with clearly defined rules is therefore necessary.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition, in the Western Balkans, content that violates domestic legal frameworks cannot be reported to major online platforms on that legal basis. For example, denial of the genocide in Srebrenica, which is punishable by imprisonment under the <\/span><a href=\"https:\/\/archive.vn\/68F5T\"><span style=\"font-weight: 400;\">Criminal Code of Bosnia and Herzegovina<\/span><\/a><span style=\"font-weight: 400;\">, cannot be reported to platforms as a violation of that law. 
Narratives denying the Srebrenica genocide are therefore widely shared on social media, as analyses show (<\/span><a href=\"https:\/\/perma.cc\/8AWL-EM3S\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/perma.cc\/S2S5-6HNE\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">), while platforms are under no obligation to provide reporting mechanisms for such content.<\/span><\/p>\n<h3><b>Lack of Transparency Makes It Difficult to Prove the Problem<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">An additional layer of the problem is the lack of transparency of major online platforms, both regarding their actions and the data on their operations.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As our interviewees shared with us, major online platforms, such as those owned by Meta, sometimes make decisions <\/span><a href=\"https:\/\/zastone.ba\/en\/invisibility-on-social-media-how-dependent-are-media-and-content-creators-on-online-platforms\/\"><span style=\"font-weight: 400;\">without providing clear explanations<\/span><\/a><span style=\"font-weight: 400;\">. Content may be removed, its reach reduced, or sharing disabled due to an alleged rule violation, without specifying which rule was breached. This significantly complicates users\u2019 ability to navigate platforms, as they are left to guess what the problem might be and how to avoid it in the future.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Content creators do not reliably know why some of their content achieved greater reach while other content reached far fewer users. Media outlets face the same issue. 
The work of both groups often depends on platform decisions, as platforms represent a source of income and, for media outlets in particular, a way for their stories to reach a wider audience.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Another aspect of this problem, as noted, is the <\/span><a href=\"https:\/\/zastone.ba\/en\/data-concealment-and-lack-of-transparency-do-online-platforms-enable-research-into-their-operations\/\"><span style=\"font-weight: 400;\">lack of platform transparency<\/span><\/a><span style=\"font-weight: 400;\"> regarding operational data. Essentially, in order to reliably demonstrate the impact of platforms across various areas of our lives, certain data are required; data that often only the platforms themselves can provide. However, platforms sometimes obstruct this process in order to protect their own interests.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Researchers therefore face barriers to accessing data, including complex verification processes and requirements to use expensive access tools. Platforms simply do not want to make their data easily available, as some research could reflect negatively on them. In-depth, complex studies of platform operations and impact could reveal findings that platforms would prefer to remain unknown.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, in an earlier case in Serbia, <\/span><a href=\"https:\/\/archive.vn\/TbErJ\"><span style=\"font-weight: 400;\">dating apps<\/span><\/a><span style=\"font-weight: 400;\"> refused to provide data on violations of their female users\u2019 rights. Clearly, evidence of widespread endangerment of women\u2019s safety facilitated by such platforms could seriously damage their image and market position. 
In the Western Balkans, however, there is no mechanism to compel them to release such data.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">At the same time, conducting such research would be crucial for advocating improvements in platform operations and for holding platforms accountable. Researchers from the Western Balkans, however, told us that due to a lack of financial and institutional support, they often abandon such research and redirect their work toward other, more accessible areas.<\/span><\/p>\n<h3><b>Toward Systemic Solutions<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">Despite all the problematic practices described above, major online platforms remain extremely important, especially for marginalised communities. The online space is often a place of socialisation for those who are vulnerable offline, such as the LGBTIQ+ community. Achieving responsible and transparent operations by major online platforms is therefore a shared goal toward which actors from many sectors of our society should strive.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Regulation of online platforms, such as that introduced in the European Union through the <\/span><a href=\"https:\/\/archive.fo\/Brxjw\"><span style=\"font-weight: 400;\">Digital Services Act<\/span><\/a><span style=\"font-weight: 400;\">, requires a <\/span><i><span style=\"font-weight: 400;\">whole-of-society approach<\/span><\/i><span style=\"font-weight: 400;\">. This approach implies an active role for civil society, the academic community, media, institutions, and the private sector, alongside clear demands for accountability, both from online platforms and from decision-makers responsible for adopting and enforcing appropriate regulatory frameworks. 
Such a systemic approach cannot be reduced to normative solutions alone; it must also include strengthening the capacities of relevant actors and ensuring their long-term sustainability, enabling them to effectively participate in the implementation and oversight of future legal frameworks.<\/span><\/p>\n<p>(Marija \u0106osi\u0107 and Maida \u0106ulahovi\u0107, \u201cZa\u0161to ne\u201d)<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Although they have become an indispensable part of everyday life, major online platforms in the Western Balkans often fail to respond to hate speech, disinformation, and other harmful and illegal content. A lack of adequate moderation, non-transparent algorithms, and the absence of systemic regulation leave users and communities without protection. Photo: Za\u0161to ne For the &hellip; <a href=\"https:\/\/zastone.ba\/en\/accountability-of-major-online-platforms-harmful-content-weak-moderation-and-the-absence-of-regulation-in-the-western-balkans\/\">Continued<\/a><\/p>\n","protected":false},"author":44,"featured_media":13327,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[103,22],"tags":[],"class_list":["post-13334","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-digital-policy","category-zasto-ne-en"],"acf":[],"_links":{"self":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13334","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/users\/44"}],"replies":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en
\/wp-json\/wp\/v2\/comments?post=13334"}],"version-history":[{"count":1,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13334\/revisions"}],"predecessor-version":[{"id":13335,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13334\/revisions\/13335"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/media\/13327"}],"wp:attachment":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/media?parent=13334"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/categories?post=13334"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/tags?post=13334"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}