{"id":13276,"date":"2025-11-17T16:15:02","date_gmt":"2025-11-17T15:15:02","guid":{"rendered":"https:\/\/zastone.ba\/?p=13276"},"modified":"2025-12-11T11:15:40","modified_gmt":"2025-12-11T10:15:40","slug":"child-safety-online-are-major-platforms-doing-enough","status":"publish","type":"post","link":"https:\/\/zastone.ba\/en\/child-safety-online-are-major-platforms-doing-enough\/","title":{"rendered":"Child Safety Online: Are Major Platforms Doing Enough?"},"content":{"rendered":"<p><span class=\"lead\" style=\"font-weight: 400;\">Despite policies that proclaim child protection, major online platforms often fail to respond to reports of violence, abuse, and harmful content or they respond far too late. And while algorithms are developed to generate profit by attracting and holding users\u2019 attention and encouraging interactions, children remain unprotected from the serious risks they encounter on social media.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-13238\" src=\"https:\/\/zastone.ba\/app\/uploads\/2025\/11\/djeca-kv-768x432.jpg\" alt=\"\" width=\"768\" height=\"432\" srcset=\"https:\/\/zastone.ba\/app\/uploads\/2025\/11\/djeca-kv-768x432.jpg 768w, https:\/\/zastone.ba\/app\/uploads\/2025\/11\/djeca-kv-1536x864.jpg 1536w, https:\/\/zastone.ba\/app\/uploads\/2025\/11\/djeca-kv.jpg 1920w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/p>\n<p><em>Photo: Za\u0161to ne<\/em><\/p>\n<p><i><span style=\"font-weight: 400;\">For the purposes of this article, interviews were conducted with Kristina Mihailovi\u0107 (<\/span><\/i><a href=\"https:\/\/roditelji.me\/udruzenje-roditelji\/\" target=\"_blank\" rel=\"noopener\"><i><span style=\"font-weight: 400;\">\u201cParents\u201d Association<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">), Adi Pejdah (<\/span><\/i><a href=\"https:\/\/www.sigurnodijete.ba\/\" target=\"_blank\" rel=\"noopener\"><i><span style=\"font-weight: 400;\">Centre for a Safer 
Internet<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">), and Sne\u017eana Nik\u010devi\u0107 (<\/span><\/i><a href=\"https:\/\/nvo35mm.me\/\" target=\"_blank\" rel=\"noopener\"><i><span style=\"font-weight: 400;\">NGO \u201c35mm\u201d<\/span><\/i><\/a><i><span style=\"font-weight: 400;\">).<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">A fifteen-year-old girl from one of the countries in the region endured online abuse for months. Secretly recorded videos of her were posted on social media, accompanied by lies, threats that she would be killed and even livestreams showing her location. Even after six months of online abuse, her parents, the police, and the competent institutions were unable to obtain additional information about the perpetrators from the platform where the content was posted, and there was no feedback whatsoever regarding the frequent reports they submitted.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Although major online platforms proclaim child protection in their policies (<\/span><a href=\"https:\/\/archive.fo\/O06PN\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/FkqFc\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/tyqtd\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">3<\/span><\/a><span style=\"font-weight: 400;\">), cases of different forms of endangerment of minors through their services are not rare. The responsiveness of platforms generally depends on the nature of the rules being violated and the type of abuse or misuse being reported. Additionally, according to our interviewees, platform response also depends on who submitted the report.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, cases of peer-to-peer violence are rarely or belatedly addressed. 
Reports usually have little effect. In many cases, platforms do not even send confirmation that the report was received. And even in those cases where platforms do eventually react, their responses often come too late &#8211; after harmful content has remained online for a long time, spread further, and caused numerous negative consequences in real life.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Users, hoping to trigger faster action, often call on more people to submit the same report. The reason is the widespread perception that platforms are more likely to respond when the same content is reported by multiple individuals. Meta, however, states in its rules that the <\/span><a href=\"https:\/\/archive.fo\/hUT4R\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">number of reports<\/span><\/a><span style=\"font-weight: 400;\"> does not affect their response, and that the same guidelines are applied in all cases.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Platforms tend to react more quickly and proactively to reports of the most extreme forms of content, such as sexually explicit material, sexual exploitation, and child abuse, especially when such reports are forwarded through police services, as Adi Pejdah from the Centre for a Safer Internet explained.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Associations dedicated to child online safety, such as the Centre for a Safer Internet, have established protocols for analysing and processing reported content that endangers children\u2019s safety. Following this protocol, they forward reports to competent police services, which then communicate with platforms through their designated contact points. 
Reports concerning content hosted on servers outside Bosnia and Herzegovina are forwarded to local safer internet centres that are members of the <\/span><a href=\"https:\/\/www.inhope.org\/EN\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">INHOPE<\/span><\/a><span style=\"font-weight: 400;\"> network.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">However, when the reported content does not contain elements of a criminal offence, there are no mechanisms for direct communication with platforms. The only remaining option for the Centre is to talk to parents and children, and in that way try to influence, for example, the removal and cessation of peer-to-peer online violence on social media.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Children are exposed to a wide range of risks on social media. When it comes to inappropriate content, children on platforms can easily access everything &#8211; from explicit pornography to content containing hate speech and incitement to violence (<\/span><a href=\"https:\/\/archive.fo\/SRgk7\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/iuAKt\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">). The design of algorithms used by major online platforms facilitates the spread of such content, while the restrictions that are meant to protect children, according to the platforms\u2019 own rules, are not effective enough to provide real safety.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Another major problem is the inefficiency of content-moderation systems, which also rely on algorithms to detect inappropriate content. After the killing of eight and wounding of 14 people in May 2023 by a minor K.K. in an elementary school in Belgrade, a large amount of prohibited and harmful content appeared on social media. 
Across different platforms, \u201cfan accounts\u201d dedicated to the perpetrator emerged, along with accounts impersonating him, messages of support and imitations (<\/span><a href=\"https:\/\/archive.fo\/c62qh\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/r01El\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/TJF45\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">3<\/span><\/a><span style=\"font-weight: 400;\">). Although the victims were minors, their photos and identities were also heavily exploited. According to <\/span><a href=\"https:\/\/archive.fo\/EFrAi\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">media reports<\/span><\/a><span style=\"font-weight: 400;\">, filters featuring the victims\u2019 faces have also appeared on TikTok. Despite the enormous public attention and the fact that seven children lost their lives, moderation measures on social media were clearly not intensified.<\/span><\/p>\n<h3><b>How do children actually use platforms?<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">The terms of use of major online platforms, such as those run by <\/span><a href=\"https:\/\/ghostarchive.org\/archive\/Ocb7d\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Meta<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/mBHQ3\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">TikTok<\/span><\/a><span style=\"font-weight: 400;\"> and <\/span><a href=\"https:\/\/archive.fo\/Y3Hs5\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Snapchat<\/span><\/a><span style=\"font-weight: 400;\">, state that the minimum age for independently creating an account and using the platform is 13. 
<\/span><a href=\"https:\/\/archive.fo\/pJGS8\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">TikTok<\/span><\/a><span style=\"font-weight: 400;\">, for instance, offers a \u201cseparate TikTok experience designed specifically for younger users\u201d in the United States. However, enforcement of these restrictions has proven inadequate.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In the first place, all that <\/span><a href=\"https:\/\/web.archive.org\/web\/20251022131307\/https:\/\/sumsub.com\/blog\/age-verification-on-social-media\/\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">many platforms<\/span><\/a><span style=\"font-weight: 400;\"> require as age verification is that the user self-declares their age. Therefore, in many cases all a child under 13 needs to do to create an account is click that they are older. Age-verification mechanisms that involve the use of identity documents, biometric data or AI tools raise additional concerns related to the protection of users\u2019 personal data (<\/span><a href=\"https:\/\/archive.fo\/qFtU1\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/web.archive.org\/web\/20251022131307\/https:\/\/sumsub.com\/blog\/age-verification-on-social-media\/\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Children younger than the permitted age successfully bypass these restrictions (<\/span><a href=\"https:\/\/archive.fo\/MYYcK\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">1<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a href=\"https:\/\/archive.fo\/qtCPt\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">2<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><a 
href=\"https:\/\/archive.fo\/Gqg4e\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">3<\/span><\/a><span style=\"font-weight: 400;\">). A <\/span><a href=\"https:\/\/archive.fo\/Gqg4e\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">study conducted this year in Australia<\/span><\/a><span style=\"font-weight: 400;\">, for instance, showed that as many as 84% of children aged 8 to 12 used some type of social-media service, with half of them using accounts belonging to their parents or guardians. One-third of the children in the study used their own social media accounts despite platform rules. In 80% of those cases, children had help from their parents\/guardians in creating the accounts.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">According to our interviewees, parents and guardians are often unaware of the risks associated with children\u2019s use of online platforms. More often, they are concerned about the amount of time their children spend online, rather than the content they consume. As a result, due to a lack of awareness about potential risks, they frequently help children create and use social media accounts in violation of platform rules. Another issue is their limited understanding of how parental-control tools work, as well as generally low levels of digital literacy.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Parents most often react only after negative consequences occur. Lacking better solutions, they support bans. Our interviewees note that the Montenegrin public largely welcomed the news of a one-year <\/span><a href=\"https:\/\/archive.fo\/C5BxD\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">ban on TikTok in Albania<\/span><\/a><span style=\"font-weight: 400;\">. 
At the same time, this decision was criticised by experts for <\/span><a href=\"https:\/\/archive.fo\/6TzZL\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">limiting freedom of expression<\/span><\/a><span style=\"font-weight: 400;\"> and <\/span><a href=\"https:\/\/archive.fo\/8YT09\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">violating European principles<\/span><\/a><span style=\"font-weight: 400;\"> of regulating digital platforms. Moreover, banning a single platform does not solve the problem, as harmful content can easily appear elsewhere. As Sne\u017eana Nik\u010devi\u0107 warns, the most problematic types of content have already moved from social media to closed user groups or other channels where private content exchange prevails, such as Snapchat and similar platforms.<\/span><\/p>\n<h3><b>The Responsibility to Protect Children Online Also Lies With Platforms<\/b><\/h3>\n<p><span style=\"font-weight: 400;\">Instead of introducing bans, legal measures should focus on obliging platforms to implement child-protection measures against online dangers and risks such as harassment, abuse, or exposure to inappropriate content. In the European Union, the <\/span><a href=\"https:\/\/archive.fo\/Brxjw\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Digital Services Act<\/span><\/a><span style=\"font-weight: 400;\"> obliges major online platforms to regularly assess potential risks to children and young people and to implement measures to mitigate those risks. However, the mere availability of parental-control tools or age-verification systems is not sufficient.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">What the EU\u2019s regulatory framework particularly emphasises is the obligation of platforms to respond promptly to reports of illegal and harmful content. 
Trusted flaggers, organisations granted special status for identifying and reporting illegal or rule-violating content, can play an important role. Including such a mechanism in domestic legislation in the region would allow organisations working on child-online-safety issues and possessing the necessary expertise to submit reports directly to platforms, while also imposing a legal obligation on platforms to treat those reports as a priority.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ultimately, online platforms are the ones responsible for protecting children as one of the most vulnerable groups in society. That responsibility must not fall solely on parents and guardians.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">(Marija \u0106osi\u0107 and Maida \u0106ulahovi\u0107, \u201cZa\u0161to ne\u201d)<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Despite policies that proclaim child protection, major online platforms often fail to respond to reports of violence, abuse, and harmful content or they respond far too late. 
And while algorithms are developed to generate profit by attracting and holding users\u2019 attention and encouraging interactions, children remain unprotected from the serious risks they encounter on social &hellip; <a href=\"https:\/\/zastone.ba\/en\/child-safety-online-are-major-platforms-doing-enough\/\">Continued<\/a><\/p>\n","protected":false},"author":44,"featured_media":13238,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[103,22],"tags":[],"class_list":["post-13276","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-digital-policy","category-zasto-ne-en"],"acf":[],"_links":{"self":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13276","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/users\/44"}],"replies":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/comments?post=13276"}],"version-history":[{"count":1,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13276\/revisions"}],"predecessor-version":[{"id":13277,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/posts\/13276\/revisions\/13277"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/media\/13238"}],"wp:attachment":[{"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/media?parent=13276"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/categories?post=13276"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/zastone.ba\/en\/wp-json\/wp\/v2\/tag
s?post=13276"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}