{"id":11102,"date":"2025-05-09T15:42:22","date_gmt":"2025-05-09T14:42:22","guid":{"rendered":"https:\/\/seecheck.org\/?p=11102"},"modified":"2025-11-28T13:50:55","modified_gmt":"2025-11-28T12:50:55","slug":"why-the-western-balkans-need-the-digital-services-act","status":"publish","type":"post","link":"https:\/\/seecheck.org\/index.php\/2025\/05\/09\/why-the-western-balkans-need-the-digital-services-act\/","title":{"rendered":"Why the Western Balkans Need the Digital Services Act"},"content":{"rendered":"\n<p>In Knja\u017eevac, Serbia, <strong>Adrijana Petkovi\u0107\u2019s<\/strong> life changed when a Telegram bot powered by artificial intelligence transformed an innocent bathroom selfie into a hyperrealistic nude image. The photo, created without her consent, circulated rapidly through local Telegram groups. The case, first <a href=\"https:\/\/birn.rs\/osvetnicka-pornografija-botovi-za-skidanje-odece\/\">reported by the Balkan Investigative Reporting Network (BIRN)<\/a>, drew attention to the emotional toll and reputational damage caused by such abuse, and to the stark absence of legal remedies. Serbia\u2019s laws remain silent on deepfake pornography. Her story is not unique. It is, in fact, a stark illustration of a legal vacuum across the Western Balkans, one the EU\u2019s Digital Services Act (DSA) could help fill, if only these countries were willing and able to adopt its principles.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>A New Generation of Online Harm<\/strong><\/h3>\n\n\n\n<p>The violation Petkovi\u0107 experienced is becoming alarmingly common in the Western Balkans, following global trends. Fueled by advances in generative Artificial Intelligence (AI) and enabled by opaque platform policies, non-consensual deepfake pornography and other forms of synthetic abuse are <a href=\"https:\/\/www.404media.co\/email\/f7ed39a2-7f04-466c-b09c-986af4332f1b\/?ref=daily-stories-newsletter\">spreading with little oversight<\/a>. 
New AI video generators now make it possible to create hyperrealistic, pornographic content from just a single photo and a vague prompt, often bypassing built-in safety filters. Although many platforms claim to prohibit such content, enforcement remains weak, especially in countries like Serbia and Bosnia and Herzegovina, where institutional safeguards are limited or nonexistent.<\/p>\n\n\n\n<p>This regulatory gap is particularly dangerous amid Meta\u2019s recent <a href=\"https:\/\/about.fb.com\/news\/2025\/01\/meta-more-speech-fewer-mistakes\/\">rollback of moderation policies<\/a>, including the removal of fact-checkers and relaxed hate speech enforcement. As Maja Sever, president of the European Federation of Journalists, told SEE Check, \u201cContent regulation on social media is becoming increasingly complex due to the intertwining of political interests and the influence of large tech companies.\u201d&nbsp;<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The DSA\u2019s Blueprint<\/strong><\/h3>\n\n\n\n<p>By contrast, the <strong>European Union\u2019s Digital Services Act (DSA)<\/strong>, adopted in 2022, offers a rights-based framework for managing digital harms. It introduces obligations for algorithmic accountability, transparency in moderation, and risk assessments\u2014especially for Very Large Online Platforms (VLOPs). Unlike many national efforts, the DSA doesn\u2019t target individual content or users but instead holds platforms accountable for systemic risks, such as disinformation, gender-based abuse, and threats to democratic processes.<\/p>\n\n\n\n<p>Croatia\u2019s organization <strong>GONG<\/strong> sees this shift in responsibility as essential. As <strong>Svjetlana Kne\u017eevi\u0107<\/strong> explained, \u201cThis regulation aims to place responsibility on large platforms for the content they host, not just on users who produce or share it. 
As anonymous and political actors increasingly use social media to spread lies and manipulate elections, it\u2019s crucial to impose obligations on Big Tech to proactively and continuously remove false information and curb the reach of those who spread it.\u201d<\/p>\n\n\n\n<p>However, none of the Western Balkan countries have adopted DSA-aligned legislation, leaving a fragmented landscape vulnerable to both corporate neglect and government overreach.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>A Region Still Unprepared<\/strong><\/h3>\n\n\n\n<p>Despite some initial interest from countries like Albania and Bosnia and Herzegovina in establishing oversight bodies similar to those outlined in the Digital Services Act (DSA), <strong>no country in the Western Balkans has taken concrete steps to implement the DSA comprehensively<\/strong>. According to <a href=\"https:\/\/www.partners-serbia.org\/public\/news\/dsa-wb-new.pdf\">a multi-country study<\/a> by <strong>Partners for Democratic Change Serbia<\/strong>, efforts to align national legislation with the DSA\u2019s principles have been fragmented and incomplete. Instead of adopting a unified regulatory framework that addresses platform accountability, algorithmic transparency, and user protections in a systemic manner, these countries have only made isolated updates to existing laws, such as those governing audiovisual media or consumer protection. These limited reforms fail to address the broader, more comprehensive regulatory model proposed by the DSA, leaving significant gaps in oversight and enforcement.<\/p>\n\n\n\n<p>The study underscores that while basic protections exist, such as limited intermediary liability and data protection, <strong>social networking platforms and search engines remain largely unregulated<\/strong>. 
Many of the same loopholes that enabled the spread of manipulated images in Petkovi\u0107\u2019s case persist throughout the region.<\/p>\n\n\n\n<p>According to <strong>Sne\u017eana Baj\u010deta<\/strong> from the Serbian digital rights organization <strong>SHARE<\/strong>, \u201cThe effectiveness of digital regulation in the region is questionable because the digital market here is underdeveloped and non-competitive compared to Europe\u2019s.\u201d She stresses that joining the EU\u2019s Digital Single Market would be a crucial step forward, enabling the creation of consistent rules and standards needed for the realization of digital rights.&nbsp;<\/p>\n\n\n\n<p>\u201cOnly in a well-regulated environment with clear rules and standards can we talk about freedoms and the potential for fully realizing them,\u201d Baj\u010deta notes.<\/p>\n\n\n\n<p>Even where civil society initiatives exist, political will remains weak. Bosnia\u2019s <strong>Coalition for Freedom of Expression and Content Moderation<\/strong>, coordinated by <strong>Mediacentar<\/strong>, has submitted an initiative to the Ministry of Communications and Transport seeking to develop a harmonization strategy. However, as Mediacentar told SEE Check, work on the document has not yet started.<\/p>\n\n\n\n<p>Legal expert <strong>Nasir Mufti\u0107<\/strong> from the <strong>University of Sarajevo<\/strong> echoes the concern: \u201cWe don\u2019t have a developed legal framework in this regard, mainly due to a lack of political will. We\u2019ve failed to address even simpler problems, like regulating electronic communications and emerging technologies, which remain outside our core communications law.\u201d<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The AI Disinformation Spiral<\/strong><\/h3>\n\n\n\n<p>The rise of generative AI brings new challenges not only for privacy but for the integrity of public discourse. 
According to <strong>Maida \u0106ulahovi\u0107<\/strong>, digital rights expert from Why Not, the risks tied to generative AI, including deepfakes, fake bot networks, and algorithmic flows that amplify disinformation, must be addressed through regulation that understands the systemic nature of digital manipulation.<\/p>\n\n\n\n<p>This need is especially urgent as platforms show little interest in adapting content moderation to the region. As the Coalition in Bosnia points out, mechanisms for reporting and appealing harmful content are often broken, poorly localized, or entirely absent.<\/p>\n\n\n\n<p>\u201cThe development of new technologies, such as artificial intelligence, certainly contributes to the complexity of these challenges and the potential responses to them,\u201d they note.<\/p>\n\n\n\n<p>Baj\u010deta also emphasizes that civil society remains the main driver of digital policy innovation. Still, she warns that real progress requires a \u201ccomprehensive, structural approach to disinformation production and dissemination, grounded in effective mechanisms for suppression,\u201d instead of ad hoc responses.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The DSA\u2019s Promise\u2014and Its Absence<\/strong><\/h3>\n\n\n\n<p>The DSA introduces exactly this kind of structural model. It shifts the burden away from users like Adrijana Petkovi\u0107 and places it on platforms to identify and mitigate risk. It calls for independent oversight bodies\u2014Digital Services Coordinators (DSCs)\u2014with the authority to demand compliance and transparency.<\/p>\n\n\n\n<p>In Croatia, this role was assigned to the <strong>Croatian Regulatory Authority for Network Industries (HAKOM). <\/strong>As the national DSC, HAKOM is now responsible for coordinating efforts to regulate digital services and ensure compliance with the DSA. This includes working closely with other state bodies and maintaining international cooperation with the European Commission and EU DSCs. 
HAKOM has been granted investigative and enforcement powers over service providers in Croatia and is actively clarifying its role to the public, noting that <strong>\u201cpeople still mistakenly believe the coordinator\u2019s job is to manually search social media for harmful content,\u201d <\/strong>as it stated in an email to SEE Check.<\/p>\n\n\n\n<p>However, while HAKOM\u2019s jurisdiction covers Croatia, content originating from neighboring countries that share the same language, such as Bosnia and Herzegovina, Serbia, or Montenegro, remains a critical blind spot. Content hosted by service providers in these non-EU countries can still reach Croatian audiences, potentially circumventing the DSA\u2019s regulations. As HAKOM explains, while Croatian authorities can issue orders to platforms like Facebook under Article 9 of the DSA, the scope of enforcement is limited to Croatia and potentially other EU member states. <strong>\u201cThe fact that the DSA does not apply to service providers based in, for example, Bosnia and Herzegovina, would certainly affect the ability to seek legal protection under the DSA,\u201d<\/strong> HAKOM states.<\/p>\n\n\n\n<p>Without independent, well-resourced DSCs in the Western Balkans, and without harmonized legislation, the region risks repeating the fragmented and reluctant implementation seen during the rollout of the General Data Protection Regulation (GDPR).<\/p>\n\n\n\n<p>As Nasir Mufti\u0107 from the University of Sarajevo points out, it\u2019s not only about local capacity but also structural power: \u201cThe question is whether a small country can truly regulate the tech giants. That\u2019s the core issue behind it all.\u201d He argues that coordinated, supranational enforcement\u2014through the EU and aligned legal mechanisms\u2014is crucial. 
\u201cI don\u2019t believe individual states, no matter how much effort or resources they invest, can achieve the same effect on their own.\u201d<\/p>\n\n\n\n<p>To avoid a growing digital divide, civil society, academia, and independent media must be included in policymaking from the start.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>From Legal Vacuum to Platform Impunity<\/strong><\/h3>\n\n\n\n<p>The case of Adrijana Petkovi\u0107 and the surge in image-based sexual abuse across the Western Balkans show the urgent need for comprehensive, enforceable digital regulation. The DSA provides a model for rights-based, systemic oversight, but it will only work if countries in the region summon the political will and institutional capacity to act.<\/p>\n\n\n\n<p>In countries like Bosnia and Herzegovina, Serbia, Montenegro, and North Macedonia, current legal frameworks are fragmented, outdated, and ill-equipped to respond to the speed, scale, and sophistication of today\u2019s digital threats. As highlighted by civil society organizations and legal experts, this vacuum enables both inaction from platforms and harmful overreactions from governments &#8211; from vague public order laws that criminalize \u201cdisinformation\u201d to politically motivated content moderation.<\/p>\n\n\n\n<p><em><strong>Correction:<\/strong>&nbsp;An earlier version of this article misstated the organization coordinating the Coalition for Freedom of Expression and Content Moderation. The correct organization is Mediacentar, not Why Not.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In Knja\u017eevac, Serbia, Adrijana Petkovi\u0107\u2019s life changed when a Telegram bot powered by artificial intelligence transformed an innocent bathroom selfie into a hyperrealistic nude image. The photo, created without her consent, circulated rapidly through local Telegram groups. 
The case, first reported by the Balkan Investigative Reporting Network (BIRN), drew attention to the emotional toll and [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":11098,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[4],"tags":[231,230,37,29,573,419,572,205,35,28],"class_list":["post-11102","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles","tag-ai","tag-artificial-intelligence","tag-bosnia-and-herzegovina","tag-croatia","tag-digital-services-act","tag-dsa","tag-hakom","tag-istaknuto","tag-montenegro","tag-serbia"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/posts\/11102","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/comments?post=11102"}],"version-history":[{"count":3,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/posts\/11102\/revisions"}],"predecessor-version":[{"id":11910,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/posts\/11102\/revisions\/11910"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/media\/11098"}],"wp:attachment":[{"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/media?parent=11102"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/categories?post=11102"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/seecheck.org\/index.php\/wp-json\/wp\/v2\/tags?post=11102"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}