Why the Western Balkans Need the Digital Services Act


By: Maida Salkanović

In Knjaževac, Serbia, Adrijana Petković’s life changed when a Telegram bot powered by artificial intelligence transformed an innocent bathroom selfie into a hyperrealistic nude image. The photo, created without her consent, circulated rapidly through local Telegram groups. The case, first reported by the Balkan Investigative Reporting Network (BIRN), drew attention to the emotional toll and reputational damage caused by such abuse, and to the stark absence of legal remedies: Serbia’s laws remain silent on deepfake pornography. Her story is not unique. It is a vivid illustration of a legal vacuum across the Western Balkans, one the EU’s Digital Services Act (DSA) could help fill, if only these countries were willing and able to adopt its principles.

A New Generation of Online Harm

The violation Petković experienced is becoming alarmingly common in the Western Balkans, following global trends. Fueled by advances in generative artificial intelligence (AI) and enabled by opaque platform policies, non-consensual deepfake pornography and other forms of synthetic abuse are spreading with little oversight. New AI video generators now make it possible to create hyperrealistic pornographic content from a single photo and a vague prompt, often bypassing built-in safety filters. Although many platforms claim to prohibit such content, enforcement remains weak, especially in countries like Serbia and Bosnia and Herzegovina, where institutional safeguards are limited or nonexistent.

This regulatory gap is particularly dangerous amid Meta’s recent rollback of moderation policies, including the removal of fact-checkers and relaxed hate speech enforcement. As Maja Sever, president of the European Federation of Journalists, told SEE Check, “Content regulation on social media is becoming increasingly complex due to the intertwining of political interests and the influence of large tech companies.” 

The DSA’s Blueprint

By contrast, the European Union’s Digital Services Act (DSA), adopted in 2022, offers a rights-based framework for managing digital harms. It introduces obligations for algorithmic accountability, transparency in moderation, and risk assessments—especially for Very Large Online Platforms (VLOPs). Unlike many national efforts, the DSA doesn’t target individual content or users but instead holds platforms accountable for systemic risks, such as disinformation, gender-based abuse, and threats to democratic processes.

Croatia’s organization GONG sees this shift in responsibility as essential. As Svjetlana Knežević explained, “This regulation aims to place responsibility on large platforms for the content they host, not just on users who produce or share it. As anonymous and political actors increasingly use social media to spread lies and manipulate elections, it’s crucial to impose obligations on Big Tech to proactively and continuously remove false information and curb the reach of those who spread it.”

However, none of the Western Balkan countries have adopted DSA-aligned legislation, leaving a fragmented landscape vulnerable to both corporate neglect and government overreach.

A Region Still Unprepared

Despite initial interest from countries like Albania and Bosnia and Herzegovina in establishing oversight bodies similar to those outlined in the DSA, no country in the Western Balkans has taken concrete steps toward comprehensive implementation. According to a multi-country study by Partners for Democratic Change Serbia, efforts to align national legislation with the DSA’s principles have been fragmented and incomplete. Instead of adopting a unified framework that addresses platform accountability, algorithmic transparency, and user protections in a systemic manner, these countries have made only isolated updates to existing laws, such as those governing audiovisual media or consumer protection. These piecemeal reforms fall short of the comprehensive regulatory model the DSA proposes, leaving significant gaps in oversight and enforcement.

The study underscores that while basic protections exist, such as limited intermediary liability and data protection, social networking platforms and search engines remain largely unregulated. Many of the same loopholes that enabled the spread of manipulated images in Petković’s case persist throughout the region.

According to Snežana Bajčeta from the Serbian digital rights organization SHARE, “The effectiveness of digital regulation in the region is questionable because the digital market here is underdeveloped and non-competitive compared to Europe’s.” She stresses that joining the EU’s Digital Single Market would be a crucial step forward, enabling the creation of consistent rules and standards needed for the realization of digital rights. 

“Only in a well-regulated environment with clear rules and standards can we talk about freedoms and the potential for fully realizing them,” Bajčeta notes.

Even where civil society initiatives exist, political will remains weak. Bosnia’s Coalition for Freedom of Expression and Content Moderation, coordinated by Mediacentar, has submitted an initiative to the Ministry of Communications and Transport seeking to develop a harmonization strategy. However, as Mediacentar told SEE Check, work on the document has not yet started.

Legal expert Nasir Muftić from the University of Sarajevo echoes the concern: “We don’t have a developed legal framework in this regard, mainly due to a lack of political will. We’ve failed to address even simpler problems, like regulating electronic communications and emerging technologies, which remain outside our core communications law.”

The AI Disinformation Spiral

The rise of generative AI brings new challenges not only for privacy but for the integrity of public discourse. According to Maida Ćulahović, digital rights expert from Why Not, the risks tied to generative AI, including deepfakes, fake bot networks, and algorithmic flows that amplify disinformation, must be addressed through regulation that understands the systemic nature of digital manipulation.

This need is especially urgent as platforms show little interest in adapting content moderation to the region. As the Coalition in Bosnia points out, mechanisms for reporting and appealing harmful content are often broken, poorly localized, or entirely absent.

“The development of new technologies, such as artificial intelligence, certainly contributes to the complexity of these challenges and the potential responses to them,” they note.

Bajčeta also emphasizes that civil society remains the main driver of digital policy innovation. Still, she warns that real progress requires a “comprehensive, structural approach to disinformation production and dissemination, grounded in effective mechanisms for suppression,” instead of ad hoc responses.

The DSA’s Promise—and Its Absence

The DSA introduces exactly this kind of structural model. It shifts the burden away from users like Adrijana Petković and places it on platforms to identify and mitigate risk. It calls for independent oversight bodies—Digital Services Coordinators (DSCs)—with the authority to demand compliance and transparency.

In Croatia, this role was assigned to the Croatian Regulatory Authority for Network Industries (HAKOM). As the national DSC, HAKOM is now responsible for coordinating efforts to regulate digital services and ensure compliance with the DSA. This includes working closely with other state bodies and maintaining international cooperation with the European Commission and EU DSCs. HAKOM has been granted investigative and enforcement powers over service providers in Croatia and is actively clarifying its role to the public, noting that “people still mistakenly believe the coordinator’s job is to manually search social media for harmful content,” as it stated in an email to SEE Check.

However, while HAKOM’s jurisdiction covers Croatia, content originating from neighboring countries that share the same language, such as Bosnia and Herzegovina, Serbia, or Montenegro, remains a critical blind spot. Content hosted by service providers in these non-EU countries can still reach Croatian audiences, potentially circumventing the DSA’s regulations. As HAKOM explains, while Croatian authorities can issue orders to platforms like Facebook under Article 9 of the DSA, the scope of enforcement is limited to Croatia and potentially other EU member states. “The fact that the DSA does not apply to service providers based in, for example, Bosnia and Herzegovina, would certainly affect the ability to seek legal protection under the DSA,” HAKOM states.

Without independent, well-resourced DSCs in the Western Balkans, and without harmonized legislation, the region risks repeating the fragmented and reluctant implementation seen during the rollout of the General Data Protection Regulation (GDPR).

As Nasir Muftić from the University of Sarajevo points out, it’s not only about local capacity but also structural power: “The question is whether a small country can truly regulate the tech giants. That’s the core issue behind it all.” He argues that coordinated, supranational enforcement—through the EU and aligned legal mechanisms—is crucial. “I don’t believe individual states, no matter how much effort or resources they invest, can achieve the same effect on their own.”

To avoid a growing digital divide, civil society, academia, and independent media must be included in policymaking from the start.

From Legal Vacuum to Platform Impunity

The case of Adrijana Petković and the surge in image-based sexual abuse across the Western Balkans show the urgent need for comprehensive, enforceable digital regulation. The DSA provides a model for rights-based, systemic oversight, but it will only work if countries in the region summon the political will and institutional capacity to act.

In countries like Bosnia and Herzegovina, Serbia, Montenegro, and North Macedonia, current legal frameworks are fragmented, outdated, and ill-equipped to respond to the speed, scale, and sophistication of today’s digital threats. As highlighted by civil society organizations and legal experts, this vacuum enables both inaction from platforms and harmful overreactions from governments – from vague public order laws that criminalize “disinformation” to politically motivated content moderation.

Correction: An earlier version of this article misstated the organization coordinating the Coalition for Freedom of Expression and Content Moderation. The correct organization is Mediacentar, not Why Not.
