Two weeks ago, a leaked audio recording obtained by the Organized Crime and Corruption Reporting Project (OCCRP) and the Serbian investigative outlet KRIK confirmed that the CEOs of the state-owned Telekom Srbija and the Netherlands-based United Group discussed plans to remove the chief executive of United Media, a subsidiary of United Group that operates several outlets in the country, including the leading opposition outlet N1 — potentially at the behest of Serbian president Aleksandar Vučić.
This has raised serious concerns over the state of media freedom in Serbia — especially as independent journalists are already facing increased violence for reporting on ongoing anti-corruption protests.
At this perilous moment for media freedom, Serbia, an EU candidate, is preparing to adopt three of the EU’s digital regulations: the Digital Services Act (DSA), the Artificial Intelligence Act (AIA), and the Media Freedom Act (MFA).
The DSA governs online platforms, such as Facebook, WhatsApp, Instagram, and TikTok.
The AIA will categorise AI systems by the risk they pose to fundamental rights and require corresponding levels of transparency and accountability from developers and deployers.
The MFA requires additional steps from online platforms when moderating content from media service providers. All three laws will apply when social media companies use AI to help with content moderation.
While the MFA aims to strengthen media freedom by ensuring plurality and access to reliable information, the DSA and the AI Act aim to create safer and more transparent online spaces while protecting citizens' fundamental rights.
And an essential part of the DSA’s effective implementation is the designation of an independent Digital Services Coordinator (DSC), a national authority tasked with overseeing platform compliance. The DSC will designate and oversee trusted flaggers, entities that report illegal content directly to platforms. Notices from these flaggers will be given priority, as social media platforms are obligated to create dedicated channels to process them efficiently.
As the Center for News, Technology & Innovation (CNTI) has previously established, however, oversight authorities have a great deal of power that can lead to unintended consequences. For the DSA to be implemented in the spirit of the EU’s goal, the DSC, and therefore the trusted flaggers, must not be influenced by the government.
However, achieving sufficient independence from the ruling party will be challenging in Serbia.
Early indications from the Serbian government suggest that the Regulatory Authority for Electronic Media (REM) is likely to become the country’s DSC.
This appointment has raised significant concerns among civil society organisations, independent journalists, and international observers, who warn that the REM lacks independence from the ruling Serbian Progressive Party (SNS) and has already contributed to the country’s decline in press freedom.
Since November 2024, the Serbian government has responded to massive student-led, pro-democracy protests and blockades with greater restrictions on media freedom.
Political and business elites already control or influence the majority of mainstream media outlets, which they have used to spread political disinformation during the protests.
As a result, independent journalists, whistleblowers, and civil society activists increasingly rely on digital platforms to reach the public and disseminate information that traditional media may ignore or suppress.
If the REM gains control over digital platforms through the DSA, however, it could require companies to delete legitimate content that does not fit the party line, and independent news media could lose social media as a channel of communication.
Therefore, in the current Serbian context, there is a risk that EU legislation could be misused as an instrument of repression.
For Serbia’s implementation of the DSA to have a positive impact, the EU can take steps to ensure that its digital policies strengthen, rather than undermine, freedom of expression, media freedom, and democratic resilience.
One way this could be achieved is through the integration of Serbia and other Western Balkan countries into the EU Digital Single Market before EU accession, which is a step Serbian civil society organisations, such as Partners Serbia, advocate for.
If Serbia joined the EU Digital Single Market, the EU would have a greater oversight role than it does over countries that are merely candidates for full EU membership.
Another possible route to preserving the intended goals of these EU policies is the interplay of multiple policy instruments.
The AI Act’s human oversight requirements can help protect the information environment if the independence of local DSCs is compromised.
Under the AIA’s risk classification system, automated content moderation systems used by Very Large Online Platforms (VLOPs) are likely to be designated as high-risk applications due to their potential to significantly affect individuals’ rights, such as freedom of expression, access to information, and non-discrimination.
Consequently, these systems will be subject to additional requirements under the AIA, including the need for online platforms to establish risk management frameworks, ensure quality data governance, and maintain effective human oversight.
Meaningful human oversight can be helpful in environments lacking adherence to democratic principles, where regulatory authorities may be captured or lack independence.
If the government influences the DSC, there is a heightened risk that content moderation practices, including notices sent by trusted flaggers, might be used as a tool for political censorship rather than in the public interest. With human oversight in place, content flagged by trusted flaggers would be reviewed not only by AI systems but also by people at the social media companies before it is removed.
An additional safeguard for media services using online platforms is introduced in the MFA. One of its key objectives is to prevent the removal of legitimate content published by media service providers. Article 18 of the MFA requires that VLOPs treat media service providers with preferential consideration during content moderation.
Article 17 adds further protection: if a VLOP intends to restrict content from a media service provider, it must inform the provider at least 24 hours in advance and allow them to present their case.
For countries like Serbia, these provisions could play a critical role in safeguarding independent or government-critical media outlets from excessive takedowns under the DSA.
However, there are risks.
The status of “media service provider” is determined by self-declaration. (VLOPs can challenge this status, but they would have to refer the matter to the national regulatory authority, which in Serbia would be the REM. The regulatory authority cannot initiate these challenges itself, so it would be difficult for it to use this policy tool to penalise dissident media.)
As a result, actors wishing to spread disinformation or illegal content could exploit this loophole, allowing content that would normally violate community guidelines to remain online for at least 24 hours, if not longer.
In other words, the MFA is more likely to protect media content that violates guidelines in an environment like Serbia’s than to remove media content that does not.
The implementation of the DSA in Serbia and other countries lacking an independent DSC risks becoming an EU-based tool for state repression, enabling ruling parties to suppress dissent under the guise of compliance.
However, admission to the EU Digital Single Market and the interplay of the DSA, the AI Act, and the MFA can serve as safeguards, helping prevent the removal of legitimate content and protecting critical voices and independent media in Serbia — thus helping Serbia and other Western Balkan nations realise the democratic goals of the DSA.
Emily Wright is a research assistant at the Center for News, Technology & Innovation. She was previously a Fulbright research fellow and research associate at Partners Serbia in Belgrade, where she focused on the implications of new digital policies for civil society activists and investigative journalists.
Ana Toskić Cvetinović is the executive director of Partners Serbia.
Milica Tošić is a legal advisor at Partners Serbia.