Opinion
EU 'all bark and no bite' on disinformation
"The dogs bark but the caravan moves on", goes the old Arabic proverb. The European Parliament elections are done and dusted, voter turnout went up, and all seems fine.
The Brussels elite can finally devote time to real issues such as the semantics of the titles of new commissioner portfolios.
Disinformation and election interference, some of the most pressing worries earlier in 2019, seem settled and perhaps not deserving our attention.
However, a recent study by the Oxford Internet Institute begs to differ. Disinformation is alive and well. There is evidence of organised social media campaigns in 70 countries globally this year, up from 48 in 2018.
One of the main conclusions is that "computational propaganda is becoming a pervasive and ubiquitous part of everyday life."
It also seems that the list of suspects orchestrating foreign influence campaigns is growing. The likes of China, Iran, India and Saudi Arabia are also tapping into Russia's disinfo playbook.
Manufacturing consensus
The situation is exacerbated by the fact that online propaganda tools are not only used by foreign actors. Domestic political or civil organisations are employing strategies for antagonising social groups, suppressing voter turnout or discrediting political opponents.
In many cases, these internal disinfo campaigns do not share fake content per se but promote particular stories or catchy memes in order to contribute to a specific narrative.
As a result, radical or divisive content becomes over-represented and creates the impression that such opinions are shared by many. Essentially, these are conscious strategies for manufacturing consensus and reinforcing strong emotions and political identities.
Worse still, the paid targeted-advertising tools offered by digital companies ensure that these divisive narratives reach the audiences most susceptible to them.
Smart ads can be so effective that we now wonder whether we can agree even on basic facts. Talk about getting a bang for your buck.
Much Ado About Nothing
Last year the European Commission pushed tech giants into signing a landmark self-regulatory Code of Practice on Disinformation.
Throughout 2019 the Commission provided monthly reports on the progress of digital platforms to increase transparency in political advertising and share their progress on ensuring the integrity of their services against disinformation.
Progress is lukewarm. Facebook touted its new ad library as a move to increase transparency. In fact, the ad library remains limited and riddled with technical glitches.
Mark Zuckerberg's additional personal pledge to provide improved access for researchers to study the disinformation campaigns on his platform remains mostly a promise.
External experts and academics hit a brick wall, and even donor foundations are reconsidering their involvement due to limited cooperation on the part of Facebook.
This doesn't mean that Twitter or YouTube are off the hook.
A recent research investigation showed that all the tech platforms have made more than 125 public announcements in the past three years describing how they would solve online manipulation through self-regulation.
The results? Changes to their own opaque algorithms have proved "inadequate to curb the spread of low-quality content". And when it comes to organic content, their algorithms still make sensationalist or polarising posts more visible in news feeds even if they are not sponsored.
Self-regulation doesn't deliver
There are a number of ugly truths. The first is that self-regulation for digital companies doesn't work.
In the new mandate, the Commission should use the upcoming 2020 review of the code of practice to propose binding legislation that makes digital companies liable for particular content on their platforms.
Facebook and the rest should be told plainly that buggy ad libraries and limited transparency are just a fig leaf. Tech companies should be obliged to give qualified researchers proper access to their data in order to study disinformation patterns.
The second truth is that neither European nor national institutions are taking ambitious steps.
The funding allocated to the EU's East StratCom task force to address Russia's disinformation was just over €1m in 2018. This is so small that it amounts to a rounding error in the EU's total budget.
The notorious disregard of EU foreign affairs chief Federica Mogherini for the work of this strategic team is also a major blunder.
Sadly, national capitals have shown little enthusiasm for effective cooperation. The specially established Rapid Alert System for synchronising member state response on disinformation is in a deplorable state.
Organising conferences or sharing best practice manuals just won't do. Additional financial and institutional resources have to be urgently deployed. Moreover, governments should have a wider toolkit for direct response to hybrid threats and allow for adequate countermeasures against foreign actors.
Lastly, European politicians might publicly bash digital platforms but continue to spend like crazy on digital advertisements during campaigns.
National electoral laws are stuck in the 20th century and digital ads roam free in an unregulated space where transparency and accountability are almost nonexistent.
The UK electoral commission, for example, has been advocating on this issue for more than a decade.
National electoral rules should reflect the new realities by setting clear rules and corresponding penalties, especially during election campaigns.
Don't get me wrong. The EU has been leading a laudable effort to pioneer measures against disinformation. But these measures remain either soft or simply fail to deliver results. Much more is needed when democratic integrity and societal unity are at stake.
European and national institutions should stop barking at the disinfo caravan.
They need to bite.
Author bio
Dimitar Lilkov is a researcher at the Wilfried Martens Centre for European Studies.
Disclaimer
The views expressed in this opinion piece are the author's, not those of EUobserver.