Russia, Disinformation and Europe’s Battle for Truth

Eleni Kallea | 16 May 2024

The Russian president, Vladimir Putin, says he agrees with Vladimir Lenin on everything except 'self-determination'. Propaganda, evidently, is among the points of agreement.

At the end of last month the European Commission launched investigations into Facebook and Instagram—both part of Mark Zuckerberg’s Meta—over their handling of disinformation stemming from foreign countries, particularly Russia. The move comes amid concerns about Russian interference ahead of the European Parliament elections in June.

The commission suspects the platforms of failing to limit the spread of falsehoods and co-ordinated foreign manipulation, potentially violating the Digital Services Act. Meta is accused of not adequately controlling deceptive advertisements, disinformation campaigns and co-ordinated bot farms. This investigation marks the fourth under the DSA, following actions against TikTok and other platforms.

Sophisticated campaigns

On March 2nd 2022, the European Union imposed sanctions on several Russian media outlets and holdings—including NTV, Rossiya 1, RT, Sputnik, National Media Group and VGTRK—suspending their transmission and broadcasting within the EU following the full-scale invasion of Ukraine. The EU justified these measures as necessary to protect public order and security: this was not censorship, it said, but a set of targeted measures consistent with fundamental rights and freedoms. The legal basis lay in the Treaty on European Union (article 29) and the Treaty on the Functioning of the European Union (article 215), which set out the general framework for EU sanctions.

Numerous reports and analyses indicate that Russian entities, including state-sponsored actors, have indeed been involved in spreading disinformation. This has been observed not only in the context of the war with Ukraine but also in other geopolitical situations and in attempts to influence elections and public opinion in various countries.

The Russian government has been accused of operating sophisticated campaigns aimed at shaping narratives, sowing discord and undermining trust in democratic institutions. These often involve the use of ‘social media’ platforms, fake news websites and other online channels to spread false or distorted information and conspiracy theories.

Variety of tactics

Russia employs a variety of tactics to spread disinformation in the EU and beyond. These include:—

• propaganda: state-controlled media outlets, such as RT (formerly Russia Today) and Sputnik, promote pro-Russian narratives and advance the Kremlin's agenda;

• Doppelgänger: 'clones' of at least 17 authentic media outlets—including Bild (Germany), 20minutes (France), ANSA (Italy), the Guardian (Britain) and RBC-Ukraine—target users with fake articles, using domain names similar to those of the imitated media and copying their designs;

• sleeper sites: websites slowly build an audience via unrelated posts before switching to disinformation at an appointed time;

• 'social media' manipulation: platforms are used to amplify disinformation through co-ordinated campaigns, false accounts and automated bots, and

• cyberattacks and hacking: allied to data breaches to steal information, these disrupt communications and undermine democratic processes.

These tactics are often employed in combination to create campaigns aimed at achieving strategic objectives, such as weakening the EU’s cohesion, undermining trust in institutions and advancing Russia’s geopolitical interests.

Rebutting the attacks

What tools does the EU have to rebut these attacks? The EU vs Disinfo initiative, launched by the European External Action Service in 2015, aims to counter disinformation targeting the union. It offers a searchable database of more than 12,000 samples of pro-Kremlin disinformation.

In 2018, the commission introduced a voluntary Code of Practice on Disinformation, which encourages online platforms to take measures to combat its spread. In June 2022, a strengthened code was signed by 34 signatories that had taken part in its revision, including major technology companies such as Meta, Google and TikTok.

The European Digital Media Observatory, established in 2020, is a network of fact-checkers, researchers and academic institutions working to monitor and analyse disinformation trends in Europe. The EU also operates a Rapid Alert System, which enables member states to share information about disinformation incidents and co-ordinate responses.

The EU provides funding for research and innovation projects aimed at countering disinformation and enhancing media literacy. This includes projects under the Horizon 2020 and Horizon Europe programmes, as well as initiatives such as the European Innovation Council pilot.

Finally, the EU has legislated in this arena, with the DSA and its companion Digital Markets Act. These aim to establish clear rules for online platforms, enhance transparency and accountability, and strengthen users’ rights in the digital environment.

Informed discourse

Citizens too, though, can play their part. By supporting independent journalism and diverse media ecosystems, we can reduce the influence of disinformation. By exploring a range of perspectives and sources, we are less likely to be exposed to echo chambers and filter bubbles.

Encouraging critical-thinking skills from an early age can help young people become more discerning consumers of information: teaching them to question assumptions, consider multiple viewpoints and seek out reliable sources can help secure them against disinformation. Finally, by actively engaging in democratic processes, such as voting, community organising and political activism, we can help counter disinformation by promoting informed civic discourse and democratic values.

Eleni Kallea is a Brussels-based EU affairs consultant and communications specialist, currently working as a liaison officer for DG COMM at the European Commission. She holds an MA in translation and an MSc in information and communication systems engineering.

This article was originally published on Social Europe.
Views in this article are the author's own and do not necessarily reflect CGS policy.


