How Russia is attacking the European Union with disinformation

“Disinformation, unfortunately, is not illegal. Therefore, we must do everything possible to make its dissemination more expensive and more difficult, because we know that Russia will continue to spread it for as long as it can,” says Martyna Bildziukiewicz, the head of the East StratCom Task Force, in an interview with Maciej Makulski.

In 2018, the European Union adopted the Action Plan against Disinformation, which enabled a more coordinated and consistent response to this threat. How do you assess the development of EU actions against hostile online activities by Russia and other countries?

Nearly five years after its adoption, we can consider it a landmark moment. It was the first document in which all the major European institutions declared the need to work together to overcome the challenge of disinformation. I remember very well how it all started: it was my third day as a member of the East StratCom Task Force when the document was adopted. Its significance cannot be overstated; it defines the work of my team and of others who contribute to the fight against disinformation in the EU. It was also the first official document to state the need for close cooperation with social media platforms. That was the moment the EU recognized the threat. It is quite telling that such a decision was made just a few months before the 2019 European Parliament elections.

Did the Action Plan influence the work of your team?

An important element of the Plan is the European Union’s commitment to hire 27 local strategic communication specialists in different parts of the world. Their task is to help EU delegations understand the impact of disinformation in their host countries and what can be done at the local level to counter the threat effectively. They currently work in three regions: the Eastern Partnership countries, the Western Balkans, and the Middle East and North Africa. Typically, they come from the country or region they cover, which is important because they know the language and understand the local context.

Have there been any other significant steps taken at the EU level following the adoption of the Action Plan?

There have been other important documents worth mentioning. One of them is the European Democracy Action Plan, championed by Věra Jourová, the European Commission Vice-President for Values and Transparency. It addresses, among other things, disinformation in the context of elections and freedom of expression. Its relevance became critical when Russia attacked Ukraine last year. In addition, during the pandemic the European Commission and the European External Action Service recognized the risks of disinformation related to COVID-19 and its impact on public health. We witnessed those risks on a large scale during the first year of the pandemic.

How did social media platforms react?

Here we come to another breakthrough: the Digital Services Act, adopted in 2022. It introduces a range of new rules for social media platforms, pushing them to take more responsibility for tackling information manipulation. The Code of Practice on Disinformation, agreed in 2018, around the same time as the Action Plan, is also worth mentioning. It is a form of cooperation between the European Commission and the social media platforms based on voluntary commitments: under the Code, the signatories regularly report on their efforts to combat disinformation. So what was once difficult to imagine is now happening, and it has inspired similar initiatives in other regions of the world.

How many platforms have committed to adhering to the principles of the Code?

All the major industry players have joined it. I think one of the challenges has been that we, as representatives of the EU as a regulator, speak a different language than the world’s largest corporations. When a platform like Google informs us that it has detected instances of disinformation, it may mean something different from what Facebook tells us about the disinformation campaigns it has encountered and the activity it defines as coordinated inauthentic behavior.

Therefore, one of the key conclusions during the initial stage of cooperation was that we need to use a common language. We need a set of terms that we agree upon and use in the same way.


Has your team found a solution?

My colleagues at the European External Action Service proposed complementing the term “disinformation” with a broader one: “Foreign Information Manipulation and Interference” (FIMI). More and more people and institutions are starting to use it. I believe that using it is better than having to define and describe the specifics of the problem every time we encounter it.

What is the difference between FIMI and disinformation?

There are various actors, including states, that employ many different methods and means of manipulation. We therefore needed a more comprehensive term, broader than “disinformation,” which is just one component of FIMI. We realize it may sound like something only an EU institution could come up with, and that we are adding yet another acronym. But we deemed it necessary, because words matter here: if we don’t speak the same language, we won’t be able to take any measures.

It seems that legal norms for combating FIMI have already been established. What should be the next steps?

After Russia’s attack on Ukraine last year, sanctions were introduced for disinformation and information manipulation. This had never been done before, but we had seen how Russia repeatedly used its entire ecosystem of disinformation and information manipulation to justify its aggression and war crimes. The first proposals came from the European Commission and the European External Action Service, and the concept was later adopted by the Council. All member states unanimously supported the decision to block RT (formerly Russia Today) and Sputnik, the two largest Russian propaganda operations outside its borders. Subsequently, sanctions were imposed on seven other media outlets and on dozens of individuals directly linked to the Kremlin’s disinformation ecosystem. These are significant changes that would have been unimaginable even a month before the invasion.

Why was it so difficult to impose sanctions before the invasion?

I remember many discussions on this topic in which the main argument was always freedom of speech: because of our values and principles, we cannot ban any source of information. And indeed, we hold and defend those values! Nobody disputes that. The point is that the Russian outlets I mentioned are not media. They do not adhere to any journalistic standards. I don’t even like using the term “media” to describe them; they are simply channels for dumping Kremlin propaganda. We finally realized that freedom of speech also means protecting our information space from attempts to manipulate it.

What about the resources available at the EU level to combat disinformation? There is a view that the available financial and human resources are insufficient. Has the development of the legislation we discussed earlier been accompanied by bigger budgets and larger teams, such as the East StratCom Task Force?

We need more resources, and as long as we are fighting a behemoth, we will always need more. However, compared to 2018, my team has become much stronger. We have more funding and more people: thirteen of us, focusing primarily on Russia as the main actor in information manipulation, as well as on the Eastern Partnership countries and Central Asia. It may not sound like much, but when I started working here in December 2018 as a team member, I had only a few colleagues. We were part of the Strategic Communications Department, which was much smaller than it is now. Today, the department has over forty people working on Russia, as well as on China and other countries. There is also a separate team that analyzes data, and units that work with specific regions seen not as sources of disinformation but as its targets. We now have a global perspective and more tools at our disposal.

Are people aware of the growing threat of disinformation?

Since last year, awareness of Russia’s and China’s efforts to justify the war has increased at the highest political level. The same applies to the general public; we see it on all our online channels. Overall awareness has grown, but unfortunately it took a pandemic and a war for many people to realize that disinformation can literally be deadly.

How do you assess the impact of the work of the East StratCom Task Force?

That’s the million-dollar question. When you work in strategic communications, you can look at whether you have gained followers and how often people react to what you want to convey. Here we see at least a tenfold increase in interest in our work compared to before Russia’s invasion of Ukraine. Measuring impact is harder, because we are dealing with operations that target the human mind, which means we often confront deeply entrenched beliefs that are difficult to study. We conduct surveys in the Eastern Partnership countries that let us test specific disinformation content and evaluate whether people believe it and why. This at least allows us to assess whether disinformation works and whether the efforts of the community fighting it have any effect.

How should we respond to disinformation? Visiting the East StratCom Task Force website, one gets the impression that fact-checking is the main tool. Unfortunately, reliable information never reaches the same audience as the manipulations do, so other tools are needed to make the fight against disinformation more effective.

There will never be a one-size-fits-all solution, so we must use a wide range of tools; only then does our work make sense. It also matters who is using those tools. Everyone present in the information space has some work to do. That means thinking about the information we consume, asking whether we really need to consume so much of it, and verifying its sources. The fact-checking community and journalists constantly remind us of this.


Everyone has a role to play, from individuals to civil society, governments, and international organizations. This also applies to the private sector and social media. From this perspective, fact-checking is one of the key tools, and I agree that the field faces many challenges. Verified information will never be as popular as a manipulated narrative. Moreover, the work of fact-checkers is not widely appreciated: a study conducted in Poland before the Russian invasion of Ukraine revealed a large gap between the share of respondents who said, “I support fact-checkers and understand the importance of their work,” and the share who could name at least one fact-checking organization. Much more needs to be done to popularize this type of work.

In addition, we have a database of disinformation messages collected over the past eight years. With its help, we can show when Russia has used certain narratives, for example that Ukrainians are Nazis, and demonstrate that they are a recurring element of Russian tactics.

Is the goal to create a publicly accessible tool that will make everyone a debunker?

Yes, and the tool should be available in different languages. Whenever someone notices something suspicious in their media, they can use our tools, as well as those of other organizations, to check it and establish the facts.

Building resilience is the most important part of countering disinformation. This means investing in the ability to react quickly. In this regard, we can rely on fact-checkers to assist us. However, we must constantly stay updated on new tactics and methods that Russia and others employ to manipulate us.

How can we enhance this resilience?

By resilience I mean, for example, the training we provide to journalists and representatives of civil society organizations. We equip them with open-source intelligence and data-analysis tools so that they can track disinformation narratives and understand how they spread. We see this as a long-term investment: when journalists and non-governmental organizations acquire such knowledge, they pass it on to their communities, and it spreads further. As a result, we all become more resilient.

And what can states and international organizations do?

One of the key tools available to states and international organizations is a coordinated diplomatic response. The G7 has issued statements calling on Russia to cease its disinformation practices. Today this may seem like a soft measure, but a few years ago even that did not happen. As a community of like-minded nations, we are sending a message: “We see you, Russia, and we know how to counter your methods.”

I also want to mention the newest element of our response: creating obstacles, with sanctions as the prime example. We aim to make spreading disinformation as difficult and costly as possible.

Why is it so difficult to fight disinformation?

Most of the practices Russia and other actors use to manipulate users do not break the law; disinformation is not illegal. Therefore, we must do everything possible to make it more expensive and more difficult to spread, because we know that Russia will keep spreading it for as long as it can.

How do you see the next steps in strengthening the response to disinformation?

The most logical step is to strengthen current operations. This process has already begun, and we need to focus on implementation rather than on inventing something new. I know that may not sound very exciting, especially to media representatives who expect flashy announcements, and that is entirely understandable. But I am not opposed to moving into a more mundane phase, in which we can see how the different tools are working and keep improving them.

Originally published by Nowa Europa Wschodnia. Translated and edited by UaPosition, a Ukrainian news and analytics website.

