From Eastern Flank to Western Elections: Russian Operations Against the EU and NATO

While Russian influence operations have a history that long predates the concept of hybrid warfare, they have seldom received as much attention as they do today. This chapter investigates the evolution of Russian hybrid means and the ends to which they are applied. These operations are currently the Russian leadership’s best bet against the collective West. The core of countering them lies in changing the Russian cost-benefit calculus, which so far suggests that conducting these operations carries great rewards and fairly small risks.

1. Russian Goals and the Limits of Russian Power

At their foundations, the European Union (EU) and NATO are based on a political agreement that acknowledges mutual interests, security, and support. That political unity is the focal point of the conflict between Russia and the West (here shorthand for the EU and NATO). Russia stands little chance of prevailing in this conflict against a committed and united West. Conversely, it has a great opportunity to succeed against a divided one. Unfortunately, many exploitable divisions exist. Fragmenting the West’s political unity lies at the core of Russia’s strategy as it seeks to promote economic ties with some European states, such as Germany and France, while isolating and provoking others. This chapter focuses on Russian influence operations against both the EU and NATO as well as individual member states.

The Russian leadership’s ability to achieve its goals — regime security and great power status through weakening the EU and NATO — comes from its power of destruction rather than its power of attraction. Russia has few true allies in the world. Even China, which shares an increasingly close relationship with Russia, cannot be considered an ally.1

The United States and Europe have seen increasing political polarization and decreasing trust in democracy over the past few decades.2 Russia has sought to exploit this fact through its support for far-right and populist actors and movements that are united in their opposition to the EU, NATO, or simply “the establishment.”3 This enables Russia to have a destructive influence over processes that are already contentious.

Elections and referenda, which, in essence, are processes to settle political contention, are particularly vulnerable. The core idea of popular votes is that even if one’s own side does not win, one accepts the outcome because the process was free and fair. There is, therefore, an incentive for Russia to interfere in these processes in order to sow doubt about their fairness and create instability.4 The European Parliament accurately summarized the goal of Russian cyber operations targeting the EU as: “distorting truths, provoking doubt, dividing Member States, engineering a strategic split between the European Union and its North American partners and paralyzing the decision-making process, discrediting the EU institutions and transatlantic partnerships … undermining and eroding the European narrative based on democratic values, human rights and the rule of law.”5

2. Hammers: Russia’s Hybrid Tools

While hybrid warfare is by nature a broad effort, the Russian approach goes beyond a Western whole-of-government approach to include organized crime, cyber privateers, and intelligence services with global reach and impressive coordination. According to one estimate, at least six presidential administration departments and a series of presidential councils in Russia are involved in the active measures campaign.6 This shows both the variety of Russia’s hybrid warfare and the impressive machinery it has for exercising control. It poses a particular challenge for the EU and NATO, which are carefully bound by their respective mandates and divided into separate domains of competence. Moreover, different sectors within the EU and NATO have a hard time cooperating even in the best of circumstances, let alone against an actor that uses everything from licit and illicit finance to hackers, media outlets, and intelligence services.

The information domain is the most important venue for hybrid warfare. The revolution in information and communications technology has been one of the most profound societal changes in a long time. Today, a large part of how we understand the world, power, and legitimacy is mediated through social media.7 Therefore, “the process of collecting and organizing information is now a tremendous source of economic, political and cultural power.”8 This shift is, naturally, no secret to Russian strategists who have done their utmost to update their disinformation toolbox.

Russia’s vulnerabilities in the information domain have been exposed on several occasions in the past. Chechen separatists successfully used the internet in a propaganda war against Russia during the Second Chechen War; Russia’s image took a beating in the global media when it invaded Georgia in August 2008; and the Russian leadership was caught unaware by the pro-democracy Arab Spring in the Middle East and North Africa and by the massive anti-government protests these uprisings inspired following the Russian elections in 2011. However, each failure was followed by adaptation and innovation: after the Chechen wars, Russia increased internet restrictions and surveillance; after the Georgian war, Russia’s state-controlled television network, RT, extended its global reach to include Arabic, Spanish, and French audiences; and, after the Arab Spring, Russia expanded censorship of social media9 and adopted new legislation, such as a treason law that targets human rights activists and limits freedom on the internet.10

Russia’s countermeasures were not limited to defense. Its offensive toolbox was enhanced as well. Following the Arab Spring and the concomitant protests in Russia, the first reports referring to the Internet Research Agency (IRA), the St. Petersburg-based troll farm, emerged.11 The Russian leadership used the IRA to conduct an offensive against its domestic opponents (e.g., Russian opposition leader Alexei Navalny as early as 2013), but also international opponents (e.g., the United States’ 2016 presidential election).12

In targeting the United States, Russia ran fake social media accounts pretending to be everything from alt-right voices to Black Lives Matter activists. The goal was to increase polarization and violence. Evidence of Russian meddling in the 2020 U.S. elections is now also emerging: a number of reports have described the use of fake Instagram accounts to discredit then U.S. presidential candidate Joseph R. Biden, Jr.13

Moreover, the Russian disinformation machinery has sought to amplify the voices of QAnon, a conspiracy theory collective that believes that former U.S. President Donald J. Trump is the guardian against a coup and that Hillary Clinton and her allies are running sex-trafficking rings.14

The Russian disinformation machine often maintains a high degree of coherence across channels and regions in terms of key messages, if not always in delivery. This is largely the result of coordination from the top by Dmitry Peskov, Russian President Vladimir Putin’s press secretary. Peskov’s weekly meetings with representatives of pro-Kremlin media outlets are combined with guidelines for social media farms and foreign embassies.15 This coordination is what allows for coherence (although never flawless) between the different arms of the Kremlin machinery.

Another less noticed, but no less effective, way of pushing Russian narratives against the EU and NATO is evident in the Western Balkans. The Western Balkans today form the front line between the EU and NATO on the one hand and Russia on the other. The states of the region are on a steady, but slow, path to integration with the EU and NATO. Up until 2014, the Russian leadership did not seem to care much about this, but after Russia invaded Ukraine that year, its ambitions in the region grew. Montenegro, in particular, signed its NATO Accession Protocol in May 2016 and joined the Alliance in June 2017. Not coincidentally, Montenegro experienced an increase in cyberattacks, both in sophistication and in numbers, from 22 in 2012 to more than 400 in 2017.16

Russian leaders wanted to make an example of Montenegro for states pondering NATO membership. Around the time of the Montenegrin parliamentary elections on October 16, 2016, pro-NATO and pro-EU political parties, as well as civil society groups and electoral monitors, were targeted by large-scale distributed denial-of-service (DDoS) attacks. The cyberattacks were traced to APT28, also known as Fancy Bear, a hacking group with ties to Russia’s military intelligence service, the GRU.17 There was also a coup attempt ahead of the elections that sought to topple the government and assassinate then-Prime Minister Milo Đukanović. The coup plotters were identified as GRU officers Eduard Shishmakov (who operated under the alias Shirokov) and Vladimir Popov.18 They were indicted in 2017 along with 12 other people with Russian, Serbian, and Montenegrin citizenship.19

The cyberattacks, intelligence operations, and subversion in Montenegro should be seen in conjunction with Russia’s larger information offensive against the Western Balkans. A key instrument in that offensive has been Sputnik Serbia (Srbija), which has focused on providing pro-Russian, anti-EU, and anti-NATO narratives. Sputnik has been successful because it allows free reproduction of its articles, which are therefore widely republished by outlets with few resources.20 In other words, Russian disinformation succeeds not so much through illegal methods as by exploiting the opportunities presented by the structural transformation, or crisis, of the media.

It is hard to assess the aggregate impact of this disinformation, but some examples are illustrative. For instance, 42% of Serbians see Russia as their country’s best partner, compared with 14% who name the EU. This is the case even though Russian trade with and aid to Serbia are just a fraction of the EU’s.21

The case of Montenegro provides a vivid example of the combination of offline and online tools used by Russia, and the Russian leadership’s broader desire to undermine NATO and EU membership. By themselves, the cyberattacks or the information efforts might seem like minor nuisances, but the combination of these different tools is what makes them potent and gives them synergies.

As social media companies and governments get better at handling the crudest forms of information influence, Russian tactics have evolved. In Sweden, Sputnik launched a Swedish-language edition in 2015, but gave up after nine months because of low readership and influence.22 That did not stop the information offensive, however; Russia updated its methods and targeted existing Swedish media. One example is how the Nordic region’s largest newspaper, Aftonbladet, published a story portraying a Swedish researcher as a member of British intelligence. The story was built on a hack, seemingly by the GRU, of the British Institute for Statecraft; it was then reported by Sputnik, RIA Novosti, and RT, after which Aftonbladet picked it up.23 The story was clearly false and drew a censure from the Swedish Press Ethical Committee. Influence operations through local and national media carry an air of legitimacy that a Sputnik publication cannot have. Local and national media are, unfortunately, a fairly easy target.

Another shift is Russia’s increasing use of fake portals in the information domain. In 2020, the IRA set up a fake left-wing news publication, PeaceData, staffed it with fake editors, and hired unwitting but legitimate freelance journalists. The angle of the reporting was “anti-war” and “abuse of power,” and it focused on “appealing to left-wing voters and steering them away from the campaign of Democratic presidential candidate Joe Biden.”24 PeaceData contributors were asked to recruit more writers from among their contacts, thus increasing the perceived legitimacy of the operation. The scheme was exposed before it could build a significant following (only 14,000 followers on Twitter), but it is a good example of innovation and adaptation. This example also underlines how “the old Soviet technique of infiltrating authentic social groups is being updated for the 21st century, obscuring the difference between real debate and external manipulation.”25

Russia took a similar approach in France, where two online portals, OneWorld.Press and ObservateurContinental.fr, spread disinformation about the Covid-19 pandemic. The latter, for instance, alleged that NATO’s Defender-Europe 20 exercise was to blame for the outbreak of Covid-19 in Europe. Both portals had connections to InfoRos, which has ties to the GRU and, among other things, is physically co-located with the Russkiy Mir Foundation, a Russian government-funded instrument of soft power.26

The Russian operation in France mirrored a larger pattern of using the pandemic to spread disinformation about NATO. The pandemic created an urgent need for rapid information, which provided a fertile ground for disinformation campaigns. NATO detailed how it had detected coordinated disinformation campaigns against the presence of its troops in Latvia, Lithuania, and Poland. These campaigns included a fake letter, purportedly from NATO Secretary General Jens Stoltenberg to Lithuanian Defense Minister Raimundas Karoblis, stating that NATO was withdrawing its troops from Lithuania, and a fake interview that claimed Canadian troops had brought Covid-19 to Latvia.27 NATO said Russian state-controlled media — Sputnik and RT — were instrumental in spreading this disinformation. Sputnik alleged that the coronavirus was being developed in U.S./NATO labs.28 In each of the campaigns, NATO identified common techniques: forgeries, fake personas, falsehoods, amplification in fringe pro-Russian websites, and “language leap,” where fabricated content leaps from its original source to English-language media.28

The cyber domain is another area that holds great promise for a revisionist power like Russia seeking to offset a more prosperous and superior West. Russia is skilled at combining the use of different domains of influence. It is a sophisticated cyber actor that has the “full range of capabilities for undertaking actions in cyberspace …. It implements a very advanced offensive program.”29 These capabilities were on display in Russia’s interference in the 2016 U.S. presidential election, a reminder that the most significant impact of its operations came not from fake social media accounts but from the hack-and-leak operation against the Democratic National Committee (DNC), which began with a cyber intrusion.30 U.S. intelligence agencies concluded that the DNC hack sought “to undermine public faith in the US democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency. We further assess Putin and the Russian Government developed a clear preference for President-elect Trump.”31

Following the exposure of its meddling, including by the Mueller report, Russia improved its modus operandi with increased operational security and more stealthy operations.32 This included “a partly successful attempt to interfere, via hack-and-leak, in the French presidential elections of 2017 and almost certainly in the United Kingdom in 2019.”28

Coalition Warrior Interoperability eXercise (CWIX) 2020. Credit: NATO

3. Impact of Russian Influence Operations

The conduct of modern conflict is constantly evolving through organizational and technological innovation and through interaction among the participants. Russia’s leadership has faced a notably harder task since the 2016 U.S. presidential election, as the world has become more aware of its nonmilitary influence. The big technology companies (Facebook, Twitter, and Google) also seem to have woken up to the fact that they are key arenas for the information conflict and have started to take countermeasures. On the other hand, hybrid warfare still favors the attacker, as few costs are imposed on it. Moreover, the conditions for such warfare were particularly favorable during the Trump presidency and during the pandemic, when the demand for information surged.

It is critical to put Russian influence operations in perspective. It is incorrect to dismiss them as unsuccessful simply because both NATO and the EU are intact, or because many of the posts coming from the IRA have low viewership and low levels of interaction. In fact, Russia, starting from a limited power position, is attempting to impact the world’s most powerful political union, the EU, and the world’s most powerful military alliance, NATO, by using relatively cheap means.

Moreover, the aggregate impact of flooding the media with fake news is often bigger and more important than interactions with individual pieces of content. This can be seen in a study of the media landscape in Michigan in the lead-up to the 2016 U.S. presidential election. It found that sensational and conspiratorial material, as well as fake news, was shared far more on social media than well-researched news; that the proportion of well-researched news being shared reached its lowest point the day before the election; and that Trump-related hashtags far outnumbered Clinton-related ones.33

That being said, Russian disinformation efforts can hardly be as effective as Trump himself declaring that the 2020 election would be the most fraudulent ever.34 Nonetheless, it is lazy analysis to simply conclude that Trump was more harmful than foreign meddling; that is a false dichotomy. Rather, the most effective influence operations have always exploited existing divisions and local actors. For example, Trump described Montenegro as “very aggressive” and said that defending it would lead to World War III.35 The question is, where did those views originate? It is hard to believe that they came from U.S. intelligence briefs; that narrative was being pushed only by the Russian disinformation machinery. This illustrates that the impact of disinformation cannot be measured merely through Facebook interactions; it can also be seen in a president’s comments that draw either directly from Russian sources or, more likely, from sources that are susceptible to Russian disinformation.

Even if Russian influence operations in 2016 did not sway a single voter, they sharpened the polarization of U.S. politics and society. This was manifested in the country being tied up for years in debates over the extent of the Trump campaign’s collusion with Russia and in impeachment proceedings.

The threat from Russian influence operations remains real, even though they are, by their very nature, unlikely to have an immediate or obvious impact. The EU and NATO are dependent on political support from their member states. In these states, there is some opposition to both institutions that can be amplified and exploited by malign Russian actors.

4. Lessons for the EU and NATO

Even before Election Day in the United States in 2016, it was clear to the Obama administration that Russia was trying to meddle in the election. When then-U.S. President Barack Obama met Putin, he told him to “cut it out,” but the Russian leadership did not.36 In other words, U.S. deterrence failed: Obama was unsuccessful in conveying a credible “or else” to Putin. Similarly, French President Emmanuel Macron called Sputnik and RT “propaganda machines” to Putin’s face at Versailles in 2017, but that, too, did little to stop Russian disinformation operations.

There has been much discussion about the ambiguity of disinformation operations; much of it is exaggerated. Attribution is possible, albeit not immediate. The U.S. sanctions that followed Russian election interference and other operations targeted individual, even low-level, GRU operators, with their activities and duties clearly outlined.37 This goes to show that attribution is not a major challenge. In fact, U.S. intelligence agencies have a good sense of who is doing what; where they have failed is in deterrence.

The most immediate lesson for Western governments from the U.S. elections in 2016 was not to be quiet about Russian influence operations. The biggest benefit of exposing Russian operations is increasing public awareness of the threat and the determination to devote sufficient time and resources to countering it, which will in the long run change the cost-benefit calculus.38 Moreover, Bellingcat’s investigative journalism has served to expose Russian intelligence operations and has become a headache and source of embarrassment for the Russian leadership.

Nonetheless, “naming and shaming” should not be seen as sufficient for deterring Russian operations. After the U.S. elections in 2016, the poisoning of former Russian-British double agent Sergei Skripal in the United Kingdom in 2018, and the poisoning of Navalny in August 2020, it has become clear that the Russian leadership is not too worried about some of its high-profile operations becoming known to the public. On the contrary, the Skripal poisoning was intended to send a signal to other intelligence officers in Russia and also to the West. As “Putin and his inner circle appear to believe that they are in nothing less than a political war, [naming and shaming] will at best influence tactics, not strategy.”39

With the Russian leadership committed to the idea that it is in a political war against the EU and NATO, more is needed than simply exposing its malign behavior. So far, the Western approach has been to rely primarily on sanctions in lieu of stronger policy measures. Sanctions are an alternative to escalation; they satisfy the urge to “do something” rather than fix the underlying problem.40 Moreover, inflicting economic pain is effective only to the extent that economic development is a priority for the Russian leadership, and it is demonstrably subordinate to regime security and great power status.

The EU is responsible for the political response to the challenge from Russia, including sanctions, and has since 2014 taken a wide range of measures to increase preparedness against hybrid threats. These include creating sectoral strategies, establishing expert bodies (the EU Hybrid Fusion Cell and the European Centre of Excellence for Countering Hybrid Threats), creating information-sharing mechanisms, conducting exercises and simulations, partnering with NATO, and increasing investments in cyber defense.41 Most notably, the EU adopted an Action Plan against Disinformation42 and set up the EUvsDisinfo initiative in 2015. In July 2020, the EU also imposed its first-ever cyber sanctions (asset freezes and travel bans) on individual GRU officers and the responsible GRU center in response to cyberattacks.43

These are all important steps to improve the infrastructure, but the core problem for the EU and NATO remains political: it is about unity. Both the EU and NATO have viewed Russian hybrid warfare as more of a nuisance than a fundamental challenge. NATO is primarily responsible for the military instrument, but it also has a key role in maintaining political unity. The lack thereof was evident when Macron called for a rapprochement with Russia in 2019 while failing to grasp that Russian aggression is premised on the predictability that the West will always return to the negotiating table even though the fundamental problem has not been addressed.

Indeed, between Russia’s invasion of Ukraine and Macron’s call for better relations, Russia had not only interfered in the U.S. and French elections, it had also used chemical weapons on NATO soil to try to assassinate Skripal, an attempt that resulted in the death of a British citizen.44 Western actions have underlined that the West is unwilling to accept economic pain for geopolitical gain, as reflected in its failure to invest more in military and nonmilitary capabilities or to impose tougher sanctions.

Governments alone cannot solve the problem of Russian influence operations. Big technology companies provide an important arena for these operations. These firms have come a long way from their laissez-faire approach of 2016 and have beefed up their defenses. Facebook is now more aggressive about taking down coordinated inauthentic behavior, Twitter has banned all political advertising, and Google, Facebook, and Twitter have signed on to the EU’s Code of Practice on Disinformation, which sets out a wide range of commitments, including transparency in political advertising and the closure of fake accounts. However, as methods for exposing disinformation are disclosed, Russian strategists will seek to circumvent them.45 The task for the EU and NATO is Sisyphean.

Just days before the 2020 U.S. elections, the New York Post ran a story based on leaked (or fake) information targeting Biden. Twitter was quick to block the story, and Facebook posted warning labels next to it.46 Regardless of the wisdom of those actions, they show the big social media companies’ increased awareness that staying on the sidelines is not a strategy.

Russia is constantly adapting its hybrid warfare in response to its adversaries’ actions and technological change. As automated bots and hack-and-leak operations are exposed, Russian operations have changed to create more organic-looking means of influence that blend international and domestic issues.

The key lesson for Russian strategists so far has been that their operations carry low costs and have potentially very high rewards. As long as this calculus remains in place, these operations will continue. The political unity of the West is fragile and already under great domestic strain — a reality that Russia seeks to amplify and exploit. The fundamental challenge for the West is maintaining political unity to counter Russian operations and successfully deter the most significant ones, including election meddling or the use of chemical weapons on NATO territory.

Many other operations, however, such as run-of-the-mill disinformation, cannot reasonably be deterred given the Russian leadership’s conviction that it is in a political war with the West. Such operations will need to be countered with hardened defenses, public-private cooperation, and dedication.

  1. Kofman, Michael. 2020. “The Emperor’s League: Understanding Sino-Russian Defense Cooperation.” War on the Rocks, August 6, 2020, https://warontherocks.com/2020/08/the-emperors-league-understanding-sino-russian-defense-cooperation.
  2. Foa, Roberto Stefan, and Mounk, Yascha. 2017. “The Signs of Deconsolidation.” Journal of Democracy, January 2017, vol. 28, no. 1: 5-16.
  3. Shekhovtsov, Anton. 2017. “Foreign Politicians’ Visit to Crimea Is Russia’s Latest Disinformation Failure.” The Moscow Times, March 29, 2017, https://www.themoscowtimes.com/2017/03/29/foreign-politicians-visit-to-crimea-is-russias-latest-disinformation-failure-a57569.
  4. Limnell, Jarno. 2018. “Russian Cyber Activities in the EU.” In “Hacks, leaks and disruptions: Russian cyber strategies,” edited by Popescu, Nicu, and Secrieru, Stanislav, The European Union Institute for Security Studies, October 23, 2018, https://www.iss.europa.eu/content/hacks-leaks-and-disruptions-%E2%80%93-russian-cyber-strategies.
  5. “European Parliament resolution of 23 November 2016 on EU strategic communication to counteract propaganda against it by third parties.” European Parliament, November 23, 2016, https://www.europarl.europa.eu/doceo/document/TA-8-2016-0441_EN.html.
  6. Galeotti, Mark. 2017. “Controlling Chaos: How Russia manages its political war in Europe.” European Council on Foreign Relations, September 1, 2017, https://www.ecfr.eu/publications/summary/controlling_chaos_how_russia_manages_its_political_war_in_europe.
  7. Jonsson, Oscar, Campanella, Edoardo, and Owen, Taylor. 2020. “The New Digital Domain. How the Pandemic Reshaped Geopolitics, the Social Contract and Technological Sovereignty.” Center for the Governance of Change, https://www.ie.edu/cgc/research/new-social-contract-digital-age/.
  8. Smyth, Sara M. 2019. “The Facebook Conundrum: Is it Time to Usher in a New Era of Regulation for Big Tech?” International Journal of Cyber Criminology, vol. 13, no. 3: 578-595, https://www.cybercrimejournal.com/SmythVol13Issue2IJCC2019.pdf.
  9. Soldatov, Andrei, and Borogan, Irina. 2015. The Red Web: The Kremlin’s War on the Internet, 149-74. Washington, DC: PublicAffairs.
  10. “Russia: Internet Legislation Merits Greater Scrutiny Before Passage.” Human Rights Watch, July 11, 2012, https://www.hrw.org/news/2012/07/11/russia-internet-legislation-merits-greater-scrutiny-passage.
  11. Garmazhapova, Aleksandra. 2013. “Gde zhivyt trolli. I kto ix kormit (Where the trolls live. And who feeds them).” Novaya Gazeta, September 9, 2013, http://novayagazeta.spb.ru/articles/8093/.
  12. Mueller, Robert S. 2019. “Report On The Investigation Into Russian Interference In The 2016 Presidential Election.” U.S. Department of Justice, March 2019, https://www.justice.gov/storage/report.pdf.
  13. Francois, Camille, Nimmo, Ben, and Eib, C. Shawn. 2019. “Russian Accounts Posing as Americans on Instagram Targeted Both Sides of Polarizing Issues Ahead of the 2020 Election.” Graphika, October 21, 2019, https://graphika.com/reports/copypasta/.
  14. Menn, Joseph. 2020. “Russian-backed organizations amplifying QAnon conspiracy theories, researchers say.” Reuters, August 24, 2020, https://www.reuters.com/article/us-usa-election-qanon-russia-idUSKBN25K13T.
  15. Galeotti, Mark. 2019. Russian Political War: Moving Beyond the Hybrid. Abingdon: Routledge.
  16. Tomovic, Dusica, and Zivanovic, Maja. 2018. “Russia’s Fancy Bear Hacks its Way into Montenegro.” Balkan Insight, March 5, 2018, http://www.balkaninsight.com/en/article/russia-s-fancy-bear-hacks-its-way-into-montenegro-03-01-2018.
  17. Hacquebord, Feike. 2018. “Update on Pawn Storm: New Targets and Politically Motivated Campaigns.” Trend Micro, January 12, 2018, https://blog.trendmicro.com/trendlabs-security-intelligence/update-pawn-storm-new-targets-politically-motivated-campaigns.
  18. Garcevic, Vesko. 2017. “Congressional Testimony to the Senate Select Committee on Intelligence,” June 28, 2017, https://www.intelligence.senate.gov/sites/default/files/documents/sfr-vgarcevic-062817b.pdf.
  19. Bechev, Dimitar. 2018. “The 2016 Coup Attempt in Montenegro: Is Russia’s Balkans Footprint Expanding?” Foreign Policy Research Institute, April 2018, https://www.fpri.org/article/2018/04/the-2016-coup-attempt-in-montenegro-is-russias-balkans-footprint-expanding/.
  20. Jonsson, Oscar. 2018. “The next front: the Western Balkans.” In “Hacks, leaks and disruptions: Russian cyber strategies,” edited by Popescu, Nicu, and Secrieru, Stanislav, The European Union Institute for Security Studies, October 23, 2018, https://www.iss.europa.eu/content/hacks-leaks-and-disruptions-%E2%80%93-russian-cyber-strategies.
  21. “Moscow is regaining sway in the Balkans.” The Economist, February 25, 2017, https://www.economist.com/news/europe/21717390-aid-warplanes-and-propaganda-convince-serbs-russia-their-friend-moscow-regaining-sway.
  22. Kragh, Martin, and Åsberg, Sebastian. 2017. “Russia’s Strategy for Influence through Public Diplomacy and Active Measures: The Case of Sweden.” Journal of Strategic Studies, vol. 40, no. 6.
  23. Kragh, Martin. 2020. “Martin Kragh är ett demokratiskt problem (Martin Kragh is a democratic problem).” Statsvetenskaplig Tidskrift, vol. 122, no. 3: 419-447, https://journals.lub.lu.se/st/article/view/22141.
  24. Wanless, Alicia, and Walters, Laura. 2020. “How Journalists Become an Unwitting Cog in the Influence Machine.” Carnegie Endowment for International Peace, October 23, 2020, https://carnegieendowment.org/2020/10/13/how-journalists-become-unwitting-cog-in-influence-machine-pub-82923.
  25. Polyakova, Alina, and Fried, Daniel. 2019. “Europe is starting to tackle disinformation. The US is lagging.” Washington Post, June 17, 2019, https://www.washingtonpost.com/opinions/2019/06/17/europe-is-starting-tackle-disinformation-us-is-lagging/?utm_term=.ac9f08609112.
  26. “How two information portals hide their ties to the Russian News Agency Inforos.” OSINT Investigation, EU DisinfoLab, June 2020, https://www.disinfo.eu/wp-content/uploads/2020/06/20200615_How-two-information-portals-hide-their-ties-to-the-Russian-Press-Agency-Inforos.pdf.
  27. “NATO’s approach to countering disinformation: a focus on COVID-19.” The North Atlantic Treaty Organization, July 17, 2020, https://www.nato.int/cps/en/natohq/177273.htm#case.
  28. Ibid.
  29. Świątkowska, Joanna. 2020. “Offensive Actions in Cyberspace – A Factor in Shaping Geopolitical Order.” In Albrycht, I. (ed.) et al., Geopolitics of Emerging and Disruptive Technologies, Krakow: The Kosciuszko Institute, https://ik.org.pl/wp-content/uploads/geopolitics-of-emerging-and-disruptive-technologies-2020.pdf.
  30. Rid, Thomas. 2016. “How Russia Pulled Off the Biggest Election Hack in U.S. History.” Esquire, October 20, 2016, https://www.esquire.com/news-politics/a49791/russian-dnc-emails-hacked/.
  31. “Assessing Russian Activities and Intentions in Recent US Elections.” Office of the Director of National Intelligence, January 6, 2017, https://www.dni.gov/files/documents/ICA_2017_01.pdf.
  32. Rid, Thomas. 2020. “Insisting that the Hunter Biden laptop is fake is a trap. So is insisting that it’s real.” Washington Post, October 24, 2020, https://www.washingtonpost.com/outlook/2020/10/24/hunter-biden-laptop-disinformation/.
  33. Howard, Philip N., Bolsover, Gillian, Kollanyi, Bence, Bradshaw, Samantha, and Neudert, Lisa-Maria. 2017. “Junk News and Bots during the U.S. Election: What Were Michigan Voters Sharing Over Twitter?” The Project on Computational Propaganda, Oxford Internet Institute, March 26, 2017, https://comprop.oii.ox.ac.uk/research/posts/junk-news-and-bots-during-the-u-s-election-what-were-michigan-voters-sharing-over-twitter/.
  34. Spocchia, Gino. 2020. “Trump says 2020 will be ‘one of greatest, most fraudulent elections ever’.” The Independent, October 9, 2020, https://www.independent.co.uk/news/world/americas/us-election/trump-2020-election-fraud-biden-fox-hannity-mail-ballots-voter-id-b908650.html.
  35. Sampathkumar, Mythili. 2018. “Trump says defending ‘aggressive’ Montenegro as a NATO member ‘will lead to World War III’.” The Independent, July 18, 2018, https://www.independent.co.uk/news/world/americas/us-politics/donald-trump-nato-montenegro-world-war-mutual-defence-a8453446.html.
  36. Nelson, Louis. 2016. “Obama says he told Putin to ‘cut it out’ on Russia hacking.” Politico, December 16, 2016, https://www.politico.com/story/2016/12/obama-putin-232754.
  37. “Treasury Sanctions Russian Cyber Actors for Interference with the 2016 U.S. Elections and Malicious Cyber-Attacks.” U.S. Department of the Treasury, March 15, 2018, https://home.treasury.gov/news/press-releases/sm0312.
  38. Giles, Keir. 2017. “Countering Russian Information Operations in the Age of Social Media.” Council on Foreign Relations, November 21, 2017, https://www.cfr.org/report/countering-russian-information-operations-age-social-media.
  39. Galeotti, Mark. 2020. “The Navalny poisoning case through the hybrid warfare lens.” The European Centre of Excellence for Countering Hybrid Threats, October 2020, https://www.hybridcoe.fi/wp-content/uploads/2020/10/202010_Hybrid-CoE-Paper4_Navalny-case-through-a-hybrid-lens.pdf.
  40. Fishman, Edward. 2020. “Make Russia Sanctions Effective Again.” War on the Rocks, October 23, 2020, https://warontherocks.com/2020/10/make-russia-sanctions-effective-again/.
  41. Fiott, Daniel, and Parkes, Roderick. 2019. “Protecting Europe: EU’s response to hybrid threats.” European Union Institute for Security Studies, Chaillot Paper 151, April 2019, https://css.ethz.ch/content/dam/ethz/special-interest/gess/cis/center-for-securities-studies/resources/docs/EUISS_CP_151.pdf.
  42. “A Europe that Protects: The EU steps up action against disinformation.” European Commission, December 5, 2018, https://ec.europa.eu/commission/presscorner/detail/en/IP_18_6647.
  43. “EU imposes the first ever sanctions against cyber-attacks.” European Council, July 30, 2020, https://www.consilium.europa.eu/en/press/press-releases/2020/07/30/eu-imposes-the-first-ever-sanctions-against-cyber-attacks/.
  44. Dodd, Vikram, Morris, Steven, and Bannock, Caroline. 2018. “Novichok in Wiltshire death ‘highly likely’ from batch used on Skripals.” The Guardian, July 9, 2018, https://www.theguardian.com/uk-news/2018/jul/09/novichok-wiltshire-death-dawn-sturgess-highly-likely-same-batch-used-on-skripals.
  45. Polyakova, Alina. 2020. “The Kremlin’s Plot Against Democracy: How Russia Updated Its 2016 Playbook for 2020.” Foreign Affairs, September/October 2020, https://www.foreignaffairs.com/articles/russian-federation/2020-08-11/putin-kremlins-plot-against-democracy?utm_medium=social.
  46. Paul, Kari. 2020. “Facebook and Twitter restrict controversial New York Post story on Joe Biden.” The Guardian, October 14, 2020, https://www.theguardian.com/technology/2020/oct/14/facebook-twitter-new-york-post-hunter-biden.