Combating influence operations is a major priority of governments, tech platforms, and civil society organizations around the world.1 Yet policymakers lack good information about the nature of the problem they seek to solve. Empirical research on how influence operations can affect people and societies—for example, by altering beliefs, changing voting behavior, or inspiring political violence—is limited and scattered. This makes it difficult for policymakers to prioritize influence threats, judge whether the problem is getting better or worse, and develop evidence-based solutions.
To assess what is known about the effects of influence operations and identify remaining research gaps, the Partnership for Countering Influence Operations sponsored a systematic literature review by Princeton University’s Empirical Studies of Conflict Project. Laura Courchesne, Jacob N. Shapiro, and Isra M. Thange examined eighty-two studies published between 1995 and 2020.2 The review included only those studies that (1) examined a specific population targeted by an influence operation, (2) compared measurable outcomes (behaviors or beliefs) of people exposed versus those who were not, and (3) met minimum standards of statistical credibility. The selected studies covered multiple forms of influence operations—mainly political disinformation, state propaganda, and health misinformation.
The literature demonstrates that certain kinds of influence operations can have measurable effects on people’s beliefs and behavior. But we still lack answers to fundamental questions, like whether and how social media–based operations differ from traditional forms of influence.
The strongest findings cluster in two areas: long-term exposure (lasting between four and thirty years) via traditional mass media, and short-term exposure (lasting days) via social media. Both types of exposure can measurably affect the beliefs or behavior of targeted populations.
Long-Term Mass Media Operations
Multiple studies showed that long-term influence operations using pre-internet media such as newspapers, radio, and television can succeed in causing voters to support a particular political party.3 For example, populations in Ukraine and Taiwan appeared more likely to vote for pro-Russian or China-endorsed candidates, respectively, after repeated exposure to foreign-supported television channels.4 A separate study of Americans exposed to negative images of Ukraine in Russian media found that this exposure reduced approval and favorable perceptions of Ukraine by 10 percent.5
Long-term mass media campaigns have also affected the behavior of the people exposed to them. In particular, several studies have shown that targeted information operations can lead to increased political violence in settings of conflict or civil unrest.6 For example, years of exposure to intense Nazi propaganda via mass media was found to increase German soldiers’ risk-taking in World War II combat.7
Short-Term Social Media Operations
Another set of studies examined the short-term effects of social media–based influence operations. Some of the studied operations caused shifts in political beliefs and behavior, increased xenophobic or discriminatory sentiments, and increased skepticism and uncertainty around vaccines and medical information.8 Such findings are consistent with an earlier body of research on political advertising, which generally has “persuasive but short-lived influence on citizens.”9
Social media operations can affect more than just beliefs. Short-term shifts in social media activity by extremely prominent actors—Germany’s far-right AfD party, for example—can also have modest, statistically detectable impacts on racially motivated violence in a given area.10
While the existing literature provides important insights, it also has significant gaps. On the whole, empirical research does not yet adequately answer many of the most pressing questions facing policymakers.
Medium of Influence
A vast majority of these studies (74 percent) examined the effects of influence operations carried out through traditional mass media. While traditional mass media remain important channels for influence operations, such studies do not directly address the role of the internet and social media, the primary focus of many policymakers. More research is needed to determine whether online influence operations have different effects than their off-line counterparts—for example, due to the increased role of social networks and algorithms.
Only twenty-one of the eighty-two studies examined influence operations on social media. Of these, fourteen focused solely on Facebook or Twitter. The two platforms have outsized importance in the Western world and a documented history of influence operations. But the same is true of YouTube and Instagram, which have received far less attention from researchers.
The predominant focus on two major Western platforms means there has been little study of platforms popular in other parts of the world. It also means that we cannot compare how the effects of influence operations may vary based on a platform’s size, function, architecture, or algorithms. Further, none of the studies included in the review examined cross-platform or multi-platform influence operations. Yet experts see all of these aspects as important focus areas for policymakers.11
Only one study in this review examined what has arguably been the greatest focus of policymakers since 2016: the threat of foreign governments using social media to sway voters in democratic elections. The study, examining efforts by Russia’s Internet Research Agency during the 2016 U.S. presidential election, found no effect on the beliefs of American Twitter users.12 The dearth of studies in this area, and their limited findings so far, suggest major disconnects between how policymakers and the research community perceive this threat.
Most studies in this review examined either long-term exposure (years) or short-term exposure (days) to influence operations; none evaluated impacts over weeks or months. In particular, short-term studies often lacked the follow-up observations needed to gauge how long an influence operation’s initial effect persists.
This research gap is significant because many policy interventions have focused on the weeks and months immediately surrounding a sensitive event. For example, U.S. Cyber Command reportedly disrupted Russia’s Internet Research Agency around the time of the 2018 midterm elections, and major social media platforms instituted many new policies and product design tweaks in the months before and after the 2020 U.S. election.13 We do not currently know whether influence operations are effective on the same time scale that these policies operate.
No study in this review directly tested for potential variations in effectiveness among different influence operation tactics. For example, there has been substantial work on how automated social media “bots” can affect those exposed to them, but it is unclear how so-called bot tactics compare to other tactics that might also be available to influence operators. Such research could reveal the relative cost-effectiveness of influence operation tactics and thus help shape efforts to deter or disrupt bad actors.
The eighty-two studies covered a wide range of countries, including Argentina, Austria, Brazil, Chile, China, Croatia, Germany, India, Indonesia, Italy, Mali, Mexico, Netherlands, Nigeria, Norway, Poland, Russia, Rwanda, Spain, Taiwan, Uganda, Ukraine, the United Kingdom, and the United States.
That said, studies involving U.S.-based populations constituted 27 percent of the total. Moreover, studies of foreign government–sponsored political influence operations have focused almost entirely on Russian campaigns, even though more than twenty other countries have a proven capacity to conduct such operations.14 To help inform policy at a global scale, research should examine a broader range of victim and perpetrator countries.15
The good news is that the research community is moving swiftly to address many of these gaps. As figure 1 demonstrates, there has been a dramatic increase since 2016 in the number of studies meeting our selection criteria—about 60 percent were published in the last two years alone. The COVID-19 “infodemic” has further catalyzed research; roughly one-fifth of the 2020 studies dealt with pandemic misinformation.
Still, the research gaps are serious and long-standing. There are inherent difficulties in establishing causality and in accounting for complex factors like cultural and political context when measuring the effects of influence operations. Filling key gaps will likely take many years and substantial investments by a range of institutions to empower researchers. Today, researchers often struggle to access important data held by third parties, such as platforms and governments. They also face inadequate funding, misaligned professional incentives, disciplinary silos, and nonstandard terms and methodologies.16
In the long run, new models of research collaboration will be needed to address these barriers and enable better measurement of influence operations effects.17 Multi-stakeholder collaboration would leverage the unique strengths of industry, academia, and government. Such collaborations can generate the evidence needed to support society’s response to challenges in the information environment.
1 Victoria Smith, “Mapping Worldwide Initiatives to Counter Influence Operations,” Carnegie Endowment for International Peace, December 14, 2020, https://carnegieendowment.org/2020/12/14/mapping-worldwide-initiatives-to-counter-influence-operations-pub-83435.
2 While the reviewed studies were all published between 1995 and 2020, some examined influence operations that took place much earlier.
3 Leonid Peisakhin and Arturas Rozenas, “Electoral Effects of Biased Media: Russian Television in Ukraine,” American Journal of Political Science 62, no. 3 (2018): 535–550; and Stefano DellaVigna and Ethan Kaplan, “The Fox News Effect: Media Bias and Voting,” Quarterly Journal of Economics 122, no. 3 (2007): 1187–1234.
4 Peisakhin and Rozenas, “Electoral Effects of Biased Media”; and Jay C. Kao, “How the Pro-Beijing Media Influences Voters: Evidence From a Field Experiment,” University of Texas at Austin, December 2020, https://www.jaykao.com/uploads/8/0/4/1/80414216/pro-beijing_media_experiment_kao.pdf. In the case of Taiwan, however, a backfire effect was reported if the targeted audience had a preexisting negative view of China or perceived the outlet as associated with the Chinese government.
5 Aleksandr Fisher, “Demonizing the Enemy: The Influence of Russian State-Sponsored Media on American Audiences,” Post-Soviet Affairs 36, no. 4 (2020): 281–296.
6 David Yanagizawa-Drott, “Propaganda and Conflict: Evidence From the Rwandan Genocide,” Quarterly Journal of Economics 129, no. 4 (2014): 1947–1994; and Maja Adena, Ruben Enikolopov, Maria Petrova, Veronica Santarosa, and Ekaterina Zhuravskaya, “Radio and the Rise of the Nazis in Prewar Germany,” Quarterly Journal of Economics 130, no. 4 (2015): 1885–1939.
7 Benjamin Barber IV and Charles Miller, “Propaganda and Combat Motivation: Radio Broadcasts and German Soldiers’ Performance in World War II,” World Politics 71, no. 3 (2019): 457–502.
8 Christopher A. Bail, Lisa P. Argyle, Taylor W. Brown, John P. Bumpus, Haohan Chen, M. B. Fallin Hunzaker, Jaemin Lee, Marcus Mann, Friedolin Merhout, and Alexander Volfovsky, “Exposure to Opposing Views on Social Media Can Increase Political Polarization,” Proceedings of the National Academy of Sciences 115, no. 37 (2018): 9216–9221; Ruben Enikolopov, Alexey Makarin, and Maria Petrova, “Social Media and Protest Participation: Evidence From Russia,” Econometrica 88, no. 4 (2020): 1479–1514; Holger Lutz Kern, “Foreign Media and Protest Diffusion in Authoritarian Regimes: The Case of the 1989 East German Revolution,” Comparative Political Studies 44, no. 9 (2011): 1179–1205; Matthew L. Williams, Pete Burnap, Amir Javed, Han Liu, and Sefa Ozalp, “Hate in the Machine: Anti-Black and Anti-Muslim Social Media Posts as Predictors of Offline Racially and Religiously Aggravated Crime,” British Journal of Criminology 60, no. 1 (2020): 93–117; Leonardo Bursztyn, Georgy Egorov, Ruben Enikolopov, and Maria Petrova, “Social Media and Xenophobia: Evidence From Russia,” no. w26567, National Bureau of Economic Research, December 2019, https://www.nber.org/papers/w26567; and Man-pui Sally Chan, Kathleen Hall Jamieson, and Dolores Albarracin, “Prospective Associations of Regional Social Media Messages With Attitudes and Actual Vaccination: A Big Data and Survey Study of the Influenza Vaccine in the United States,” Vaccine 38, no. 40 (2020): 6236–6247.
9 Matthew P. Motta and Erika Franklin Fowler, “The Content and Effect of Political Advertising in U.S. Campaigns,” in Oxford Research Encyclopedia of Politics, December 22, 2016, https://oxfordre.com/politics/view/10.1093/acrefore/9780190228637.001.0001/acrefore-9780190228637-e-217.
10 Karsten Müller and Carlo Schwarz, “Fanning the Flames of Hate: Social Media and Hate Crime,” Journal of the European Economic Association, October 30, 2020, https://academic.oup.com/jeea/advance-article-abstract/doi/10.1093/jeea/jvaa045/5917396?redirectedFrom=fulltext.
11 Victoria Smith and Natalie Thompson, “Survey on Countering Influence Operations Highlights Steep Challenges, Great Opportunities,” Carnegie Endowment for International Peace, December 7, 2020, https://carnegieendowment.org/2020/12/07/survey-on-countering-influence-operations-highlights-steep-challenges-great-opportunities-pub-83370; and Ben Nimmo, “The Breakout Scale: Measuring the Impact of Influence Operations,” Brookings Institution, September 2020, https://www.brookings.edu/research/the-breakout-scale-measuring-the-impact-of-influence-operations/.
12 Christopher A. Bail, Brian Guay, Emily Maloney, Aidan Combs, D. Sunshine Hillygus, et al., “Assessing the Russian Internet Research Agency’s Impact on the Political Attitudes and Behaviors of American Twitter Users in Late 2017,” Proceedings of the National Academy of Sciences 117, no. 1 (2020): 243–250.
13 Ellen Nakashima, “U.S. Cyber Command Operation Disrupted Internet Access of Russian Troll Factory on Day of 2018 Midterms,” Washington Post, February 27, 2019, https://www.washingtonpost.com/world/national-security/us-cyber-command-operation-disrupted-internet-access-of-russian-troll-factory-on-day-of-2018-midterms/2019/02/26/1827fc9e-36d6-11e9-af5b-b51b7ff322e9_story.html; and Kamya Yadav, “Platform Interventions: How Social Media Counters Influence Operations,” Carnegie Endowment for International Peace, January 25, 2021, https://carnegieendowment.org/2021/01/25/platform-interventions-how-social-media-counters-influence-operations-pub-83698.
14 Diego A. Martin, Jacob N. Shapiro, and Julia Ilhardt, “Trends in Online Foreign Influence Efforts,” Version 2.0, Princeton University, August 5, 2020, https://drive.google.com/file/d/18QIENHZslNIoKvOu72iEjG6RgWL1Dww_/view.
15 Vishnu Kannan, Carissa Goodwin, and Brawley Benson, “Community Perspectives on Diversity in the Countering Influence Operations Field,” Carnegie Endowment for International Peace, March 30, 2021, https://carnegieendowment.org/2021/03/30/community-perspectives-on-diversity-in-countering-influence-operations-field-pub-84198.
16 Victoria Smith and Natalie Thompson, “Survey on Countering Influence Operations Highlights Steep Challenges, Great Opportunities,” Carnegie Endowment for International Peace, December 7, 2020, https://carnegieendowment.org/2020/12/07/survey-on-countering-influence-operations-highlights-steep-challenges-great-opportunities-pub-83370.
17 Jacob N. Shapiro, Natalie Thompson, and Alicia Wanless, “Research Collaboration on Influence Operations Between Industry and Academia: A Way Forward,” Carnegie Endowment for International Peace, December 3, 2020, https://carnegieendowment.org/2020/12/03/research-collaboration-on-influence-operations-between-industry-and-academia-way-forward-pub-83332.