A cross-cutting team of University of Maryland researchers has been awarded $1.5 million from the U.S. Department of Defense (DOD) to study the spread of information campaigns by examining how emotion affects whether someone will re-share content online.
The UMD research team will collect real-world Facebook and YouTube data in multiple languages to examine the emotional content and viral reach of the posts, as well as the emotional reactions of those annotating the posts. The funding is one of 12 awards this year from DOD’s Minerva Research Initiative, which supports university-based, unclassified social science research aimed at improving basic understanding of the social, cultural, behavioral, and political forces that shape regions of the world of strategic importance to the U.S. Among the 12 new Minerva projects, UMD’s is one of only three led by women.
Based out of UMD’s Applied Research Laboratory for Intelligence and Security (ARLIS), the team will collect and annotate a sample of 1,000 public Facebook posts and 300 YouTube videos from Poland and Lithuania that were shared by social and political influencers in those countries. Both countries have often been targeted by Russian information warfare and are strategically relevant to NATO and Europe. Once complete, these annotations will be used to explore the relationship between emotion and the sharing of narratives.
“Whether using outright disinformation or manipulating public opinion with accurate stories, information warfare involves stories shared on social media platforms with specific embedded narratives designed to provoke, enrage, excite and change behavior,” said Susannah Paletz, the project’s principal investigator, a research professor in the UMD College of Information Studies and an ARLIS affiliate.
A social psychologist, Paletz has studied social media for five years for the Office of Naval Research; in a project last summer, she developed an innovative coding scheme for annotating emotions that inspired this new effort.
Reflecting the current, nuanced approach to the psychology of emotions, these annotations extend beyond the so-called six basic emotions – anger, disgust, fear, happiness, sadness and surprise. Paletz and her colleagues’ annotation scheme includes longer-lasting emotions critical for everyday life and online interaction, including humor, wonder, nostalgia, relief, love and hate. This scheme includes over 20 distinct emotions particularly relevant to the challenge at hand, and the team continues to refine the list.
“We also included something called kama muta, which is the heartwarmed emotion you feel when you see something infantile – in other words, the ‘awww!’ feeling you get when you see something cute. Given that this annotation scheme was for social media, we felt that emotion would be important to collect,” Paletz said.
For their Minerva project, Paletz and her team will work with native speakers at universities in Poland and Lithuania to complete the annotation. Small groups of annotators will first independently apply the annotation scheme to each collected social media post, rating every emotion from 0 to 100, both for the content of the post and for their own reaction to it. They will then meet to reach consensus on the content ratings of the posts where they disagreed.
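The two-stage workflow described above – independent 0-to-100 ratings per emotion, then consensus discussion only where annotators diverge – can be sketched in a few lines of Python. The emotion names, the disagreement threshold, and the `flag_for_consensus` helper are all illustrative assumptions for this sketch, not the team's actual scheme or tooling.

```python
# Hypothetical sketch of the annotation workflow: each annotator independently
# rates every emotion 0-100, and emotions with a large rating spread are
# flagged for the group's consensus discussion. Emotion list and threshold
# are illustrative, not the project's real values.

EMOTIONS = ["anger", "humor", "nostalgia", "kama_muta"]  # small subset for illustration
DISAGREEMENT_THRESHOLD = 20  # assumed cutoff: discuss if ratings spread exceeds this


def flag_for_consensus(ratings_by_annotator, threshold=DISAGREEMENT_THRESHOLD):
    """Given {annotator: {emotion: 0-100 rating}} for one post's content,
    return the emotions whose rating spread (max - min) exceeds the threshold."""
    flagged = []
    for emotion in EMOTIONS:
        scores = [ratings[emotion] for ratings in ratings_by_annotator.values()]
        if max(scores) - min(scores) > threshold:
            flagged.append(emotion)
    return flagged


# Three annotators independently rate one post's content (not their own reactions):
post_ratings = {
    "annotator_1": {"anger": 80, "humor": 10, "nostalgia": 0, "kama_muta": 5},
    "annotator_2": {"anger": 75, "humor": 45, "nostalgia": 0, "kama_muta": 0},
    "annotator_3": {"anger": 85, "humor": 15, "nostalgia": 5, "kama_muta": 5},
}

print(flag_for_consensus(post_ratings))  # humor spreads 35 points -> needs discussion
```

In this toy run the annotators agree closely on anger, nostalgia and kama muta, so only humor would go to the consensus meeting – mirroring how the team limits discussion to posts where ratings diverge.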
The results are expected to show how eliciting specific emotions (e.g., anger, contempt, humor) can help a narrative – truthful or not – go viral. The researchers will investigate whether messages evoking emotions that encourage action are more likely to be re-shared. The hypothesis is that if disinformation provokes the right emotions, people will be more likely to spread it regardless of its accuracy.
Drawn from several different disciplines, Paletz’s research team includes co-PI Anton Rytting, a UMD computational linguist at ARLIS; Cody Buntain, a UMD computer scientist (who will soon join the faculty at the New Jersey Institute of Technology); Devin Ellis, a UMD policy expert at the National Consortium for the Study of Terrorism and Responses to Terrorism (START); Ewa Golonka, a UMD Russian linguist and social scientist with ARLIS; and UMD’s Egle Murauskaite, a START expert on unconventional security threats.
“These kinds of difficult issues – adversarial disinformation campaigns, hacking, Russian interference – can’t really be solved without us working collaboratively across disciplines,” Paletz said. “Psychologists, computer scientists and information scientists have all been examining online communities for decades and social media for years – but within the confines of their own disciplines. It's been rare for them to work together, as we are, to really integrate methods and theories.”
The project will also address critical gaps in research about how information travels through populations and across national boundaries and languages. The researchers will develop methods for detecting and tracking how narratives and other memes spread within and across languages. Buntain also expressed an interest in the individuals behind disinformation campaigns.
“I think one thing people get wrong about these kinds of efforts is the characterization that agents responsible for disinformation are highly coordinated, skilled men in dark suits who are pulling strings behind the curtain,” Buntain said. “Instead, I think reality is something closer to a bunch of young people who are getting paid to post online and support a few high-level messages, but are otherwise given a lot of latitude in how they do it. Rather than being experts at propaganda or disinformation, I think many of these individuals are using marketing tools exactly as they were meant to be used, but with an unanticipated intent – are the tools Coca-Cola or Exxon-Mobil uses to market its product all that different? – and these people find what works to get engagement, followers and clicks.”
The research team has already begun identifying Polish and Lithuanian politicians and social influencers, and will collect social media data this summer and into the fall. Over the next academic year, research assistants in Poland and Lithuania, supervised by Golonka and Murauskaite respectively, will begin conducting the emotion annotation. The team will also examine different types of narratives and will conduct multilevel statistical analyses to understand which emotions and narratives predict social media sharing. The three-year project is expected to conclude in the summer of 2022.
The Applied Research Laboratory for Intelligence and Security (ARLIS), based at the University of Maryland, College Park, was established in 2018 under the auspices of the Office of the Under Secretary of Defense for Intelligence (OUSD(I)) and the US Air Force Office of Concepts, Development, and Management, and is intended as a long-term strategic asset for research and development in artificial intelligence, information engineering, and human systems. One of only 14 designated Department of Defense University Affiliated Research Centers (UARCs) in the nation, ARLIS conducts both classified and unclassified research, spanning basic research to applied system development, and works to serve the US Government as an independent and objective trusted agent.