Editor's note: Several vera.ai project partners and their teams, supported by students from other institutions and affiliations, participated in the 2023 Winter School and Data Sprint organised by the University of Amsterdam's Digital Methods Initiative, which took place from 9-13 January 2023 in Amsterdam. The occasion was also used to work on several projects and research undertakings that are part of the overall vera.ai activities, aims and goals. Here, a team coordinated by the University of Urbino Carlo Bo (UNIURB) and led by Fabio Giglietto reports on one of its projects, which investigated the dynamics of (mis)information spreading in online networks.

Uncovering Misinformation in Russia's War Against Ukraine: Insights from the "Digital Methods Initiative Winter School and Data Sprint 2023"

The Russian invasion of Ukraine and misinformation

Since Russia's invasion of Ukrainian territories in February 2022, many scholars and journalists have observed a rise in the spread of problematic information across Europe. Such information is intended to distort people's perception of the causes of the conflict and to create an environment hostile to supporting the Ukrainian side. At the "Digital Methods Winter School and Data Sprint 2023", held in Amsterdam on 9-13 January, we proposed a project investigating the misinformation circulating on social networks about the invasion.

The proposed project applies methodologies developed within WP4 of the vera.ai project, which aims to develop a technological platform that employs a content-agnostic approach to surface potentially problematic content, narratives and meta-narratives by monitoring the performance of content posted by a set of problematic actors. In particular, we created a scheme for mapping the leading networks of actors sharing problematic information on the topic.

Given the fuzziness of misinformation-related phenomena and the frequency with which malicious actors change their strategies, we adopted an approach that shifts the focus from the analysis of content to the dynamics of information spreading within online networks. We therefore build on the A-B-C approach (François 2019), which summarizes the interplay between manipulative Actors, deceptive Behaviour and harmful Content, mapping and analyzing networks of social media actors that engage in coordinated link sharing behaviour (CLSB) to spread problematic content. More precisely, CLSB refers to the coordinated sharing of the same news articles within a very short time frame by networks of social media entities (Giglietto et al. 2020).
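To make the notion of CLSB concrete, below is a minimal sketch in R of the underlying detection idea: given a table of shares, flag pairs of accounts that posted the same link within a short interval. This is an illustration only, not the CooRnet implementation; the toy data, the fixed 30-second threshold and all variable names are ours.

```r
# Minimal sketch of the CLSB idea (illustration only, not the CooRnet code):
# flag pairs of accounts that shared the same URL within a short interval.
library(dplyr)

# Toy data: each row is one share of a news URL by a Facebook entity
shares <- tibble::tribble(
  ~account,  ~url,              ~timestamp,
  "page_A",  "http://ex.com/1", as.POSIXct("2022-03-01 10:00:00", tz = "UTC"),
  "group_B", "http://ex.com/1", as.POSIXct("2022-03-01 10:00:20", tz = "UTC"),
  "page_C",  "http://ex.com/1", as.POSIXct("2022-03-01 18:00:00", tz = "UTC"),
  "page_A",  "http://ex.com/2", as.POSIXct("2022-03-02 09:00:05", tz = "UTC"),
  "group_B", "http://ex.com/2", as.POSIXct("2022-03-02 09:00:15", tz = "UTC")
)

coordination_interval <- 30  # seconds; fixed here for illustration

# Self-join shares of the same URL and keep pairs of distinct accounts
# whose share times fall within the coordination interval of each other
coordinated_pairs <- shares |>
  inner_join(shares, by = "url", relationship = "many-to-many") |>
  filter(account.x < account.y,
         abs(as.numeric(difftime(timestamp.x, timestamp.y, units = "secs"))) <=
           coordination_interval) |>
  count(account.x, account.y, name = "n_coordinated_shares")

coordinated_pairs
#> page_A and group_B co-shared both URLs within seconds: a candidate CLSB pair
```

In practice, CooRnet does not fix the interval in advance: it estimates the coordination threshold from the empirical distribution of share delays and then builds a network in which accounts that repeatedly co-share within that threshold are connected.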

What we did during the DMI Winter School 2023

The UNIURB team was thrilled to facilitate a week-long research project at the DMI Winter School 2023, whose overall theme was the use and misuse of Open Source Intelligence (OSINT).

Our project brought together a group of 15 participants, the maximum allowed for the project, to collaborate and conduct research in a selected field. In addition to the research project, the team also delivered two informative tutorials on detecting CLSB with CooRnet. These tutorials were attended by 30 participants and helped set the stage for a productive week of learning and discovery. To ensure that participants hit the ground running, the team also prepared a dataset for them to explore and utilize throughout the project.

In a previous work, Global CLSB maps 2022, we identified 818 coordinated accounts (115 Facebook Pages and 703 public groups, organized in 95 networks) that performed coordinated link sharing behaviour on Facebook by rapidly sharing at least four different news stories rated as problematic by Facebook's third-party fact-checkers between January 2017 and December 2021. This list was a useful starting point for tracking problematic information related to the Russian invasion of Ukraine, as the accounts on it had previously disseminated such content on Facebook.

Through CrowdTangle (CT), we created lists composed of these accounts (one list of 115 pages and another of 682 groups) and retrieved the posts they published about the war in Ukraine. To retrieve posts we used specific keywords (Ukraine, Russia, Putin, Zelensky, Kiev, Zaporizhzhia, Donetsk, Mariupol, Kharkiv, Kherson, Luhans'k, Luhansk, Saporischschja, Donezk, Charkiw) within the time frame from 20/02/2022 to 03/12/2022. Through CooRnet, an R package that detects coordinated link sharing behaviour (CLSB) on Facebook and Instagram, we extracted all links shared within these posts and then tracked the networks of accounts that shared these links in a coordinated manner within a short period of time.
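For readers who want to reproduce this step, the fragment below sketches the CooRnet workflow we relied on, assuming a data frame urls holding the links extracted from the CrowdTangle posts (one column of URLs, one of first-share dates) and a valid CrowdTangle API token. Argument names follow the CooRnet documentation as we recall it and may differ across package versions.

```r
# Sketch of the CooRnet pipeline (argument names and defaults may differ
# across CooRnet versions; `urls` is a data frame of the extracted links)
library(CooRnet)

# CooRnet reads the CrowdTangle API token from this environment variable
Sys.setenv(CROWDTANGLE_API_KEY = "your-crowdtangle-api-key")

# 1. Retrieve all public shares of each link via the CrowdTangle API
ct_shares.df <- get_ctshares(urls,
                             url_column  = "url",
                             date_column = "date",
                             platforms   = "facebook",
                             nmax        = 100)

# 2. Detect CLSB: CooRnet estimates a coordination interval from the data,
#    flags shares of the same URL falling within it, and builds a graph of
#    coordinated entities
output <- get_coord_shares(ct_shares.df = ct_shares.df,
                           percentile_edge_weight = 0.90,
                           clean_urls = TRUE)

# 3. Write the outputs into the session: the igraph of highly connected
#    entities, the list of coordinated entities, and the marked shares
get_outputs(output)
```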

Accounts circulating problematic content on the Russian invasion of Ukraine: between right-wing, left-wing and religious contexts

Through the outlined process, a total of 1,509 entities that performed CLSB were identified, organized into 117 components and 122 clusters. To begin the exploration, each participant individually studied the map provided and chose one cluster to focus on for the duration of the week. Based on these individual preferences, three groups were formed, each devoted to studying one specific cluster in depth.
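As a rough guide to how components and clusters can be derived from such a map, the snippet below assumes highly_connected_g, the igraph object that CooRnet's get_outputs() creates. CooRnet assigns component and cluster labels itself; the Louvain community detection used here is only a comparable stand-in, so the counts it yields need not match ours exactly.

```r
library(igraph)

# Connected components: disconnected sub-networks of coordinated entities
comp <- components(highly_connected_g)
comp$no        # number of components (117 in our map)

# Clusters: denser communities within components; Louvain as a stand-in
# for CooRnet's own cluster labels
louvain <- cluster_louvain(as.undirected(highly_connected_g, mode = "collapse"))
length(louvain)                          # number of detected communities
sort(sizes(louvain), decreasing = TRUE)  # cluster sizes, to pick one to study
```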

The in-depth analysis of these clusters revealed a significant amount of problematic information circulating on various topics, including posts supporting pro-Russian positions. This finding highlighted the importance of monitoring and understanding the spread of information on social media and the potential impact it can have on individuals and society as a whole.

We observed how, as the war in Ukraine became relevant and newsworthy, known coordinated networks jumped on the bandwagon to exploit the attention devoted to the topic. Clusters formerly devoted to spreading problematic information about Covid thus moved over and started dealing with the ongoing war. Beyond the war itself, the data points to the role these groups play as a bridge between the populist far left and far right.

This is evident in a French-speaking cluster (cluster n.5 in our analysis), where groups supporting the far-right politician Marine Le Pen share common content with those supporting the far-left politician Jean-Luc Mélenchon. A similar pattern emerges within a component (component n.10) composed of Spanish-speaking Latin American clusters of 'leftist' groups (Peru, Venezuela, Argentina, Mexico) and a US-based conservative pro-Trump cluster. The commonly shared content supports pro-Russian and pro-Putin positions.

Additionally, the study points to a "religious approach" to the war that often conceals a pro-Russian political stance. Many coordinated accounts publish content asking for prayers for Ukraine and criticizing the war on religious grounds. Prayers for a ceasefire, along with the pacifist rhetoric conveyed by this side, end up favouring Russia, which would benefit enormously from Western disengagement from the war in Ukraine. This rhetoric emerges prominently in a Brazilian cluster (n.9) and in the aforementioned American component, where such entities and their religious pacifist narratives bridge different political groups and even countries, for example Brazilian pro-Bolsonaro groups and extremely conservative US ones. In the case of the United States, a religious group is even the node that connects the US cluster with the Mexican one.

Project posters (image: UniURB team)

What we learned

As our recent analysis shows, the ongoing conflict between Russia and Ukraine is a highly sensitive topic that is prone to the spread of misinformation on social media. By monitoring accounts that have previously been identified as sharing problematic information, our team was able to observe a significant increase in activity related to the war.

Over the course of the "Digital Methods Winter School and Data Sprint 2023", we validated the effectiveness of our methodology for detecting and analysing misinformation. The insights gained from this project can serve as a valuable starting point for future research, both for those interested in studying specific clusters of activity and for those looking to examine the discussion dynamics surrounding other topics.

Resources and supporting material

For more information, see the project wiki page on the DMI website.

The process and outcome of the work are also summarized in two posters (1, 2) and a narrated video with a voice-over script.

The tutorial's documentation includes details of the methodology and reproducible code, stored on GitHub.

 

Author: Roberto Mincigrucci (UniURB)

Editor: Jochen Spangenberg (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.