UvA Annual Winter School

The Digital Methods Initiative (DMI), Amsterdam, is holding its annual Winter School on ‘The use and misuse of Open Source Intelligence (OSINT)’. It takes place in Amsterdam from 9 to 13 January 2023. The format is a (social media and web) data sprint, with tutorials as well as hands-on work for telling stories with data. There is also a programme of keynote speakers, including Forensic Architecture.

The Winter School is intended for advanced Master's students, PhD candidates and motivated scholars who would like to work on (and complete) a digital methods project in an intensive workshop setting.

Image: Working together (UvA / Digital Methods team)

The use and misuse of Open Source Intelligence (OSINT)

From geolocating burning tanks in Ukrainian fields and determining the authenticity of videos depicting possible human rights violations in Cameroon to reconstructing the events of January 6, 2021 in the Capitol building in Washington, D.C., activists, journalists, and the general public are increasingly turning to a (somewhat) new ally: Open Source Intelligence (OSINT). OSINT may be defined as the systematisation of information gathered with digital tools from open, often internet-based sources (as opposed to the classified sources of governmental intelligence); it is becoming a highly regarded strategy for building public narratives of truth.

A fairly new domain in journalism

Recently, major news outlets such as The Guardian, The New York Times, and the BBC have added OSINT units or ‘visual investigation’ teams to their journalistic operations. They join more established investigation units from international civil society, such as those of Amnesty International and Human Rights Watch, academic research institutions such as Berkeley's Human Rights Investigations Lab, and citizen intelligence agencies like Bellingcat.

OSINT - a valuable resource

Recent news articles have described the rise of open source intelligence as challenging existing authorities, especially state-controlled or other official information sources. In the process, OSINT practitioners have developed a particular set of reporting formats and verification tools that strengthen the epistemic authority of their practices. For example, crowd-sourced information from Twitter or Telegram is arranged next to video stills pulled from popular platforms or satellite images from public providers in order to assemble and strengthen an argument about what actually happened.

OSINT has developed a signature 'investigative aesthetic' and style, dramatised and popularised by organisations striving for justice, such as Forensic Architecture, or preserving the memory of wars, such as the Syrian Archive. 

OSINT also taps into the transparency strategies of the open source ethos and the encouragement of DIY hacker culture. Professional and lay OSINT practitioners curate lists of tools, draft how-to guides and share training materials on YouTube and elsewhere to empower others to undertake similar work. These have been applied to projects that fight climate change and trace illicit money flowing through circuitous corporate structures. They have also found an eager audience in those who dig for and post updates about conspiracy theories such as QAnon.

Whether characterised as misuse or weaponisation, OSINT practices and styles have also been adopted by misinformation operatives such as 'War on Fakes'.

Aims and approach of the UvA Winter School in January 2023 

The Winter School takes up OSINT as an investigative practice and aesthetic. It offers critical research projects on data journalism, fact-checking and other investigative work employing online data. It also combines OSINT tools with digital methods and other online research techniques for academic research that makes use of verification. Finally, it analyses OSINT's cultures of practice and how the field establishes its own epistemic authority while undermining that of others.

At the Winter School there are the usual tool training tutorials for single-platform and cross-platform social media analysis, but also continued attention to thinking through, and proposing, how to work critically with data from mainstream social media platforms as well as from so-called ‘alt tech’.

Apart from the keynotes and the training tutorials, participants will also work on empirical and conceptual projects. 

Image: Science in action (UvA / Digital Methods team)

Past projects

For orientation, and to give you some more ideas and guidance, projects from past Summer and Winter Schools have included:

  • Detecting Conspiratorial Hermeneutics via Words & Images;
  • Mapping the Fringe on Telegram;
  • Greenwashing, in_authenticity & protest;
  • Searching constructive/authentic posts in media comment sections;
  • Mapping deepfakes with digital methods and visual analytics;
  • “Go back to plebbit”: Mapping the platform antagonism between 4chan and Reddit;
  • Profiling Bolsobots Networks;
  • Infodemic cross-platform analysis;
  • Post-Trump Information Ecology;
  • Streams of Conspirational Folklore;
  • FilterTube: Investigating echo chambers, filter bubbles and polarization on YouTube.

The most recent event included projects such as:

  • Climate imaginaries;
  • Repurposing Google Ads;
  • What is a meme, technically speaking?;
  • Tracing the genealogy and change of TikTok audio memes;
  • Google Autocomplete: Racist results still?;
  • OK Boomer on Twitter.

Getting involved - secure your place

If you want to get involved in the January 2023 edition, do not waste any time: get in touch with the organisers Richard Rogers, Kamila Koronska and Guillen Torres, all based at the Media Studies Department of the University of Amsterdam. Although the original deadline was 1 December 2022, late entries are still possible (the very latest being 5 January 2023).

More information about the event and procedures can be found on the Digital Methods website, which also formed the basis of this article.


Authors: Richard Rogers & Kamila Koronska (UvA)

Editor: Jochen Spangenberg (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.