vera.ai at the EBU's Data Technology Seminar 2023

The Data Technology Seminar is an annual event at which participants can network with like-minded professionals, learn about the latest trends and techniques in data-related projects, and collaborate with peers in the industry.

The 2023 edition took place in Geneva from 21 to 23 March, offering three days of fruitful talks and discussions.

vera.ai was there, represented by its coordinator CERTH!

As the seminar was a hybrid event, CERTH was represented remotely by Olga Papadopoulou, a member of the Media Analysis, Verification and Retrieval Group (MeVer) and the overall project manager and coordinator of vera.ai.

Screenshot taken during the event via Zoom. Credit: Olga Papadopoulou

Olga presented "The MeVer toolset for tackling media disinformation", starting with a brief introduction before turning to visual disinformation. Visual disinformation poses increased risks because it

  • can be more persuasive than text,
  • attracts more attention,
  • is more tempting to share,
  • can easily cross borders.

Olga first presented and then demonstrated, with disinformation examples, the functionalities of the different tools.
 

The Image Verification Assistant 

The Image Verification Assistant highlights areas in an image that have been digitally manipulated. The tool supports the following functionalities:

  • it integrates 13 image forensics algorithms,
  • its Fusion Algorithm aggregates the individual filter outputs, and the tool allows metadata inspection (EXIF, IPTC, Photoshop and more),
  • it estimates the software/hardware origin of JPEG images,
  • it supports quick reverse image searches on Google.
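
The idea of fusing several forensics outputs into one tampering map can be sketched as a weighted average of per-algorithm heatmaps. This is only an illustrative toy (the function name and the averaging scheme are assumptions; the tool's actual Fusion Algorithm is more sophisticated):

```python
def fuse_heatmaps(heatmaps, weights=None):
    """Fuse per-algorithm tampering heatmaps (2-D grids of scores
    in [0, 1]) into a single map by weighted averaging.
    Illustrative only: not the tool's actual Fusion Algorithm."""
    if weights is None:
        weights = [1.0] * len(heatmaps)
    total = sum(weights)
    rows, cols = len(heatmaps[0]), len(heatmaps[0][0])
    return [
        [
            sum(w * hm[r][c] for w, hm in zip(weights, heatmaps)) / total
            for c in range(cols)
        ]
        for r in range(rows)
    ]

# Two toy 2x2 heatmaps from different forensics filters:
# both agree that the top-left region looks manipulated.
a = [[0.9, 0.1], [0.1, 0.1]]
b = [[0.7, 0.3], [0.1, 0.1]]
fused = fuse_heatmaps([a, b])
# fused[0][0] is (0.9 + 0.7) / 2 = 0.8, well above a 0.5 threshold,
# so that area would be highlighted as suspicious.
```

Averaging makes the fused map robust to any single filter misfiring, which is one common motivation for combining multiple forensics algorithms.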

Example: a flag was digitally added to the image depicted below, which was shared during the Catalan referendum to denounce the actions of the Spanish authorities. The Fusion Algorithm detects the tampered area and highlights it.

Example of an analysis. Credit: Olga Papadopoulou / MeVer team

DeepFake Detection Tool

The DeepFake Detection tool, in turn, assesses the probability of deepfake face manipulations in images and videos. In the case of video, the tool segments the video into shots, detects the faces depicted in each shot and estimates the probability of deepfake face manipulation for each face.
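
The per-face scores from each shot then need to be combined into one video-level verdict. A minimal sketch of that aggregation step, under the assumption (not stated in the source) that the maximum score across all faces is used, since a single manipulated face is enough to flag the video:

```python
def video_deepfake_score(shots):
    """Aggregate per-face deepfake probabilities into a video-level
    score. `shots` is a list of shots, each a list of probabilities
    (one per detected face). Taking the maximum is an assumed,
    illustrative aggregation rule, not the tool's documented one."""
    face_scores = [p for shot in shots for p in shot]
    return max(face_scores) if face_scores else 0.0

# Toy example: three shots; one face in shot 2 scores very high.
shots = [
    [0.05, 0.10],   # shot 1: two faces, both look authentic
    [0.94, 0.08],   # shot 2: first face likely manipulated
    [0.12],         # shot 3: one face
]
score = video_deepfake_score(shots)  # 0.94 for the whole video
```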

Example: A video with Donald and Melania Trump in which the face of Putin appears instead of Melania's. The tool flags the video as a deepfake with a likelihood of 94%.

Deepfake detection in action. Credit: Olga Papadopoulou / MeVer team

Location Estimation Tool

The Location Estimation tool targets out-of-context images that are published as breaking news but were actually taken at different locations or during earlier events. The tool infers the depicted location using solely visual cues.
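
One common way to geolocate from visual cues alone is to embed the query image and compare it against reference embeddings of known places. The following toy retrieval sketch illustrates the idea only; the embeddings, place names and nearest-neighbour rule here are assumptions, not the tool's actual model:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def estimate_location(image_vec, references):
    """Return the reference place whose embedding is most similar
    to the query image embedding (toy nearest-neighbour rule)."""
    return max(references, key=lambda name: cosine(image_vec, references[name]))

# Toy 3-D "embeddings" standing in for real visual features.
references = {
    "Rio de Janeiro": [0.9, 0.1, 0.3],
    "Manila": [0.1, 0.8, 0.2],
}
query = [0.85, 0.15, 0.25]   # embedding of the image under analysis
where = estimate_location(query, references)  # "Rio de Janeiro"
```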

Example: A Facebook post shared the image below and claimed that a tsunami had hit the southern Philippines. Using the visual Location Estimation tool, the depicted location was identified as Rio de Janeiro.

The Location Estimation tool. Credit: Olga Papadopoulou / MeVer team

Near Duplicate Detection and the MeVer Network Analysis and Visualization Tool

The Near Duplicate Detection and the MeVer Network Analysis and Visualization tools were also demonstrated for finding similar content and analysing social media conversations respectively.
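
Near-duplicate search is often built on compact perceptual fingerprints compared with a distance threshold, so that re-encoded or slightly edited copies of an image still match. A toy difference-hash sketch of that idea (illustrative only; the MeVer tool's actual features and thresholds are not shown here):

```python
def dhash(pixels):
    """Difference hash of a 2-D grayscale grid: emit 1 where a pixel
    is brighter than its right neighbour, else 0. Robust to uniform
    brightness changes, since only relative differences are kept."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

def near_duplicates(h1, h2, max_dist=2):
    """Treat two images as near-duplicates if their fingerprints
    differ in at most `max_dist` bits (threshold is illustrative)."""
    return hamming(h1, h2) <= max_dist

# A tiny "image" and a uniformly brightened copy of it:
img = [[10, 5, 7], [3, 9, 1]]
brighter = [[11, 6, 8], [4, 10, 2]]
match = near_duplicates(dhash(img), dhash(brighter))  # True
```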

For more information about the entire event, go to the EBU webpage; Olga's presentation can also be found here. The work of the MeVer team, including more insights and links, can be found on the group's website.
 

Authors: Olga Papadopoulou & Symeon Papadopoulos (CERTH)

Editors: Anna Schild & Jochen Spangenberg (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.