vera.ai at #Disinfo2025 - a recap

On 15 and 16 October 2025, vera.ai partner EU DisinfoLab hosted its annual conference, #Disinfo2025, at Cankarjev Dom in Ljubljana, Slovenia, bringing together more than 600 participants from across Europe and beyond.

This year’s edition tackled the evolving disinformation landscape through four thematic tracks and an extensive programme of keynotes, panels, workshops, and side events. Discussions spanned a wide range of issues, including election integrity and foreign interference, AI-driven attacks (and AI tools to counter them), climate disinformation, healthcare manipulation, attacks on public services, platform accountability and enforcement of the Digital Services Act (DSA), as well as tools and frameworks for detecting, investigating, and attributing disinformation.

The conference also explored civil society protection and resilience-building, and examined geopolitical influence operations in regions such as the Middle East, Indo-Pacific, and Eastern Europe, highlighting the complexity of today’s information threats and the collective commitment to countering them.

Ljubljana in October 2025 (photo: Ivan Srba, KInIT)

vera.ai booth

Not only was overall attendance high; so was interest in vera.ai and its tools. Numerous participants from diverse backgrounds and disciplines approached the vera.ai booth to learn more about what we do. Journalists, fact-checkers, researchers and academics, media literacy experts, policy makers, and disinformation analysts were among those interested in the project results.

The vera.ai team engaged in lively discussions and showcased several project tools, including the Verification Plugin, the Truly Media platform, the CoorTweet service, and the synthetic content detectors. Discussions focused on the opportunities and potential limitations of the vera.ai technologies, their integration with other platforms and workflows, and their availability – as many participants were eager to know how they could access and apply the tools in their own work.

Participants and project partners exchange ideas at the vera.ai booth (photo: EUDL)

vera.ai demo session

On the second day of the conference, during a session showcasing AI tools for the community, Valentin Porcellini (AFP) presented three key tools from the Verification Plugin, developed or enhanced within the vera.ai project: the Keyframes tool, the Synthetic Image Detection tool, and the upcoming Social Network Analysis tool.

The Keyframes and Synthetic Image Detection tools were demonstrated using examples from AFP fact-check articles, and Valentin also showcased them at the vera.ai booth for interested conference participants. He explained the Keyframes tool and its general workflow, which includes reverse image searches on search engines, and introduced a new use case for identifying SynthID videos and images using Google’s About This Image feature.

Valentin Porcellini (AFP) showcasing vera.ai tools (photo: Danae Tsabouraki, ATC)

The limitations of the Synthetic Image Detection tool were also discussed, emphasizing the importance of using an image as close as possible to the original output of the GenAI model: resampling, resizing, cropping, and other modifications can degrade detection accuracy.

Finally, Valentin introduced the new Social Network Analysis tool, which will be added to the Verification Plugin in a future update. It will enable the identification of Coordinated Inauthentic Behavior (CIB) using CoorTweet, developed within vera.ai by the University of Urbino, and D3LTA, a tool created by Viginum, the French agency that counters foreign information manipulation and interference (FIMI).

To sum up: a very worthwhile event for the vera.ai partners and the project. Many new contacts were made and existing ones refreshed. #Disinfo2025 provided an excellent opportunity to showcase project work, the approaches taken, outcomes, and remaining challenges.

 

Authors: Inès Gentil (EUDL), Danae Tsabouraki (ATC), Valentin Porcellini (AFP)

Editor: Anna Schild (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.