Some formal background on the special situation of the EBU in vera.ai

The vera.ai project was originally designed to run for three years until 15 September 2025. In order to complete all foreseen work and provide accurate end-of-project reporting, the consortium decided to request an extension of the project until the end of November 2025. This was kindly granted by the European Commission as the main contractual and co-financing entity.

vera.ai project partner EBU (European Broadcasting Union), based in Switzerland, cooperated in the project under slightly different contractual arrangements. It was funded by the Swiss authorities; hence its contractual relationship was with the respective Swiss administration rather than with the EU, as was the case for the rest of the partners. All this said: the EBU decided not to prolong its involvement and formally concluded its work on 15 September 2025, as originally foreseen.

In this summary article, first published on the EBU website (slightly adapted here), EBU colleague Lalya Gaye wraps up their involvement in the project and looks ahead.

And that's a wrap! Conclusion of the veraAI project on AI against disinformation, and next steps

After three years of research and development of AI-based fact-checking tools, both for and with media professionals, the Horizon Europe project veraAI is coming to an end, with the EBU's part ending on 14 September 2025.

The EBU contributed to the project through a number of activities, most of them in close collaboration with colleagues at Deutsche Welle, including:

  • participatory design workshops with members and other media professionals to determine user needs and design requirements, followed by evaluation activities together with users;
  • development of algorithms for audience and authorship estimation;
  • communication on the project's progress through EBU channels, including the tech-i app;
  • a series of webinars about the research and science behind the project (more information about this here and here);
  • methodology and communication training for consortium partners;
  • a series of training workshops for EBU members and a catalogue of video tutorials;
  • a report on the potential of C2PA and veraAI tools;
  • a handbook on the legal and ethical obligations of developers and deployers of AI-based fact-checking tools;
  • various presentations and demos at EBU events such as the AI Summit, DTS, PTS, and the EBU zone at IBC.

Get the veraAI tools for your organisation!

You might be wondering: how can my team and I access the verification tools produced by the project in order to support our fact-checking and journalistic work?

A number of veraAI tools are now available through the following platforms (besides a few remaining standalone versions, some of which will be integrated into these platforms in the period ahead):

  • Verification Plugin (VP): A free toolbox of verification tools for text, audio and video as a Chrome browser extension, aimed at journalists, fact-checkers and human rights defenders but also open to the public.
  • Truly Media platform (TM): A platform for collaborative work among verification professionals that allows them to aggregate information from the web, organise it into collections, and analyse it with the help of integrated third-party verification tools.

You can install the Verification Plugin on your browser using this link. For the Truly Media platform, please visit this page to fill in an interest form (bottom of the page) and you'll be contacted by the platform owners. You can also request a demo here.

The tools currently integrated into these platforms, and those available as standalone versions, are the following – with more integrations underway:

Synthetic media detectors

  • Synthetic Image Detector: A detector that provides the likelihood of an image being AI-generated (VP+TM).
  • Synthetic Audio Detector: A detector that provides the likelihood of parts of an audio file containing synthetic speech (TM).
  • Machine-Generated Text (MGT) Detector: A detector for potentially machine-generated text, available as a detailed breakdown of likelihoods for each portion of text (as part of the VP 'Assistant'), and as an overall likelihood (VP+TM).
  • Geolocalizer (VP): A tool that extracts visual clues from an image to estimate its geographical location.
  • Image Forensic AI Filters (VP): A tool that detects alterations in manipulated images.
  • Deepfake Video Detector (VP): A tool that returns the probability that a video contains AI-manipulated faces (face swapping and face reenactment).

Analytical support

  • Database of Known Fakes: A searchable database of already reviewed claims from trustworthy fact-checking and news organisations (standalone).
  • Credibility Signals: A text analyser indicating the use of persuasion techniques in a text, with levels of analytical confidence (VP as part of the 'Assistant', together with the MGT detector).
  • Coordinated Sharing Detection Service: A coordinated sharing network analysis tool to explore the spread and behaviour of disinformation campaigns on social media (standalone).
  • Keyframe Selection and Enhancement Service: A tool providing main relevant keyframes and enhanced faces and text from a video, for accelerated video analysis work (standalone and TM).

An up-to-date list is now available on the project website (here).

You can also learn how to use some of these tools with this video tutorial series!

For more information about the project from an EBU perspective, please contact Hans Hoffmann (EBU Technology & Innovation) and/or the veraAI project lead.

We wish to thank EBU Members and other project participants for their valuable contributions to the project, and we hope you will find some of the outcomes useful. They were made for you!

Author: Lalya Gaye, (EBU)

Editor: Jochen Spangenberg (DW)

The veraAI project is co-financed by the European Union, Horizon Europe programme, Grant Agreement No 101070093, with additional funding from Innovate UK grant No 10039055 and the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract No 22.00245.

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.