In recent years, the fast-paced progress of AI technologies and their growing impact on society have led to new laws and regulations governing the development and deployment of AI systems, at both national and European levels. In parallel, ethical guidelines have proliferated in the media industry, covering both editorial and non-editorial issues related to AI.
Against this backdrop, it is essential for developers and deployers of AI-based tools for fact-checking and journalism to familiarise themselves with these new legal and ethical rules, adhere to them, and implement the obligations that come with them.
The EBU's 'Handbook on the legal and ethical obligations to developers and deployers of AI-based fact-checking tools' provides an accessible guide to exactly these rules and obligations, especially under the EU AI Act, with a particular focus on the development of AI-based tools for journalism.
It outlines key obligations regarding the development and use of AI tools, focusing mainly on high-risk and general-purpose AI systems. Topics include the legal obligations under the EU AI Act, data protection under the General Data Protection Regulation (GDPR), and ethical considerations such as accuracy, human oversight, transparency, and privacy.
Designed for AI developers and deployers, it is intended as a helpful first reference for building and deploying AI fact-checking tools responsibly.
Below, you can read or download the publication (also accessible via link here).
Author: Lalya Gaye
Editor: Jochen Spangenberg