
Report September 2025
Executive summary
We are pleased to share our sixth report under the 2022 EU Code of Conduct on Disinformation, which also draws from our work with the Code’s Taskforce. In accordance with the subscription form submitted by Meta Platforms Ireland Limited (Meta) in January 2025, this report is being submitted by Meta in respect of the Facebook, Messenger, and Instagram services and on behalf of WhatsApp Ireland Limited in respect of the WhatsApp messaging service.
The aim of this report is to provide an update on how Meta approached misinformation and disinformation in the European Union between January and June 2025. Where relevant, we have also included pertinent updates that occurred after the reporting period. Highlights include:
- Elections: The National Elections chapter provides an overview of our work on elections within the EU, detailing our core policies, processes, and implementation strategies. It outlines our comprehensive approach to elections, which we continued to apply to European elections held in the first half of 2025. The election responses covered in this report include the parliamentary elections in Germany, the presidential and presidential runoff elections in Romania, the parliamentary elections in Portugal, and the presidential elections in Poland.
- Expanding GenAI Transparency for Meta’s Ads Products: We began gradually rolling out “AI Info” labels on ad creative videos using a risk-based framework. When a video is created or significantly edited with our generative AI creative features in our advertiser marketing tools, a label will appear in the three-dot menu or next to the “Sponsored” label. We plan to share more information on our approach to labeling ad images made or edited with non-Meta generative AI tools. We will continue to evolve our approach to labeling AI-generated content in partnership with experts, advertisers, policy stakeholders and industry partners as people’s expectations and the technology change.
- Media literacy: Meta published its first Media Literacy Annual Plan on 21 July 2025, which set out our current approach to media literacy and the products and features we make available to users of Facebook and Instagram. It also provided details on specific media literacy initiatives run by Meta, including our work on digital citizenship, our media literacy lessons in Get Digital, We Think Digital and Soy Digital, and our election literacy programs.
- Coordinated Inauthentic Behaviour trends: We are sharing insights into a covert influence operation that we disrupted in Romania at the beginning of 2025. We detected and removed this campaign before it was able to build authentic audiences on our apps.
Here are a few of the figures which can be found throughout the report:
- From 01/01/2025 to 30/06/2025, we removed over 5 million ads from Facebook and Instagram in EU Member States, of which over 83,000 were removed for violating our misinformation policy.
- From 01/01/2025 to 30/06/2025, we labelled over 1.2 million ads on both Facebook and Instagram with “paid for by” disclaimers in the EU.
- We removed 1 network for violating our Coordinated Inauthentic Behaviour (CIB) policy which targeted, whether actually or potentially, one or more European countries. We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q1 2025, we took action against 1 billion fake accounts and in Q2 2025, we took action against 687 million fake accounts on Facebook globally. We estimate that fake accounts represented approximately 3% of our worldwide monthly active users (MAU) on Facebook during Q1 2025 and 4% during Q2 2025.
This report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EU during the reporting period of H1 2025. In line with Meta’s public announcements of 7 January 2025, we will continue to evaluate the applicability of these practices to Meta products, and we will regularly review whether adjustments are appropriate in response to changes in our practices, such as the deployment of Community Notes.