Annual moderation report
Euronews
Intermediary service providers must make available to the public, at least once a year, in a machine-readable format and in an easily accessible manner, clear and easily understandable reports on any content moderation activities they engaged in during the relevant period.
The goal is to ensure transparency in our content moderation practices, protect freedom of expression, and combat illegal content.
Covered period: [01/01/2025 – 12/31/2025]
Report publication date: [03/05/2026]
1. General Usage Data
- Average number of monthly active users on the Euronews website in the EU: 12.5 million (In accordance with Art. 24(2) of the DSA)
2. Reports received
- Total number of reports received: [77,000]
- Source of reports:
  - Users (via form / report button): [42,000]
  - Artificial Intelligence: [35,000]
3. Reasons Cited (Typology of Moderated Content)
Note: The distribution of reasons is only available from 12/18/2025; the distribution below is presented as percentages, based on 2,600 comments.
- Malicious or illegal speech: [29%]
- Violence: [23%]
- Cyberviolence: [16%]
- Negative impact on public debate or elections: [14%]
- Risk to public safety: [4%]
- Hate speech: [3%]
- Others (specify): [10%]
4. Measures Taken Following Reports
- Number of contents removed or made inaccessible: [75,000]
- Number of contents maintained after review: [2,000]
- Average time to process reports: [30 days]
- Moderation methods applied:
  - Removal of the comment: [33,000]
  - Restriction (masking, visibility limitation): [42,000]
  - Warning to the author: [0]
  - Account suspension: [0]
5. Proactive Measures
- Use of automated tools: [Yes]
- Description: All comments are first screened by automated tools. The automated tool (OpenAI) assesses each comment's toxicity level and the likelihood that it is spam; whether the comment is published depends on the result of this evaluation. Comments that include images, links, and/or a phone number are also hidden by default.
Additionally, a prohibited-word list is maintained for each language; any comment containing one of these words is automatically placed on hold for evaluation by a human moderator.
- Proportion of removals carried out automatically: [44%]
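The automated rules described above can be sketched as follows. This is a minimal illustration only: the function names, thresholds, prohibited-word entries, and regular expressions are assumptions, and the real toxicity/spam scores come from the OpenAI-based evaluation, which is not shown here.

```python
import re
from enum import Enum

class Decision(Enum):
    PUBLISH = "publish"
    HIDE = "hide"              # hidden by default (images, links, phone numbers) or flagged by scores
    HOLD = "hold_for_review"   # queued for a human moderator

# Hypothetical thresholds; the actual values are not disclosed in this report.
TOXICITY_THRESHOLD = 0.7
SPAM_THRESHOLD = 0.8

# Hypothetical per-language prohibited-word lists (the report states one exists per language).
PROHIBITED_WORDS = {
    "en": {"badword1", "badword2"},
}

LINK_RE = re.compile(r"https?://|www\.")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def moderate(text: str, lang: str, toxicity: float, spam_score: float,
             has_image: bool = False) -> Decision:
    """Apply the moderation rules described in section 5.

    `toxicity` and `spam_score` stand in for the scores returned by the
    automated evaluation; how they are computed is outside this sketch.
    """
    # Comments containing images, links, or phone numbers are hidden by default.
    if has_image or LINK_RE.search(text) or PHONE_RE.search(text):
        return Decision.HIDE
    # Any prohibited word places the comment on hold for human review.
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & PROHIBITED_WORDS.get(lang, set()):
        return Decision.HOLD
    # Otherwise, publication depends on the automated toxicity/spam evaluation.
    if toxicity >= TOXICITY_THRESHOLD or spam_score >= SPAM_THRESHOLD:
        return Decision.HIDE
    return Decision.PUBLISH
```

Note that the order of checks matters: the default-hide rules and the prohibited-word hold apply regardless of how benign the automated scores are.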
6. Appeals and Disputes
- Total number of appeals submitted by users: [1]
- Outcome of appeals:
  - Decision upheld (content remains removed): [1]
  - Decision overturned (content restored): [0]
  - Still in process: [0]
- Average processing time for appeals: [34 days]
7. Cooperation with Authorities and Trusted Reporters
- Official requests received from judicial or administrative authorities: [0]
- Requests from trusted flaggers: [0]
8. Internal Resources and Policies
- Staff dedicated to moderation: [1 FTE]
- Specific training for moderators:
  - Moderators are trained to use our moderation platform; a demonstration video and links to tutorials are provided, and the automatic moderation rules are explained to them.
  - Because 13 languages are covered, dedicated moderators have been appointed within each editorial team.
- In case of appeal, we consult the moderators to determine whether a comment should be moderated or not. The final decision lies with the editorial teams.
- Commitments to improve transparency, speed, and fairness in moderation: ongoing training and awareness.
9. Conclusion and Commitments
- Analysis of trends observed during the period: an increase in the number of hostile comments, leading to a higher proportion of rejected comments.
- In light of the volume of reports observed, we raised the number of user reports required before a comment is hidden (from 2 to 3), in order to better safeguard freedom of expression.
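The revised reporting rule amounts to a simple threshold check, sketched below. The constant and function name are illustrative only; the report states just the threshold values themselves.

```python
# Raised from 2 to 3 during the reporting period, as described above.
REPORT_THRESHOLD = 3

def should_hide_after_reports(report_count: int) -> bool:
    """A comment is hidden once it has received at least REPORT_THRESHOLD user reports."""
    return report_count >= REPORT_THRESHOLD
```

Raising the threshold means a comment now survives two user reports instead of one before being hidden pending review.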