Commission President Ursula von der Leyen announced a new age-verification app to strengthen online child protection. 90 per cent of EU citizens back increased action. How has Europe addressed minors’ online safety so far? Ask the Euronews AI chatbot.
97 per cent of young people are online daily, and 65 per cent use social media as their main news source. Among 13 to 17-year-olds, 78 per cent check their devices hourly; 9 to 15-year-olds spend up to three hours a day on social platforms, and 25 per cent admit to smartphone addiction, according to a 2025 European Parliament report on an EU-wide minimum age for social media.
The EU has already taken steps to safeguard minors online through initiatives such as the Digital Markets Act, the Strategy for a Better Internet for Kids, and the Action Plan Against Cyberbullying.
Key regulations, notably the strengthened Digital Services Act, now contain specific guidelines for protecting children in the digital space.
However, none of these measures imposes a minimum age for accessing social media, online platforms, or AI tools.
In 2025, the European Parliament pushed for an EU-wide age limit on social media and restrictions on addictive features like infinite scrolling and engagement-driven recommendations.
Commission President Ursula von der Leyen announced an age-verification app last week, aimed at enforcing a minimum age requirement for accessing social media while prioritising user privacy.
An expert panel is currently advising the Commission on an EU-wide strategy for child safety online to avoid a confusing patchwork of national rules. Its recommendations will come by summer 2026.
Member states are outpacing Brussels. France has already approved a social media ban for under-15s, while Spain, Austria, Greece, Ireland, Denmark and the Netherlands are gearing up for urgent political action.