
European Commission Launches Guidelines on Protection of Minors and Pilots Age Verification App

The European Commission has launched a prototype of an age-verification app under the Digital Services Act (DSA) and new guidelines on the protection of minors.

These aim to help children and young people enjoy the benefits of being online—like learning, being creative, and staying connected—while reducing the risks, including exposure to harmful content and behaviour.

What are the guidelines on the protection of minors?

The guidelines on the protection of minors aim to ensure that children enjoy a high level of privacy, safety and security on online platforms. They follow an inclusive and extensive consultation process, including with young people.

Among other things, the guidelines provide recommendations to address:

  • Addictive design: Minors are particularly vulnerable to practices that can stimulate addictive behaviour. The guidelines suggest reducing minors’ exposure to such practices, and disabling features that promote the excessive use of online services, like ‘streaks’ and ‘read receipts’ on messages.
  • Cyberbullying: The guidelines recommend empowering minors to block or mute users, ensuring they cannot be added to groups without their explicit consent. They also recommend prohibiting accounts from downloading or taking screenshots of content posted by minors to prevent the unwanted distribution of sexualised or intimate content.
  • Harmful content: Some recommender systems can push children towards harmful content. The guidelines call on platforms to give young users more control over what they see by prioritising explicit feedback from users rather than relying on monitoring of their browsing behaviour. If a young user indicates they do not want to see a certain type of content, it should not be recommended again.
  • Unwanted contact from strangers: The guidelines recommend that platforms set minors’ accounts to private by default – that is, not visible to users who are not on their friends’ list – to minimise the risk of minors being contacted by strangers online.

The guidelines take a risk-based approach, similar to the Digital Services Act (DSA), recognising that different platforms may present different risks to children depending on their size, purpose, and who uses them. Platforms are expected to take steps that match the level of risk, while making sure these actions don’t unfairly limit children’s rights. Read more about the guidelines.

New age verification app

Denmark, Greece, Spain, France and Italy will be the first EU countries to test a new prototype of the age verification app. The new tool aims to provide a user-friendly and privacy-preserving age verification method. For example, it allows users to easily prove they are over 18 when accessing restricted adult content online, without revealing any other personal information.

The app will be tested and further customised in collaboration with Member States, online platforms and end-users.

The guidelines on the protection of minors outline when and how platforms should check the age of their users. They recommend age verification for adult content platforms and other platforms that pose high risks to the safety of minors. They specify that age assurance methods should be accurate, reliable, robust, non-intrusive and non-discriminatory.
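The privacy-preserving idea behind the prototype is often described as selective disclosure: a trusted issuer attests to a single claim ("over 18") and the platform verifies that attestation without seeing any other attributes. The sketch below is purely illustrative and is not the Commission's app or protocol; the issuer, claim format, and function names are assumptions chosen only to make the concept concrete.

```python
# Conceptual sketch only: a trusted issuer signs a minimal "over 18" claim,
# and a platform verifies that signature without learning anything else
# about the user. This is NOT the Commission's actual app or protocol.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical trusted issuer (e.g. a national eID provider) holds a key pair.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# The only attribute disclosed is the boolean age claim -- no name, no birth date.
claim = b"over_18=true"
signature = issuer_key.sign(claim)


def platform_accepts(claim: bytes, signature: bytes) -> bool:
    """Platform-side check: is this a valid issuer attestation of the minimal claim?"""
    try:
        issuer_public_key.verify(signature, claim)
        return claim == b"over_18=true"
    except InvalidSignature:
        return False


print(platform_accepts(claim, signature))  # True: age-restricted access may proceed
```

The actual blueprint is more sophisticated than this toy example and is designed to interoperate with the upcoming European Digital Identity Wallets, but the underlying principle is the same: the platform learns only that the age condition is met.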

Background

The guidelines for protecting minors were created through a thorough process that included research, public feedback, expert input, and workshops in late 2024 and mid-2025. The age verification blueprint, developed in early 2025, is designed to work with the upcoming European Digital Identity Wallets planned for rollout by the end of 2026. This ensures the age verification system will fit into future digital ID tools. Both the guidelines and the blueprint build on ongoing work by the European Board for Digital Services and support the Commission’s wider efforts to protect children online through initiatives like the Better Internet for Kids Strategy and the Audiovisual Media Services Directive.

More Information


Find out more about the Guidelines on the Protection of Minors

Find out more about the age verification blueprint

Fact page on the age verification blueprint
