"Today [14 July], the Commission presented guidelines on the protection of minors, as well as a prototype age verification app, under the Digital Services Act. These measures aim to ensure that children and young people can continue to enjoy the opportunities of the online world - such as education, creativity and communication - while minimising risks, including exposure to harmful content and behaviour," the institution said in a statement.

The mobile app is a prototype that is “easy to use and protects privacy, setting a standard of excellence in online age verification,” according to the EU executive.

With the app, “users must prove that they are over 18 to access restricted content, while maintaining full control over other personal information, such as their exact age or identity,” the institution explains, emphasising that “no one will be able to track or reconstruct the content viewed individually.”

The application (still in prototype form) will now be tested and adapted with the collaboration of Member States, online platforms and end users, starting with Denmark, Greece, Spain, France and Italy.

In an interview with Lusa last month, European Commissioner for Home Affairs Magnus Brunner announced the creation of this mobile application, describing it as “a good example” of how the European Commission “will not hesitate” to ensure compliance with rules for the protection of minors on the Internet.

Recalling that “minors are exposed to a multitude of risks online”, Magnus Brunner stressed that the institution will “pay particular attention to better protection against these threats”, namely through measures to protect children’s rights and safety under the new Digital Services Act and investigations into platforms.

This application, based on the same technology as the EU Digital Identity Wallet, will allow online service providers to verify that users are 18 years of age or older without compromising their privacy, thereby strengthening the protection of minors in the digital sphere.

The aim is to develop a harmonised European age verification solution that preserves privacy and is available by 2026.

Regarding the published recommendations, the EU executive suggests that minors’ accounts should be private by default and visible only to people on their friends list, to minimise the risk of contact by strangers; that minors should have greater control over what they see, to avoid harmful content; and that exposure to addictive design in features such as messaging should be reduced.

To prevent cyberbullying, the institution wants to give minors the power to block or mute users and to refuse being added to groups, and wants platforms to ban the downloading or screen capture of minors’ content, to prevent the unwanted dissemination of intimate or sexualised material.

“Platforms must ensure that the measures adopted are appropriate and do not unjustifiably restrict children’s rights,” Brussels also urges.

The guidelines were developed based on research and contributions collected between October 2024 and June 2025.

Since the end of last August, following a transition period, the EU has been the first jurisdiction in the world with binding rules for digital platforms, which are now required to remove illegal and harmful content.