On July 14, 2025, the European Commission (EC) published its guidelines (the Guidelines) on the protection of minors online. These Guidelines, which were initially released for consultation in May 2025, provide direction for online platforms on the steps they can take to comply with their duties to protect the privacy, safety, and security of minors under the EU’s Digital Services Act (DSA). They focus on assessing and mitigating platform risks, the appropriate use of age assurance, and measures that should be taken to protect minors from manipulative commercial practices.
Why Are the Guidelines Significant?
The DSA imposes a high-level requirement for online platforms to put in place “appropriate and proportionate measures” to ensure an increased level of privacy, safety, and security for minors. The Guidelines expand on this requirement by identifying the risks to minors commonly presented by online services, and by outlining various measures that the EC considers will help protect minors against these risks.
Although the Guidelines are not legally binding, there are strong incentives for providers of online platforms to take their recommendations into account when designing and modifying services. In particular, the EC notes that the Guidelines will provide a “significant and meaningful benchmark” when determining whether online platforms have complied with their obligations under the DSA, with enforcement bodies encouraged to “draw inspiration” from them when applying and interpreting the law.
Recommendations for Online Platforms
The Guidelines build on the EC’s earlier consultation, which we discussed in detail here. The key measures the EC now recommends providers of online platforms take include:
- Conducting a Risk Review. The EC notes that, due to the wide variety of online platforms, safety measures will need to be tailored, with some being more appropriate for certain platforms than others. The Guidelines therefore recommend that providers carry out a risk assessment that considers i) the likelihood that minors will access their service; ii) the actual or potential impact the platform may have on minors; iii) any current or proposed mitigating measures taken by the provider to address those impacts; iv) metrics that allow the provider to monitor the effectiveness of such measures; and v) the potential effects those measures have on minors’ rights. The EC also refers to a matrix of risks to consider based on five categories: conduct, content, contact, consumer, and cross-cutting risks, known as the 5C typology of online risks to minors (a hypothetical sketch of such a matrix appears after this list). When conducting this assessment, providers should give primary consideration to the best interests of the minor, in line with the principles set out in the Charter of Fundamental Rights of the European Union and the United Nations Convention on the Rights of the Child, as well as stakeholder views (including those of parents and minors) and the latest insights from scientific and academic research. Platform providers have considerable flexibility over the format of their risk review and could therefore look to augment or reuse risk assessments carried out under other legal regimes, such as the UK’s Online Safety Act.
- Age Assurance Measures. The Guidelines note that online platform providers should adopt a risk-based approach to implementing age assurance, deploying such measures only where necessary and proportionate to the risks identified. The EC recommends that providers publish any assessment they conduct on this point, together with information about the age assurance solutions implemented and their overall effectiveness. This should include a description of the methods used and a summary of the performance metrics applied to assess their reliability, such as false positive and false negative rates, as well as accuracy and recall rates (a worked illustration of these metrics follows this list). The EC also encourages giving users a choice of age assurance methods so that no users are excluded.
- Default Settings and AI Features. Default settings should be configured to prevent unwanted contact from individuals seeking to harm minors. In addition, the Guidelines note that, as a matter of interface design, any AI features integrated into an online platform should be turned off by default, and minors should not be encouraged or enticed to use them. AI features should only be offered on online platforms accessible to minors following an evaluation of their potential risks, and, post-deployment, minors should be reminded that i) interactions with AI features differ from human interactions, and ii) the information provided may be factually inaccurate or misleading. Such warnings should be clearly visible, written in child-friendly language, and accessible throughout the minor’s interaction with the AI features (an illustrative sketch of such defaults and warnings follows this list).
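To make the recommended risk review concrete, the following is a minimal sketch, in Python, of how a provider might record the five assessment factors against the 5C categories. The category names and the five factors come from the Guidelines; everything else here (the RiskEntry class, its field names, and the example entry) is hypothetical and purely illustrative.

```python
from dataclasses import dataclass, field

# The five risk categories of the 5C typology named in the Guidelines.
RISK_CATEGORIES = ["conduct", "content", "contact", "consumer", "cross-cutting"]

@dataclass
class RiskEntry:
    """One row of a hypothetical risk review, covering the five
    factors the Guidelines say the assessment should consider."""
    category: str                # one of the 5C categories
    description: str             # the actual or potential impact on minors
    likelihood_of_access: str    # how likely minors are to encounter the risk
    mitigations: list[str] = field(default_factory=list)            # current or proposed measures
    effectiveness_metrics: list[str] = field(default_factory=list)  # how effectiveness is monitored
    rights_impact: str = ""      # potential effects of the measures on minors' rights

    def __post_init__(self) -> None:
        if self.category not in RISK_CATEGORIES:
            raise ValueError(f"unknown 5C category: {self.category}")

# Illustrative entry only -- not drawn from the Guidelines.
entry = RiskEntry(
    category="contact",
    description="Strangers can send direct messages to minors",
    likelihood_of_access="high",
    mitigations=["direct messages from non-contacts off by default"],
    effectiveness_metrics=["rate of unsolicited-contact reports filed by minors"],
    rights_impact="limits minors' freedom to communicate; reviewed for proportionality",
)
print(entry)
```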
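The reliability metrics the Guidelines mention for age assurance are standard classification statistics. The short sketch below computes them from a confusion matrix for a system that flags users as minors; the counts are invented for illustration and are not drawn from the Guidelines.

```python
# Hypothetical confusion matrix for an age assurance check that
# classifies users as "minor" (positive) or "adult" (negative).
tp = 920   # minors correctly flagged as minors
fn = 80    # minors wrongly passed as adults (the harmful error)
fp = 150   # adults wrongly flagged as minors (the over-blocking error)
tn = 8850  # adults correctly passed

false_positive_rate = fp / (fp + tn)   # share of adults wrongly blocked
false_negative_rate = fn / (fn + tp)   # share of minors wrongly let through
accuracy = (tp + tn) / (tp + tn + fp + fn)
recall = tp / (tp + fn)                # equals 1 - false_negative_rate

print(f"FPR: {false_positive_rate:.1%}, FNR: {false_negative_rate:.1%}")
print(f"Accuracy: {accuracy:.1%}, Recall: {recall:.1%}")
```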
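Finally, one way to picture the default-settings recommendation for AI features is as account-level flags that default to off, paired with a persistent, child-friendly notice. The sketch below is a purely hypothetical illustration of that reading; none of the names or wording comes from the Guidelines.

```python
from dataclasses import dataclass

# Hypothetical child-friendly notice, shown throughout the interaction.
AI_DISCLAIMER = (
    "You are chatting with a computer program, not a person. "
    "Its answers can be wrong or misleading."
)

@dataclass
class AccountSettings:
    is_minor: bool
    ai_features_enabled: bool = False  # off by default; requires an explicit opt-in

def default_settings(is_minor: bool) -> AccountSettings:
    # AI features default to off; for minors the Guidelines additionally
    # expect a risk evaluation before the features are offered at all.
    return AccountSettings(is_minor=is_minor)

def render_ai_message(settings: AccountSettings, text: str) -> str:
    # Keep the warning visible for the whole of a minor's interaction.
    if settings.is_minor:
        return f"[{AI_DISCLAIMER}]\n{text}"
    return text

settings = default_settings(is_minor=True)
assert settings.ai_features_enabled is False
print(render_ai_message(settings, "Here is your answer..."))
```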
Next Steps
Online platform providers should carefully review the Guidelines to understand the recommended measures and assess their current practices against these standards. The EC notes that the Guidelines will be reviewed regularly, so providers should also put in place a process to monitor for updates.