Case no.: 2024-2955
Document no.: 103429
Date: 29/10/2024
The Danish Government’s response to the Commission’s call for evidence
regarding Article 28 of the Digital Services Act.
Firstly, the Danish Government would like to express its appreciation for the opportunity to respond to this consultation. Protecting minors online is a high priority for the Government, and strong enforcement and effective implementation of the DSA are pivotal to that end. Accordingly, we agree with the Commission that Article 28(1) lies at the heart of the DSA’s approach to the protection of minors, and we therefore urge the Commission to be ambitious in developing the guidelines, as they will be a valuable contribution to the effective and smooth enforcement of Article 28 in all Member States.
In this regard, we are closely following the Commission’s formal proceedings against Meta and TikTok, as these cases concern the protection of minors. Of particular interest is the interaction between the provisions on systemic risk in Articles 34 and 35 and Article 28, which is relevant in the case concerning Meta. We support further work along these lines in the continued enforcement of the DSA, as outlined in the mandate of Commissioner-Designate Virkkunen.
In the following, we elaborate on our key priorities for the guidelines:
• Following a risk-based approach to the protection of minors
• Making use of default settings to ensure a baseline of safety for minors
• Moving forward on effective and privacy-preserving age verification
• Considering systems for selecting and displaying advertisements
Risk-based approach to the protection of minors
We fully support the Commission’s willingness to take an inclusive approach and to give the guidelines a broad scope of application across the design, features, functioning and use of platforms that are accessible to minors.
Consideration should be given to guidelines that take into account the fact that many online platforms have mixed target groups and users. This entails outlining clear and practical definitions of what constitutes inappropriate content for children and young people, and how particular types of content should be categorized based on their harmful nature. As a potential contribution to this work, a national study on the definition of harmful content and functions, authorized by the Danish Parliament, is currently under development. If the guidelines categorize platform types (e.g. pornography, gaming, social media) and identify the typical risks of each, this will help the respective industries to identify and mitigate those risks.