Europaudvalget 2019-20
EUU Alm.del Bilag 811
Public
Ref. Ares(2020)2836155 - 02/06/2020
COMBINED EVALUATION ROADMAP/INCEPTION IMPACT ASSESSMENT
This combined evaluation roadmap/Inception Impact Assessment aims to inform citizens and stakeholders about the
Commission's work in order to allow them to provide feedback on the intended initiative and to participate effectively in future
consultation activities. Citizens and stakeholders are, in particular, invited to provide views on the Commission's understanding
of the current situation, problem and possible solutions and to make available any relevant information that they may have,
including on possible impacts of the different options.
TITLE OF THE INITIATIVE: Digital Services Act package: deepening the Internal Market and clarifying responsibilities for digital services
LEAD DG – RESPONSIBLE UNIT: CONNECT F.2
AP NUMBER:
LIKELY TYPE OF INITIATIVE: Legislative instrument
INDICATIVE PLANNING: Q4 2020
ADDITIONAL INFORMATION: -
This combined roadmap/Inception Impact Assessment is provided for information purposes only. It does not
prejudge the final decision of the Commission on whether this initiative will be pursued or on its final content. All
elements of the initiative described by this document, including its timing, are subject to change.
A. Context, Evaluation, Problem definition and Subsidiarity Check
Context
The Commission announced that it intends to propose new and revised rules to deepen the Internal Market for Digital Services, by increasing and harmonising the responsibilities and obligations of digital services, in particular online platforms, and by reinforcing the oversight and supervision of digital services in the EU.
The horizontal legal framework for digital services has been unchanged since the adoption of the e-Commerce Directive in 2000. The Directive harmonised the basic principles allowing the cross-border provision of services and has been a foundational cornerstone for regulating digital services in the EU. In addition, several measures taken at EU level address, through both legislative and non-binding instruments and voluntary cooperation, targeted issues related to specific illegal or harmful activities conducted online by users of digital services, and online platforms in particular.
Technologies, business models and societal challenges are evolving constantly. A wide spectrum of digital services is the backbone of an increasingly digitised world, incorporating services such as cloud infrastructure or content distribution networks. Online platforms like search engines, marketplaces, social networks, or media-sharing platforms intermediate a wide spectrum of activities and play a particularly important role in how citizens communicate, share and consume information, how businesses trade online, and which products and services are offered to consumers. Online advertising and recommender systems are a core feature of most online platforms.

1 Digital Strategy "Shaping Europe's Digital Future" of 19 February 2020, https://ec.europa.eu/info/sites/info/files/communication-shaping-europes-digital-future-feb2020_en_4.pdf
2 The term 'digital service' is used interchangeably here with 'information society service', defined as 'any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services' (Directive (EU) 2015/1535).
3 Legislation addressing specific types of illegal goods and illegal content includes: the Market Surveillance Regulation, the revised audio-visual media services directive, the directive on the enforcement of intellectual property rights, the directive on copyright in the digital single market, the regulation on market surveillance and compliance of products, the proposed regulation on preventing the dissemination of terrorist content online, the directive on combating the sexual abuse and sexual exploitation of children and child pornography, the regulation on the marketing and use of explosives precursors, etc. The Directive on better enforcement and modernisation of EU consumer protection rules added transparency requirements for online marketplaces vis-à-vis consumers, which should become applicable in May 2022.
4 The Commission has also set general guidelines to online platforms and Member States for tackling illegal content online through a Communication (2017) and a Recommendation (2018).
5 E.g. the EU Internet Forum against terrorist propaganda online, the Code of Conduct on countering illegal hate speech online, the Alliance to better protect minors online under the European Strategy for a better internet for children, the WePROTECT global alliance to end child sexual exploitation online, the Joint Action of the consumer protection cooperation network authorities, the Memorandum of Understanding against counterfeit goods, the Online Advertising and IPR Memorandum of Understanding, the Safety Pledge to improve the safety of products sold online, etc. In the framework of the Consumer Protection Cooperation Regulation (CPC), the consumer protection authorities have also taken several coordinated actions to ensure that various platforms (e.g. travel booking operators, social media, online gaming platforms, web shops) conform with consumer protection law in the EU. A package of measures was also adopted to secure free and fair elections: https://ec.europa.eu/commission/presscorner/detail/en/IP_18_5681

EUU, Alm.del - 2019-20 - Bilag 811: Note and consultation responses regarding the European Commission's roadmap for the forthcoming Digital Services Act
This digital transformation has brought major benefits but also challenges. For example, the online sale of
counterfeit, dangerous products or other illegal goods puts citizens at risk and harms legitimate businesses.
Digital services are exploited by certain users to spread illegal content online such as child sexual abuse material
or illegal hate speech. Infringements of property rights are also observed. Other risks relate to the systematic
abuse of digital services and their algorithmic processes for amplifying the propagation of disinformation online.
The safety of vulnerable users, and in particular children, is also a challenge. The outbreak of the COVID-19
pandemic has clearly confirmed both the importance of digital services and their vulnerabilities. Further concerns
relate to the social impact of the use of online platforms, and the opportunities and challenges they bring for
workers and individuals offering services through platforms.
To address these issues effectively and to avoid the emergence of an increasingly fragmented legal landscape in
the internal market, the Commission has committed to update the horizontal rules that define the responsibilities
and obligations of digital services. Against this context, the high-level goals of the
Commission’s
intended
proposal for a new Digital Services Act are to reinforce the internal market for digital services, to lay down clearer,
more stringent, harmonised rules for their responsibilities in order to increase
citizens'
safety online and protect
their fundamental rights, whilst strengthening the effective functioning of the internal market to promote innovation,
growth and competitiveness, particularly of European digital innovators, scale-ups, SMEs and new entrants. This
Inception Impact Assessment sets out this intervention logic.
The issues covered in this Inception Impact Assessment are closely connected to additional measures the
Commission is exploring:
o A separate Inception Impact Assessment [link] is exploring potential options for an ex-ante regulatory instrument for very large online platforms with significant network effects acting as gate-keepers in the EU's internal market, which was announced as part of the Digital Services Act package planned for end 2020.
o In the context of the announced evaluation and potential review of the fitness of EU competition rules for the digital age, a separate Inception Impact Assessment (link) lists potential options for a new competition tool that would complement the existing EU competition law framework, potentially applicable to all economic sectors (including individual platform ecosystems or digital markets).
o The Commission is also currently carrying out a REFIT of the General Product Safety Directive (GPSD). The fact-finding and consultation activities for the two instruments will be closely associated to ensure coherence of the legal framework. (link)
o The Commission will also 'look at ways of improving the labour conditions of platform workers' by launching a broader debate on working conditions in the context of the platform economy and consulting on a series of issues already in the open public consultation launched for the purpose of the Digital Services Act.
o The Commission is reviewing the Code of Practice on Disinformation to address the spread of disinformation, and is working towards a European Democracy Action Plan to find ways to reduce the possibility of manipulation in the public space.
Evaluation
Ahead of the Impact Assessment for the Digital Services Act, the Commission will conduct an evaluation of the e-Commerce Directive, in line with the 'evaluate first' principle of the Better Regulation guidelines.
This Directive covers all information society services ranging from the very
‘backbone’
of the Internet (e.g. internet
service providers) to online intermediaries such as web hosts, cloud services and online platforms.
Its main objectives are:
1. Strengthen the single market and establish the framework conditions for digital innovations to emerge and
for swift and effective enforcement: this rests on the principles of home state control, a mechanism for
cooperation between Member States on cross-border issues, and the guarantee of the freedom of
establishment and of the freedom to provide digital services cross-border in the Union.
6 Commission's Communication of 19 February 2020 on Shaping Europe's Digital Future, https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/shaping-europe-digital-future_en
7 Platform work can be described as an activity that uses an online platform to enable organisations or individuals to access other organisations or individuals to solve problems or to provide services in exchange for payment.
2. For the subcategory of services that intermediate third party content (e.g. internet service providers, cloud
services, web hosts, or online platforms), special provisions are aimed at allowing them to function
effectively in the internal market, by harmonising the exemption from liability for illegal content across the
single market. The underlying objective was to allow digital innovation, while also protecting users' freedom of expression.
3. Enhance trust in digital services, including by providing a high level of consumer protection and
transparency of digital services.
It is clear that these objectives remain valid. The core principles of the Directive have been the cornerstone of the internal market for digital services. Whilst they may need certain adjustments, the underpinning basis is as valid today as it was 20 years ago. The evaluation will be informed by a process of evidence-collection, not least
related to the legal and economic barriers emerging in the internal market for digital services. The effectiveness,
efficiency, relevance, coherence and EU added value of the measures in place will be evaluated, including in light
of more recent measures, such as the
Recommendation of 2018, on measures to effectively tackle illegal content
online,
as well as changes in the nature and diversity of digital services and risks they entail, notably with the
increased use of services provided from outside the Union, which are not covered today by the E-Commerce
Directive. It will cover the entire period, from the entry into force of the Directive until the present, and use a
comparative methodology, building on previously published implementation reports (2003 and 2012), and on more
recent and ongoing studies. The evidence collection will focus in particular on the past 10 years, most prominently
marked by a rapid evolution of information society services, new forms of online behaviours and case law.
Problem the initiative aims to tackle
The rapid and widespread development of digital services has been at the heart of the digital transformation,
including the rise of e-commerce and unprecedented opportunities for information to be freely shared online.
While there is a broad consensus on the benefits of this transformation,
the scale of problems arising from
digital services is substantially different
from 20 years ago. In particular, the impact of online sales of
counterfeit, dangerous or unauthorised products or other illegally traded goods (including those imported from
traders based outside of the EU) is increasing, and so is the online dissemination of illegal content such as hate
speech or child sexual abuse material.
An
entirely new set of issues
has also emerged with the scale of information intermediated online:
recommender systems and online advertising play an important part in optimising
consumers’
access to
information. At the same time, services are also abused to disseminate harmful content such as online
disinformation (which is not,
per se,
illegal), exploiting algorithmic systems to amplify the spread of the messages.
These new challenges have an important bearing on fundamental rights online, and the appropriate division of
responsibilities between the relevant stakeholders, including private and public actors.
The relative scale and impact of these issues are particularly important where the most prominent online
platforms structure at scale information flows online,
having become
de facto
‘public spaces’
in the online
world.
While a range of targeted, sector-specific interventions have been taken at EU level, significant gaps and legal
burdens remain unaddressed.
The Impact Assessment will analyse in detail a series of problems:
1. Fragmentation in the Single Market and need for reinforced cross-border cooperation
In response to the increasing role of digital services in the online trade in or dissemination of illegal goods and
content, Member States are increasingly passing laws with notable differences in the obligations imposed on
digital services, in particular online platforms, and with a variety of different enforcement mechanisms. This
creates a fragmentation of the single market that can negatively affect EU citizens and businesses in the absence
of harmonised rules and obligations. It also entails a lack of legal clarity and certainty for digital services in the
internal market, and is likely to be less effective in achieving the underlying public policy objectives.
Existing cross-border cooperation mechanisms appear insufficient for addressing local problems posed by the use
of online platforms. Stronger, more systematic and agile cooperation between authorities on the basis of common
rules for addressing illegal goods and content would strengthen their mutual trust and effective enforcement.
Options for addressing harmful, but not necessarily illegal content, would also be in scope, while respecting the
important distinction between the two. This would help better protect users, improve the functioning of the single
market, especially if combined with a capacity to gather all relevant information from platforms. Experience from
existing cooperation structures (such as
BEREC, CPC Network, RAPEX)
will be taken into consideration, as well
as the new competences introduced under the AVMSD for audiovisual regulatory authorities and their cooperation
mechanism under the European Regulators Group for Audiovisual Media Services (ERGA). The underlying
regulatory and market evolutions pose risks to the competitiveness of the internal market, by restricting the
freedom of establishment of digital businesses and their freedom to provide cross-border services. Facing
potentially 27 different regimes across the EU, an innovative digital service would struggle to benefit from the
Single Market to grow and scale up: only the largest platforms can absorb the costs of diverging rules and
procedures. This is also ineffective in protecting Europeans in a consistent and comprehensive manner across the
single market.
2. Risks for the safety of citizens online and the protection of their fundamental rights
Online platforms continue to play a significant role in the spread of illegal goods, illegal services and illegal content online.
In the absence of clearly defined responsibilities at EU level for digital services in general, and platforms in
particular, the level of protection for
citizens’
safety online is not consistent across all services. When platforms do
take steps to address illegal behaviour, they can face uncertainty under the existing legal framework (notably for
voluntary measures to detect illegal content).
At the same time, protections of users' rights, including the freedom to receive or impart information, are not always sufficiently robust across the board.
In many instances, there is also a lack of accountability in the private decisions taken by online platforms, and the
current legal framework does not allow any scrutiny of how platforms shape information flows online, including
the role played by associated online advertising services. Further strengthening of the cooperation between service providers and authorities is also needed to ensure that public policy objectives on safety and security are met.
In addition, while the e-Commerce Directive does not currently apply to services established in third countries,
services without legal establishment in the EU are increasingly gaining importance in the EU, be it in the field of e-
Commerce or in social media services. Today, these remain effectively unregulated.
3. Significant information asymmetries and ineffective oversight across the single market
There is a lack of oversight over digital services in the current legal framework, with significant information asymmetries between the services and their users (citizens and businesses), as well as with regard
to public authorities. Transparency reports published by online platforms on measures taken to counter the spread
of illegal goods or content are generally voluntary, partial, and difficult to compare across services. In addition,
when services take measures against "harmful" content, which is not per se illegal, such measures, as well as their impacts (on reducing individual and social harms, but also on freedom of expression), are difficult to scrutinise. There is also no structured, legally binding cooperation or dialogue for emerging issues, such as
unforeseen issues relating to
“harmful”
content, or unexpected forms of manipulation of
platforms’
services. This
accountability gap also concerns algorithmic processes used by online platforms, e.g. recommender systems,
content moderation tools, or ad placements.
Basis for EU intervention (legal basis and subsidiarity check)
The intervention addresses the freedoms of establishment and to provide services and the proper functioning of
the Single Market for digital services. As such, the likely legal basis would be Article 114 of the Treaty on the Functioning of the European Union and, potentially, Articles 49 and 56.
Given the fundamentally cross-border nature of many digital services and of the risks and opportunities they bring,
the intervention is best taken at EU level. The objectives cannot be effectively reached by any Member State
acting alone.
B. Objectives and Policy options
The
general objective
is to provide for a modern legal framework for digital services, strengthening the Digital
Single Market and ensuring that digital service providers present in the Union act responsibly to mitigate risks
emanating from the use of their service, respecting Union rights and values, and protecting fundamental rights.
This initiative aims to establish balanced and effective governance online and to clarify roles, procedures and responsibilities.
In the baseline scenario,
the Commission would make no changes to the current legal framework, in particular to
the E-Commerce Directive. The Commission would monitor the take-up of the
Commission’s
Recommendation on
measures to effectively tackle illegal content online
,
and the transposition of the
Directive on Copyright in the
Digital Single Market,
the recently amended
Audiovisual Media Services Directive
and the
Terrorist Content
Regulation,
once adopted. Further action would focus in particular on more self-regulatory actions, which are
naturally limited to some services participating on a voluntary basis, and with limitations regarding the
enforcement or monitoring of the results. Courts would continue to interpret the obligations of new digital services
against the framework of existing EU law, in particular the provisions for hosting services in Article 14 of the E-
Commerce Directive.
In the absence of further EU legislation, legal fragmentation is likely to increase. A patchwork of national
measures would not effectively protect citizens, given the cross-border and international dimension of the issues.
The proliferation of illicit goods sold online and the dissemination of illegal content would likely continue, and no
harmonised safeguards against over-removal of legal content would exist. Barriers for promising European
companies to scale up in the internal market would increase, reinforcing the entrenched position of large online platforms,
and reducing the competitiveness of the internal market.
Other policy options will be impact-assessed following the general lines presented below. The approach
presented in option 3 is complementary to the others.
1. A limited legal instrument would regulate online platforms' procedural obligations, essentially making the horizontal provisions of the 2018 Recommendation binding.
This would build on the scope of the e-Commerce Directive, focusing on services established in
the EU.
It would lay out the responsibilities of online platforms with regard to sales of illegal products and
services and dissemination of illegal content and other illegal activities of their users. They would
include proportionate obligations such as effective notice-and-action mechanisms to report illegal
content or goods, as well as effective redress obligations such as counter notice procedures and
transparency obligations.
This option would neither clarify nor update the liability rules of the e-Commerce Directive for
platforms or other online intermediaries.
2. A more comprehensive legal intervention, updating and modernising the rules of the e-Commerce
Directive, while preserving its main principles.
It would clarify and upgrade the liability and safety rules for digital services and remove
disincentives for their voluntary actions to address illegal content, goods or services they
intermediate, in particular in what concerns online platform services. Definitions of what is illegal
online would be based on other legal acts at EU and national level.
It would harmonise a set of specific, binding and proportionate obligations, specifying the different responsibilities in particular for online platform services. In addition to a basic set of generally
applicable obligations, further asymmetric obligations may be needed depending on the type,
size, and/or risk a digital service presents, as well as a cooperation framework and due process
requirements for crisis situations.
Obligations could include:
o harmonised obligations to maintain 'notice-and-action' systems covering all types of illegal goods, content, and services, as well as 'know your customer' schemes for commercial users of marketplaces
o rules ensuring effective cooperation of digital service providers with the relevant authorities and 'trusted flaggers' (e.g. the INHOPE hotlines for a swifter removal of child sexual abuse material) and reporting, as appropriate
o risk assessments could be required from online platforms for issues related to exploitation of their services to disseminate some categories of harmful, but not illegal, content, such as disinformation
o more effective redress and protection against unjustified removal for legitimate content and goods online
o a set of transparency and reporting obligations related to these processes
It would also explore transparency, reporting and independent audit obligations to ensure
accountability with regards to algorithmic systems for (automated) content moderation and
recommender systems, as well as online advertising and commercial communications, including
political advertising and micro-targeting aspects, beyond personal data protection rights and
obligations. Such measures would enable effective oversight of online platforms and would
support the efforts to tackle online disinformation. Issues related to legal clarity around smart
contracts would also be considered.
8 This would be coherent with the rules agreed in the AVMSD, which introduce some obligations for video-sharing platforms, including against
hate speech, and with the 2018 Recommendation.
It would explore extending coverage of such measures to all services directed towards the
European single market, including when established outside the Union, with a view to identifying
the most effective means of enforcement.
The instrument would also establish dissuasive and proportionate sanctions for systematic failure
to comply with the harmonised responsibilities or the respect of fundamental rights.
3. Options for creating an effective system of regulatory oversight, enforcement and cooperation across Member States, supported at EU level.
These options, in complement to the previous options,
would aim to reinforce the updated set of rules (as per Option 1 or 2 above). They should provide for an
effective EU-wide governance of digital services through a sufficient level of harmonisation of rules and
procedures. Based on the country-of-origin principle, these options would allow Member States’
authorities to deal with illegal content, goods or services online, including swift and effective cooperation
procedures for cross-border issues in the regulation and oversight of digital services. Public authorities' capabilities for supervising digital services would be strengthened, including through appropriate powers
for effective and dissuasive sanctions for systemic failure of services established in their jurisdiction to
comply with the relevant obligations, potentially supported at EU level. Options for effective judicial
redress would also be explored.
For all the options, coherence with sector-specific regulation (e.g. the Directive on Copyright in the Digital Single Market, the revised Audiovisual Media Services Directive, the proposal for a Regulation on Terrorist Content Online), as well as with the EU's international obligations, will be ensured. The intervention would not seek to modify rules established under these recently adopted or proposed acts, but to complement them by renewing the horizontal rules for digital services.
C. Preliminary Assessment of Expected Impacts
The impact assessment will assess, for each option, the following types of likely impacts:
Likely economic impacts
The expected economic impacts are mainly related to benefits deriving from the removal of legal barriers in the
internal market for digital services, and greater legal certainty, to be assessed against the cost of harmonised
rules for such services across the EU. Particular attention will be paid to the impacts on innovative European
SMEs, and their ability to scale up across borders in the internal market of the EU. The competitiveness of digital
services in the single market will be a decisive factor in ensuring a swift economic recovery following the COVID-
19 crisis. A strong digital services sector will to a large extent drive growth and enable a series of subsequent
services, including the sale of goods, to thrive. There would also be positive indirect impacts for businesses
engaged in legitimate trade in goods and services, including quality media.
The impacts on service providers from third countries would also be assessed, given that those would be in scope
of Option 2.
The assessment of economic impacts would consider the size and capacity of the targeted service providers, and would be weighed against the costs of mechanisms already implemented today.
Likely social impacts
Importantly, digital services are fundamental means of communication, from offering the infrastructure to intermediating flows of information. Greater trust that citizens' safety is protected will also ensure greater uptake of
digital services. The analysis will also assess impacts on the protection of consumers against illegal goods sold
online, a reduction of hate speech online, better protection of minors as well as emerging systemic harms with an
impact on our society and our democratic systems. The ability for public authorities to effectively ensure that the
law is applied online will also be taken into account.
Likely environmental impacts
Environmental impacts are expected to be relatively marginal for all options compared to the baseline. The
initiative is not expected, following an initial assessment, to increase the environmental footprint of digital services, nor
the absolute volumes of e-commerce compared to the baseline, even though the shipping of illegal goods is
expected to decrease significantly. However, given the variety of sectors online platforms are active in (tourism
activities, sales of goods, transport, etc.), the environmental implications could also be diverse and the Impact
Assessment would need to analyse them in more detail.
Likely impacts on fundamental rights
A range of fundamental rights are affected by the regulation of intermediaries, as confirmed by case law of the
CJEU and the ECHR. This concerns in particular, but not exclusively, the freedom to receive and impart
information and ideas without interference by public authority and regardless of frontiers and the right to an
effective remedy. The options will be assessed in particular against the opportunity to design at EU level a system
with the appropriate and effective checks and balances to ensure that fundamental rights are protected online.
The impact assessment will pay particular attention to direct effects and incentives which could lead to the
restriction of any fundamental right. Where there is a risk that such negative impacts could emerge, appropriate
responses will be designed to ensure effective redress and a fair balance for the protection of rights online, in
compliance with the Charter of Fundamental Rights of the European Union and international standards.
Other fundamental rights to be considered relate in particular to the protection of personal data and privacy, non-
discrimination and gender equality, freedom of assembly, the rights of the child, the right to an effective remedy,
the freedom to conduct a business and the protection of intellectual property.
Likely impacts on simplification and/or administrative burden
Whereas all options are expected to impose some costs on public authorities to ensure smooth enforcement and cooperation, it is likely that these costs will be counterbalanced by significant efficiency gains in the enforcement of the law across the Union. This will be further assessed in the Impact Assessment.
D. Evidence base, Data collection and Better Regulation Instruments
Impact assessment
Following the evaluation of the e-Commerce Directive, an Impact Assessment will be conducted to further define
the problem analysis and the policy options and to compare their impacts. The Impact Assessment is expected to
be finalised and submitted to the Regulatory Scrutiny Board of the European Commission in the second half of
2020.
Evidence base and data collection
The evaluation and the impact assessment will further build on detailed evidence gathered over the past years, in particular concerning the legal assessment of the current implementation of the e-Commerce Directive and evidence of emerging legal fragmentation. In addition, more granular data is being collected regularly on specific types of illegal content and goods in the context of the structured dialogues and voluntary cooperation coordinated by the Commission in several policy areas, such as unsafe products, illegal hate speech, child sexual abuse material (and related cooperation between law enforcement, hotlines and industry), counterfeit products, and dissemination of terrorist content, amongst others. Evidence will also be gathered through the parallel initiative on the evaluation and revision of the General Product Safety Directive.
The Commission will further launch targeted studies, in particular in relation to the economic analysis underpinning the impact assessment, the state of the art in content moderation technologies, as well as the evolution of certain digital services.
Consultation strategy
The evaluation and the impact assessment will be underpinned by a broad stakeholder consultation, with a
threefold objective:
- Gather stakeholders’ views on the current and emerging problems in the environment of digital services;
- Gather evidence and concrete data underpinning the problems;
- Collect informed views on the potential policy approach, options, and perceived impacts.9
The consultation strategy will in particular aim at collecting views from the most concerned stakeholders, including: users of online platforms (and organisations representing their interests); digital services and intermediaries, ranging from internet service providers to cloud infrastructure services, content distribution networks, cloud-based software services, domain name service providers, etc.; third-party businesses involved in the ecosystem (such as providers of content moderation tools, users of online platforms, payment providers, data intermediaries, advertisers, content creators, right holders, brands, etc.) and organisations representing them; civil society representing different interests, such as the protection of digital rights, consumer protection, and the defence of vulnerable groups and victims; social partners and other business or employee associations; individual professionals and workers; as well as national authorities, academia, and individual citizens.
The consultation strategy will include an online open public consultation, covering a wider spectrum of issues and
problems than the scope of this Inception Impact Assessment, as well as a series of targeted consultation tools,
including expert workshops and an exchange of experience with Member States.
Will an Implementation plan be established?
9 Some of the references are already presented in more detail in the annexes of the Impact Assessment conducted in 2018:
https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preventing-terrorist-content-online-swd-408_en.pdf
Pending further analysis of the options and their complexity in the impact assessment, an implementation plan
could be developed.