
Document 32023H2425

Commission Recommendation (EU) 2023/2425 of 20 October 2023 on coordinating responses to incidents in particular arising from the dissemination of illegal content, ahead of the full entry into application of Regulation (EU) 2022/2065 of the European Parliament and of the Council (the Digital Services Act) (notified under document C(2023) 7170)

C/2023/7170

OJ L, 2023/2425, 26.10.2023, ELI: http://data.europa.eu/eli/reco/2023/2425/oj (BG, ES, CS, DA, DE, ET, EL, EN, FR, GA, HR, IT, LV, LT, HU, MT, NL, PL, PT, RO, SK, SL, FI, SV)



Official Journal
of the European Union

EN

Series L


2023/2425

26.10.2023

COMMISSION RECOMMENDATION (EU) 2023/2425

of 20 October 2023

on coordinating responses to incidents in particular arising from the dissemination of illegal content, ahead of the full entry into application of Regulation (EU) 2022/2065 of the European Parliament and of the Council (the ‘Digital Services Act’)

(notified under document C(2023) 7170)

THE EUROPEAN COMMISSION,

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 292 thereof,

Whereas:

(1)

The world is witnessing an unprecedented period of conflict and instability, with Russia’s war of aggression against Ukraine and the terrorist attacks by Hamas in Israel. With the wide reach of social media, violence and war increasingly reverberate online in the Union. The consequence has been an unprecedented increase in illegal and harmful content disseminated online, including coordinated actions to spread disinformation and misinformation throughout the Union in relation to such international crises.

(2)

Online platforms, in particular, play an important role in the dissemination of information throughout the Union. On one hand, online platforms constitute key channels of communication for Union citizens and can provide meaningful information to governments and public authorities. They facilitate the public debate and the dissemination of information, opinions, and ideas to the public, and influence how citizens obtain and communicate information online. On the other hand, online platforms can be misused as a means to disseminate and amplify illegal or harmful content online.

(3)

With Regulation (EU) 2022/2065 of the European Parliament and of the Council (1), the Union has laid down ground-breaking rules to secure its online information environment, protecting vital informational freedoms, especially in times of conflict, but also requiring effective responses to the dissemination of illegal content online and threats to civic discourse, elections and public security. That Regulation contributes to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter of Fundamental Rights of the European Union, are effectively protected (2).

(4)

The Regulation does so, in particular, by imposing specific due diligence obligations tailored to specific categories of intermediary services providers, and by putting in place a governance structure to ensure cooperation and coordination between the competent authorities of the Member States and the Commission in monitoring and enforcing those obligations, including the possibility of drawing up crisis protocols pursuant to Article 48.

(5)

While Regulation (EU) 2022/2065 will only apply in full as from 17 February 2024, it already applies to the providers of online platforms and of online search engines which the Commission, on 25 April 2023, designated as very large online platforms and as very large online search engines pursuant to Article 33(4) of that regulation (3). While the Member States are only obliged to designate their Digital Services Coordinators and other national competent authorities responsible for the monitoring and enforcement of Regulation (EU) 2022/2065 by 17 February 2024 (4), the Commission may already deploy the enforcement powers entrusted to it under Section 4 of Chapter IV of that regulation in respect of the very large online platforms and of very large online search engines it designated on 25 April 2023 (5).

(6)

However, the effective monitoring and enforcement of Regulation (EU) 2022/2065 by the Commission in relation to those designated very large online platforms and very large online search engines requires the assistance of, and active cooperation with, Member State national authorities. In several instances, the provisions of Section 4 of Chapter IV of that regulation explicitly require the Commission to cooperate with the European Board for Digital Services (‘the Board’) (6), Digital Services Coordinators, and other national competent authorities which the Member States plan to entrust with the monitoring and enforcement of that regulation in their territory.

(7)

The fact that several Member States have not yet designated their Digital Services Coordinators and that the Board has not yet been constituted complicates the monitoring and enforcement of that regulation by the Commission, prior to the full entry into application thereof, in relation to designated very large online platforms and very large online search engines to which Regulation (EU) 2022/2065 already applies (7). Nevertheless, the Commission is committed to ensuring the full effectiveness of that regulation in relation to providers of such services.

(8)

By the date of adoption of this Recommendation, less than 10 % of Member States have formally appointed their Digital Services Coordinator. In many Member States, however, existing regulatory authorities have been preliminarily identified to assume the role of Digital Services Coordinator and the national legislative processes have been initiated. To that end, the Commission encourages the Member States, until the governance structure foreseen by Regulation (EU) 2022/2065 is fully in place, to appoint an independent authority to be part of an informal network of prospective Digital Services Coordinators, as their role is essential to identify and tackle incidents, in particular those arising from the dissemination of illegal content, posing a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof, including those which risk leading to a serious threat to public security or public health in the Union or in significant parts of it. They are encouraged to meet regularly among themselves and with the Commission in an informal network to discuss such incidents arising from illegal content disseminated on very large online platforms and very large online search engines, to which that regulation already applies. Such incidents may include, in particular, the dissemination of illegal content in relation to international conflicts, acts of terrorism, public health emergencies, electoral processes, etc.

(9)

The Commission also promotes the convening of specific meetings in response to an incident to achieve an agile, coordinated and proportionate response in light of the application of Regulation (EU) 2022/2065 by providers, as well as among Union institutions and Member States, to streamline communication in urgent situations and to allow for widespread situational awareness.

(10)

The Member States are also encouraged to assist the Commission in its task of monitoring and enforcement of Regulation (EU) 2022/2065 in relation to designated very large online platforms and very large online search engines. In this context, the Member States are encouraged to gather evidence on the dissemination of illegal content through very large online platforms and very large online search engines on their territory and to share that evidence with the Commission so that it can properly and swiftly respond to such content.

(11)

Regulation (EU) 2022/2065 does not determine whether a particular type of content qualifies as illegal content. The unlawfulness of content is determined by national laws or, where harmonised, by European rules. Several acts of Union law provide for a legal framework in respect of certain particular types of illegal content that are presented and disseminated online and harmonise what should be considered illegal across the Union. In particular, Directive (EU) 2017/541 of the European Parliament and of the Council (8) establishes minimum rules concerning the definition of criminal offences and sanctions in the area of terrorist offences, offences related to a terrorist group and offences related to terrorist activities, as well as measures of protection of, and support and assistance to, victims of terrorism.

(12)

In addition, Regulation (EU) 2021/784 of the European Parliament and of the Council (9) defines specifically what constitutes terrorist content online, namely material that incites the commission of a terrorist offence, glorifies terrorist acts, advocates for the commission of such offences, solicits to commit or contribute to, or participate in, activities related to terrorist offences, provides instruction on the making of several types of weapons for the purposes of terrorism or constitutes a threat to commit a terrorist offence. It also provides the legal framework for Member States to send removal orders to hosting service providers, obliging the removal of the content within one hour. It further requires hosting service providers to implement specific measures to prevent the exploitation of their services if exposed to terrorist content.

(13)

Similarly, Council Framework Decision 2008/913/JHA (10) requires Member States to criminalise several intentional conducts related to public incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin. It also requires Member States to criminalise intentional conduct condoning, denying or grossly trivialising crimes of genocide, crimes against humanity, war crimes and crimes against peace, directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin, when the conduct is carried out in a manner likely to incite violence or hatred against such a group or a member of such a group.

(14)

The Commission further recalls that it is already possible for competent national authorities to issue injunctions against intermediary service providers whose services are being used to disseminate illegal content online. In the current context, it is of crucial importance that the competent national authorities proceed swiftly in identifying such illegal online content and issue removal orders on the basis of their national systems. Article 9 of Regulation (EU) 2022/2065 makes clear that such orders can be issued on a cross-border basis. In view of the risk of incidents, it is of primary importance that the competent authorities collect all necessary evidence to allow effective measures against the amplification of illegal online content concerning often horrendous crimes, and make use of the powers conferred on them by the different instruments of Union law to tackle illegal content.

(15)

The multiplicity of national and Union legislation and different forms of coordination in relation to illegal content increases the need to ensure coordination between Member States in the phase leading up to the full application of Regulation (EU) 2022/2065. Taking swift and coordinated action is key to preventing illegal content, and in particular terrorist content and illegal hate speech, from circulating online, including by going viral. Uncoordinated action taken at national level against the amplification of illegal content online may increase the risk of legal fragmentation and uncertainty, and increase friction and response times. Furthermore, as recognised by Regulation (EU) 2022/2065, the Commission is better placed to enforce that regulation in relation to providers of very large online platforms and of very large online search engines. With this in mind, it is desirable that Member States act in a coordinated manner in support of the eventual enforcement actions that the Commission may take when exercising its powers set out in Section 4 of Chapter IV of Regulation (EU) 2022/2065.

(16)

The Commission further recalls that several voluntary cooperation frameworks exist to address the dissemination of illegal content online.

(17)

Taking swift and coordinated action in crisis situations is key to preventing illegal content, and in particular terrorist content and illegal hate speech, from spreading virally online. The EU Crisis Protocol, developed in 2019 in the context of the EU Internet Forum, and updated in 2023, provides for a voluntary mechanism for a coordinated and rapid cross-border response by online services providers and law enforcement to a suspected crisis in the online space, stemming from a terrorist or a violent extremist act. The EU Crisis Protocol establishes procedures, roles and responsibilities of key actors, in particular to prevent disruption of investigations and ensure evidence gathering, and is based on voluntary cooperation among the EU Internet Forum members. Member States can activate the Protocol, in consultation with Europol’s EU Internet Referral Unit (IRU). The EU IRU takes a leading role in the coordination between national law enforcement authorities and online service providers. Preservation of removed content is also key to allowing for the reinstatement of unduly removed content and protecting fundamental freedoms.

(18)

In the context of the Code of conduct on countering hate speech online, major social media platforms, some of which have been designated very large online platforms under Regulation (EU) 2022/2065, have committed to assess, in the majority of cases within 24 hours, and if necessary remove hate speech content notified to them; their compliance is assessed by a network of trusted flaggers. The Commission and the signatories are currently reviewing the Code of conduct, also in the context of the entry into application of Article 45 of Regulation (EU) 2022/2065, to introduce commitments that can help mitigate systemic risks and anticipate threats of waves of illegal hate speech before content has gone viral online.

(19)

Regulation (EU) 2022/2065 provides for coordination mechanisms to react to emergency situations. However, as the recent events demonstrate, extraordinary circumstances are already occurring before 17 February 2024, affecting the European digital space. Such extraordinary circumstances, triggered by specific incidents or crises arising from the dissemination of illegal content, pose a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof. This situation requires coordinated action at Union level now, well before the application date of the relevant provisions in Regulation (EU) 2022/2065 (i.e. 17 February 2024).

(20)

With regard to such emergency threats, action taken at national level against the amplification of illegal content online risks being uncoordinated, leading to legal fragmentation and uncertainty, and increasing friction and response times. Furthermore, as recognised by Regulation (EU) 2022/2065, the Commission is better placed to enforce the Regulation as regards the systemic application of the rules by providers of very large online platforms and of very large online search engines. With this in mind, Member States should be encouraged to act in a coordinated manner in support of the eventual enforcement actions that the Commission may take when fulfilling its role set out in Regulation (EU) 2022/2065.

(21)

Involving law enforcement in the planning of national response to tackle illegal content is important so that taken or planned measures do not interfere with their work, in particular when there is an imminent threat to life.

(22)

In view of the unprecedented period of conflict and instability affecting the Union, this Recommendation sets out mechanisms of preparedness, cooperation and coordination between the Commission and the Member States ahead of the full application of Regulation (EU) 2022/2065 on 17 February 2024, in a spirit of sincere cooperation, to allow a speedy transition towards the application of that Regulation and to ensure its full effectiveness from the outset. This Recommendation does not aim to replace or supplement the mechanisms of enforcement or the framework of obligations laid down in Regulation (EU) 2022/2065.

(23)

The Commission will assess the experience in the application of this Recommendation once it expires, i.e. when Regulation (EU) 2022/2065 enters fully into application.

(24)

This Recommendation should apply until 17 February 2024,

HAS ADOPTED THIS RECOMMENDATION:

PURPOSE OF THIS RECOMMENDATION

This Recommendation encourages Member States to respond in a coordinated and consistent manner to incidents, in particular those arising from the dissemination of illegal content, posing a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof, including those which risk leading to a serious threat to public security or public health in the Union or in significant parts of it, vis-à-vis designated very large online platforms and very large online search engines pursuant to Regulation (EU) 2022/2065, prior to 17 February 2024.

DEFINITIONS

For the purposes of this Recommendation the following definitions apply:

(a)

‘Digital Services Coordinator’ means the Digital Services Coordinator designated by each Member State pursuant to Article 49 of Regulation (EU) 2022/2065;

(b)

‘Board’ means the European Board for Digital Services established pursuant to Article 61 of Regulation (EU) 2022/2065;

(c)

‘Very large online platforms’ and ‘very large online search engines’ means online platforms and online search engines designated pursuant to Article 33(4) of Regulation (EU) 2022/2065.

SPECIFIC RECOMMENDATIONS

Informal network of prospective Digital Services Coordinators for cooperation and coordination prior to 17 February 2024

1.

Prior to 17 February 2024, the Member States are encouraged, through an informal network (‘the Informal Network’), to coordinate their actions in relation to the dissemination of illegal content on very large online platforms and on very large online search engines that have already been designated pursuant to Article 33(4) of Regulation (EU) 2022/2065.

2.

Those Member States that have already appointed, or at least identified, their independent Digital Services Coordinator pursuant to Article 49 of Regulation (EU) 2022/2065 are encouraged to share with the Commission the contact details of the responsible authority that has been or will be designated. All other Member States are encouraged to do so as soon as possible, including on an ad hoc basis, to participate in the network of prospective Digital Services Coordinators referred to in paragraph 1.

3.

Other Member States are encouraged to appoint a high-level official to participate in the Informal Network and to share with the Commission the contact details of the authority which that official represents that can serve as a point of contact until the designation of their Digital Services Coordinator.

Specific meetings to coordinate responses

4.

The Commission, on its own initiative or upon the recommendation of one or more members of the Informal Network of prospective Digital Services Coordinators, could convene the Informal Network. The Informal Network is recommended to cooperate with the Commission to respond via specific meetings to incidents, in particular those arising from the dissemination of illegal content, posing a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof, including those which risk leading to a serious threat to public security or public health in the Union or in significant parts of it.

5.

The Commission encourages the Member States to participate actively in meetings of the Informal Network.

6.

The responses coordinated by the Informal Network could include the following tools:

Regular incident response meetings

7.

In the context of such incidents, the Commission recommends that the informal network of prospective Digital Services Coordinators meets on a regular basis to achieve a coordinated understanding of the development of the extraordinary circumstances at national level and to propose a framework for any follow-up action that may be considered necessary in view of the identified extraordinary circumstances.

8.

Such meetings should cover the following aspects:

(a)

exchange of information, good practices, methodologies, technical systems and tools with the purpose of supporting supervisory efforts regarding very large online platforms and very large online search engines in the context of a crisis;

(b)

exchange of information collected at national level from competent national authorities as regards the identification of online illegal content related to the situation of crisis and its amplification by very large online platforms and very large online search engines, including, where available, information regarding its effect on local public opinion.

Information gathering

9.

The informal network of prospective Digital Services Coordinators could, where relevant, provide meaningful information regarding the functioning and design of relevant very large online platforms and very large online search engines, gathered in the exercise of their respective tasks and within their competences set out in Regulation (EU) 2022/2065 or from other competent authorities in their respective Member State.

Support to the Commission in monitoring and enforcing Regulation (EU) 2022/2065

10.

Prior to 17 February 2024, the Member States are encouraged to assist the Commission in the exercise of its powers under Section 4 of Chapter IV of Regulation (EU) 2022/2065 in relation to providers of very large online platforms and of very large online search engines that have already been designated pursuant to Article 33(4) of that regulation.

11.

Such assistance may consist of:

(a)

assisting the Commission in conducting interviews pursuant to Article 68 of Regulation (EU) 2022/2065;

(b)

assisting the Commission in carrying out inspections within their territory pursuant to Article 69 of Regulation (EU) 2022/2065, in accordance with the applicable national laws and procedures.

Encouraging participation in existing voluntary cooperation frameworks

12.

Member States are further encouraged to participate in existing voluntary cooperation frameworks to address the dissemination of illegal content online. Such voluntary cooperation frameworks include, in particular, the EU Crisis Protocol, which provides for a voluntary mechanism to respond to a suspected crisis in the online space, stemming from a terrorist or a violent extremist act. Moreover, Member States are encouraged to coordinate via international counter-terrorism fora, such as the Christchurch Call and the industry-led Global Internet Forum to Counter Terrorism.

Period of application

13.

This Recommendation shall apply until 17 February 2024.

14.

This Recommendation is addressed to the Member States.

Done at Brussels, 20 October 2023.

For the Commission

Thierry BRETON

Member of the Commission


(1)  Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) (OJ L 277, 27.10.2022, p. 1).

(2)  Article 1(1) of Regulation (EU) 2022/2065.

(3)  The list of designated services was published in the Official Journal of the European Union, pursuant to Article 33(6) of Regulation (EU) 2022/2065 (OJ C 249, 14.7.2023, p. 2).

(4)  See Article 49(3) of Regulation (EU) 2022/2065.

(5)  Pursuant to Article 56(2) of Regulation (EU) 2022/2065, the Commission enjoys exclusive powers to monitor and enforce Section 5 of Chapter III of Regulation (EU) 2022/2065, which contains enhanced due diligence obligations applicable to designated very large online platforms and very large online search engines. Pursuant to Article 56(3) of that regulation, the Commission also enjoys the power to monitor and enforce the due diligence obligations laid down in that regulation, other than those laid down in Section 5 of Chapter III thereof, against providers of very large online platforms and of very large online search engines.

(6)  Pursuant to Article 61 of Regulation (EU) 2022/2065, the Board is an independent advisory group on the supervision of providers of intermediary services, made up of the Digital Services Coordinators.

(7)  Pursuant to Article 63 of Regulation (EU) 2022/2065, the Board shall, inter alia, advise the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis providers of very large online platforms or of very large online search engines and having regard, in particular, to the freedom of the providers of intermediary services to provide services across the Union.

(8)  Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).

(9)  Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).

(10)  Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law (OJ L 328, 6.12.2008, p. 55).



ISSN 1977-0677 (electronic edition)

