
Online Platforms' Moderation of Illegal Content Online: Law, Practices and Options for Reform

Policy Department for Economic, Scientific and Quality of Life Policies
Directorate-General for Internal Policies
Authors: Alexandre DE STREEL et al.
PE 652.718 – June 2020
EN

STUDY Requested by the IMCO committee

Abstract

Online platforms have created content moderation systems, particularly in relation to tackling illegal content online. This study reviews and assesses the EU regulatory framework on content moderation and the practices of key online platforms. On that basis, it makes recommendations to improve the EU legal framework within the context of the forthcoming Digital Services Act. This document was provided by the Policy Department for Economic, Scientific and Quality of Life Policies at the request of the committee on Internal Market and Consumer Protection (IMCO).

This document was requested by the European Parliament's committee on Internal Market and Consumer Protection.

AUTHORS
UNIVERSITY OF NAMUR (CRIDS/NADI): Alexandre DE STREEL, Elise DEFREYNE, Hervé JACQUEMIN, Michèle LEDGER, Alejandra MICHEL
VVA: Alessandra INNESTI, Marion GOUBET, Dawid USTOWSKI

ADMINISTRATOR RESPONSIBLE
Christina RATCLIFF

EDITORIAL ASSISTANT
Roberto BIANCHINI

LINGUISTIC VERSIONS
Original: EN

ABOUT THE EDITOR
Policy departments provide in-house and external expertise to support EP committees and other parliamentary bodies in shaping legislation and exercising democratic scrutiny over EU internal policies. To contact the Policy Department or to subscribe for email alert updates, please write to:
Policy Department for Economic, Scientific and Quality of Life Policies
European Parliament
L-2929 – Luxembourg
Email: Poldep-Economy

Manuscript completed: June 2020
Date of publication: June 2020
© European Union, 2020

This document is available on the internet at: -analyses

DISCLAIMER AND COPYRIGHT
The opinions expressed in this document are the sole responsibility of the authors and do not necessarily represent the official position of the European Parliament. Reproduction and translation for non-commercial purposes are authorised, provided the source is acknowledged and the European Parliament is given prior notice and sent a copy.

For citation purposes, the study should be referenced as: De Streel, A. et al., Online Platforms' Moderation of Illegal Content Online, Study for the committee on Internal Market and Consumer Protection, Policy Department for Economic, Scientific and Quality of Life Policies, European Parliament, Luxembourg, 2020.

© Cover image used under licence from Adobe Stock

CONTENTS
LIST OF ABBREVIATIONS 6
LIST OF FIGURES 8
LIST OF TABLES 8
EXECUTIVE SUMMARY 9
1. SCOPE AND OBJECTIVES OF THIS STUDY 14
2. EU REGULATORY FRAMEWORK ON ONLINE CONTENT MODERATION 15
2.1. Definition of illegal content online 16
2.1.1. Online content illegal under EU law 16
2.1.2. Illegal content online under national law 17
2.2. EU regulatory framework on moderation of illegal content online 18
2.2.1. EU rules applicable to all online platforms 19
2.2.2. Additional rules applicable to Video-Sharing Platforms 24
2.2.3. Stricter rules applicable for terrorist content 25
2.2.4. Stricter rules applicable for child sexual abuse material 28
2.2.5. Stricter rules applicable for racist and xenophobic hate speech 29
2.2.6. Stricter rules applicable for violation of Intellectual Property 31
2.2.7. Summary of the EU regulatory framework 32
2.3. EU rules regarding the moderation of online disinformation 34
2.4. Summary of some national laws and initiatives on online content moderation 36
3. ONLINE MODERATION PRACTICES AND THEIR EFFECTIVENESS 40
3.1. Measures taken by stakeholders and their effectiveness 43
3.1.1. Moderating measures deployed by online platforms 43
3.1.2. Online platforms' perspective on the effectiveness of the deployed moderating measures 44
3.1.3. Other stakeholders' perspective on the effectiveness of the moderating practices 45
3.2. Involvement of platforms' users in reporting illegal content online 46
3.2.1. Online platforms' perspective 46
3.2.2. Other stakeholders' perspective 49
3.3. Challenges in moderating illegal content online 51
3.3.1. Challenges in moderating and reporting illegal content online and enforcing legal rules 51
3.3.2. Duty of care regimes 53
3.3.3. Solutions to improve the moderation of illegal content by online platforms 54
3.4. Other issues 57

3.4.1. Liability under the e-Commerce Directive 57
3.4.2. Freedom of speech issues 58
3.4.3. Online discrimination issues 59
3.5. Specific private initiatives 60
3.5.1. Facebook: Oversight Board for content moderation decisions 60
3.5.2. Article 19: Social Media Council initiative 61
3.5.3. Twitter: BlueSky initiative to build decentralised standards for social networks 62
3.6. Specific practices during the COVID-19 pandemic 62
3.6.1. Specific measures to tackle illegal content online 62
3.6.2. Specific measures to tackle online disinformation 63
3.6.3. Results from the platform interviews 64
4. INTERNATIONAL BENCHMARKING 66
4.1. United States 67
4.1.1. Regulatory and policy framework 67
4.1.2. Recommendations on best practices 68
4.2. Canada 69
4.2.1. Regulatory and policy framework 69
4.2.2. Recommendations on best practices 70
4.3. Australia 71
4.3.1. Regulatory and policy framework 71
4.3.2. Recommendations on best practices 71
4.4. Latin American countries 72
4.4.1. Regulatory and policy framework 72
4.4.2. Recommendations on best practices 73
4.5. China 74
4.5.1. Regulatory and policy framework 74
4.5.2. Recommendations on best practices 75
4.6. Japan 75
4.6.1. Regulatory and policy framework 75
4.6.2. Recommendations on best practices 75
5. POLICY RECOMMENDATIONS FOR THE DIGITAL SERVICES ACT 76
5.1. Principles on which a reform should be based 77
5.2. The baseline regime: strengthening procedural accountability of online platforms 78
5.2.1. Increased role for users and trusted flaggers 79
5.2.2. Preventive measures 79
5.3. Aligning responsibility with risks 80

LIST OF ABBREVIATIONS
ADR Alternative Dispute Resolution
AI Artificial Intelligence
AVMSD Audio-Visual Media Service Directive
BEUC European Consumer Organisation
CCIA Computer and Communications Industry Association
CDA Communications Decency Act
CDSMD Copyright in the Digital Single Market Directive
CDT Center for Democracy & Technology
CEN Comité Européen de Normalisation – European Committee for Standardisation
CENELEC Comité Européen de Normalisation en Electronique et en Electrotechnique – European Committee for Electrotechnical Standardisation
CEO Chief Executive Officer
CEP Counter Extremism Project
CITES Convention on International Trade in Endangered Species of Wild Fauna and Flora
CPC Consumer Protection Cooperation
CRFD Counter-Racism Framework Decision
CSO Civil Society Organisation
CSAED Child Sexual Abuse and Exploitation Directive
CSAM Child Sexual Abuse Material
CTD Counter-Terrorism Directive
DSA Digital Services Act
ECD e-Commerce Directive
ECHR European Court of Human Rights
EDiMA European Digital Media Association
ERGA European Regulators Group for Audio-Visual Media Services
ETSI European Telecommunications Standards Institute
EU European Union
EuroISPA European Internet Services Providers Associations
FRA European Union Agency for Fundamental Rights
GDPR General Data Protection Regulation
HLEG High-Level Expert Group

HSP Hosting Service Provider
ICT Information and Communication Technologies
IPR Intellectual Property Rights
ISP Internet Service Provider
KPI Key Performance Indicator
MoU Memorandum of Understanding
MS Member State
N&A Notice-and-Action
NCMEC National Center for Missing and Exploited Children (US)
NetzDG German Network Enforcement Act
NGO Non-Governmental Organisation
OFCOM The regulator and competition authority for the UK communications industries
PSCSP Public Space Content-Sharing Platform
SMC Social Media Council
TERREG Proposal for a Regulation on preventing the dissemination of terrorist content online
UK United Kingdom
UN United Nations
URL Uniform Resource Locator
US United States
VSP Video-Sharing Platform
WHO World Health Organization

LIST OF FIGURES
Figure 1: EU regulatory framework for online content moderation 19
Figure 2: Survey replies per type of stakeholder 42

LIST OF TABLES
Table 1: Main EU rules against illegal content online 33
Table 2: Comparing EU legislation on online content moderation 34
Table 3: Comparing national laws or initiatives in Germany, France and the UK 38
Table 4: Online content moderation practices in times of COVID-19 63

EXECUTIVE SUMMARY

EU regulatory framework on online content moderation

The EU regulatory framework on content moderation is increasingly complex and has been differentiated over the years according to the category of online platform and the type of content, reflecting a risk-based approach.

The e-Commerce Directive of 2000 contains the baseline regime applicable to all categories of platforms and all types of content. The Directive provides the following rules: (i) the 'country of origin' principle, which is the cornerstone of the Digital Single Market; (ii) an exemption from liability for hosting platforms which remain passive and neutral and which remove illegal content online as soon as they are made aware of it; (iii) the prohibition of general monitoring measures, to protect fundamental rights; and (iv) the promotion of self- and co-regulation as well as alternative dispute resolution mechanisms.

This baseline regime was complemented in 2018 by the revised Audio-Visual Media Services Directive, which imposes additional obligations on one category of online platforms, the Video-Sharing Platforms. They should take appropriate and proportionate measures, preferably through co-regulation, in order to protect the general public from illegal content (terrorist content, child sexual abuse material, racism and xenophobia or other hate speech), and to protect minors from harmful content. Those measures must be appropriate in light of the nature of the content, the category of persons to be protected and the rights and legitimate interests at stake, and proportionate taking into account the size of the platform and the nature of the service provided.
Those rules are then strengthened by stricter rules for four types of content whose illegality has been harmonised at EU level:

First, the Counter-Terrorism Directive defines the public provocation to commit a terrorist offence and requires Member States, following transparent procedures and with adequate safeguards, to take removal and blocking measures against websites containing or disseminating terrorist content. The European Commission aims to go further and has made a proposal, not yet adopted by the EU co-legislators, for a regulation which would require hosting service providers to take measures to remove terrorist content.

Second, the Child Sexual Abuse and Exploitation Directive defines child pornography and requires Member States, following transparent procedures and with adequate safeguards, to take removal and blocking measures against websites containing or disseminating child sexual abuse material.

Third, the Counter-Racism Framework Decision provides that Member States must ensure that racist and xenophobic hate speech is punishable, but does not impose detailed obligations related to online content moderation practices.

Fourth, the Copyright in the Digital Single Market Directive establishes a new liability regime for online content-sharing platforms: they must conclude an agreement with the rights-holders for the exploitation of their works and, if they fail to do so, they are liable for content violating copyright on their platforms unless they make their best efforts to prevent such violations.

Those stricter rules imposed by EU hard law are all complemented by self-regulatory initiatives agreed by the main online platforms, often at the initiative of the European Commission. They contain a range of commitments, some of which are directly related to content moderation practices and others which support such practices.
However, the evaluation of those initiatives shows difficulties both in measuring compliance with the commitments taken and in reporting on their effectiveness.
