Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC

Digital Services Act (DSA)
Document type: Legislation
Daphne Keller, Joan Barata

The DSA represents an overhaul of EU law governing intermediaries’ handling of user content. It builds on the pre-existing eCommerce Directive from 2000, and preserves key ideas and legal structures from that law. 

The DSA applies to numerous Internet intermediary services. It provides both immunities and obligations. Many of its specific rules apply only to services in specific categories (access, caching, hosting, and marketplace providers, for example). The DSA asserts significant jurisdiction over companies based outside the EU. It reaches services “directed” to EU Member States. It allows enforcers to assess extremely steep fines, in principle reaching up to 6% of annual global turnover. It also sets up major new regulatory powers within the European Commission.

Obligations that relate to the DSA’s core concern with content moderation include:

  • Responding to government-issued orders to remove content or disclose user data 
  • Including detailed information about content moderation in Terms of Service, and notifying users of changes (that last part could become burdensome for platforms, and annoying for users, if it is not interpreted flexibly). 
  • Publishing transparency reports 
  • Building mechanisms for users to notify platforms about prohibited content. The DSA does not prescribe turnaround times for responding to notices. However, despite significant civil society opposition, the final draft apparently suggests a period of 24 hours for illegal hate speech.
  • Notifying users when their content has been taken down or otherwise acted against, providing an internal appeal mechanism, and engaging with and paying for alternative dispute resolution (!!) if users disagree with outcomes. Platforms must also suspend users who repeatedly violate the rules or file abusive takedown requests. 
  • Engaging with and creating special channels for government-approved “trusted flaggers” in each EU country to notify platforms of prohibited content. 
  • Notifying law enforcement of suspicion of serious crimes 
  • For marketplaces, extensive new duties to vet vendors and provide information to users. 
  • Controversial “crisis protocols,” allowing EU Commission officials to require content removal under to-be-determined rules in crisis situations. These are primarily for VLOPs, but seemingly may also apply to smaller platforms. 
  • Having a point of contact and a legal representative in Europe

The DSA also speaks to some ongoing tensions in intermediary liability law by reiterating the EU’s longstanding (but evolving) prohibition on “general” monitoring obligations, and specifying that platforms’ voluntary efforts to find and remove illegal content should not cause them to lose immunity. 

A few of the DSA’s new obligations, including some added later in the legislative process, are less directly tied to content moderation. Some of these are less clearly prescriptive, and will likely require more legal judgment calls in interpretation.

  • Providing users with information, accessible directly from the UI, about ads and recommendations 
  • For porn sites/apps, additional content moderation obligations
  • Designing interfaces to avoid “dark pattern” nudges that shape user behavior or choices
  • No targeted ads to known minors, or to anyone based on sensitive personal information such as health or sexual orientation unless the user affirmatively provided that information. (This was negotiated to the last minute, is a big deal, in many ways has more in common with GDPR than the DSA, and is 100% worthy of real legal consultation for any affected business.) 

“Very Large Online Platforms,” or VLOPs, which have at least 45 million monthly active users in the EU, have additional obligations. These are in many cases less prescriptive, and more about creating institutional governance systems to handle evolving risks.

VLOPs are responsible for:

  • Annual formal risk assessments, risk mitigation plans, and third-party audits, with resulting reporting to regulators
  • Ongoing engagement with regulators, who may effectively shape platform practices through feedback on risk mitigation plans; through facilitating the development of “voluntary” industry standards and codes of conduct; and through various enforcement powers, including investigations, requests for information, on-site inspections, the imposition of “interim measures,” and orders that platforms “take the necessary measures to ensure compliance” with Commission requirements under the DSA.
  • Appointing compliance officers with special obligations and resources.
  • Paying a yearly supervisory fee, calculated as a share of their global revenues, to fund the new regulator.
  • Providing vetted researchers and regulators with access to internal data.
  • Labeling deep fakes.
  • Removing content in emergencies in compliance with crisis protocols.
  • Some other important obligations that are analogous to, but more expansive than, related obligations in earlier, non-VLOP-specific parts of the DSA:
      • Explaining recommender systems and allowing users to choose non-personalized versions.
      • Maintaining public repositories of information about ads.
      • Publishing transparency reports more frequently and with some additional information about risk assessment, mitigation, and audits.

Some of these obligations also apply to “Very Large Online Search Engines.”

 

Topic, claim, or defense: General or Non-Specified; Freedom of Expression; Jurisdiction
Document type: Legislation
Issuing entity: Legislative Branch
Type of service provider: Host (Including Social Networks); Web Host (Technical Hosting); Search Engine or Index; Internet Access Provider (Including Mobile); DNS Provider; Cache Provider; P2P; App; Marketplace; Advertising
Issues addressed: Notice Formalities; Trigger for OSP obligations; Procedural Protections for Users and Publishers; Transparency; Limitation on Scope of Compliance (Geographic, Temporal, etc.)
OSP obligation considered: Block or Remove; Monitor or Filter; Account Termination; Data Retention or Disclosure
General effect on immunity: Mixed/Neutral/Unclear
General intermediary liability model: Takedown/Act Upon Knowledge (Includes Notice and Takedown)