BRUSSELS – Today, global tech trade association ITI submitted recommendations to help inform the European Commission’s approach to the Digital Services Act (DSA). In the recommendations, issued ahead of the EC’s public consultation on the DSA, ITI underscores its support for the measure’s goals of increasing legal certainty, clarifying roles, and defining responsibilities for actors in the online context. The recommendations also outline steps to develop a framework that preserves the current balance of intermediary liability rules and the rights of third parties for a healthy online ecosystem.

“The online platform ecosystem plays a foundational role in driving innovation and growth in the economy, supporting the smooth operation of supply chains and creating market opportunities for businesses of all sizes,” said Guido Lobrano, ITI’s Vice President of Policy, Europe. “Policymakers around the world are grappling with real challenges caused by the scale, speed, and complexity of various types of platforms, the roles they play regarding content and activities online, and their ability to shape public opinion. We recognise the shared responsibility to maintain a safe, inclusive, and innovative online environment. All relevant players need to work together to ensure that the internet has sufficient protections for users as well as smaller businesses and brands. Any new regime should allow companies to develop effective and responsible systems of content oversight, tailored towards their specific risk, exposure, and technical capabilities. We support the European Commission’s thoughtful acknowledgment of the importance of existing legal principles for the economy at large. We welcome plans to gather robust stakeholder input and develop well-tailored solutions for specific, well-defined challenges.”

ITI makes the following recommendations in its paper:

  • The differentiation between illegal and harmful content needs to be maintained. Regulatory efforts should focus on illegal content as defined by existing law, including both civil and criminal infringements, with no distinction made in the application of the liability rules. Harmful, but not illegal, content should continue to be addressed separately through voluntary or co-regulatory approaches.
  • Internet platforms are transversal actors across the global economy and are supported by complex supply chains. Any future initiative on oversight of illegal or harmful online content should focus on a company’s role and its interaction with the content so as to identify the actors best placed to take action – differentiating where services may have the ability or right (contractual, legal or otherwise) to edit, moderate, or manage content versus where they have technical control but often no access or capability to alter or remove the data.
  • As it seeks to improve incentives for companies to take proactive steps to create a safer online environment, the EU should preserve features of the E-Commerce Directive (ECD) proven to work, including the liability provisions and the country-of-origin principle (also referred to as the Internal Market clause), which ensures that online service providers are subject to the law of the Member State in which they are established and not the various legal systems of the Member States where the service is accessible.
  • Legal fragmentation in the European Single Market needs to be avoided. National governments have surged ahead with their own legislative approaches to online content oversight. Further, new collaborative economy services are reluctant to enter many European markets due to diverging national and at times even regional or local rules. Legal fragmentation hinders the ability of start-ups to scale up and compete globally.
  • Creating a framework that ensures the removal of illegal content and encourages good-faith action requires partnership between stakeholders. Tackling the proliferation of illegal content must be a shared responsibility between platforms, authorities, and users.
  • Notice-and-takedown (N&T) models should include incentivising provisions. An effective N&T process should remain an important part of the new regime. Whilst platforms with effective control have the responsibility to make N&T processes efficient, accessible, and transparent, notifiers must also use them responsibly.
  • Oversight of systems and processes can complement liability provisions. Companies need clear rules and responsibilities that do not disincentivise their proactive actions to limit distribution of illegal content online. To most effectively achieve the goal of removing illegal content or activities without delay, any new regime should allow companies to develop effective and responsible systems of content oversight, tailored towards their specific risk, exposure, and technical capabilities.
  • Companies have an interest in maintaining the trust of their stakeholders. To foster that trust, it is in Internet companies’ interest to provide information about their content moderation tools and measures to users and governments in a transparent manner.
  • The EU can play a central role in global policy leadership on the framework for digital services. Beyond the EU level, a pragmatic approach to content oversight and intermediary liability that allows for greater global regulatory convergence would help protect citizens around the world more evenly, while allowing companies to deploy consistent measures against these challenges worldwide.

Read ITI’s full submission here.