
Platform Regulation

At a Glance

As part of the European Digital Strategy, the European Commission has announced a Digital Services Act (DSA) that aims to reinforce the Single Market for digital services and to promote innovation and competitiveness in the EU digital economy. The Commission's objective is to enhance the responsibilities of all online platforms with regard to illegal content. The EU will also look at creating specific “ex ante” rules for platforms with a dominant market position (a.k.a. “gatekeepers”) under a new Digital Markets Act (DMA).

Issue in Detail

The e-Commerce Directive (ECD), adopted in 2000, sets out the basic legal framework for information society services and provides the backbone for a dynamic European digital single market, essential for eBay users to thrive. The Directive recognizes the opportunities e-commerce provides for consumers and businesses, especially SMEs, and it continues to play an essential role in the growth and continued evolution of the European online economy.

Today, eBay welcomes the Commission's intention to adopt a Digital Services Act that would deepen the internal market and clarify responsibilities for digital services. On this occasion, eBay would like to share proposals aimed at making the future Digital Services Act a key tool that empowers European citizens and businesses, through the online platforms they use every day, to reach the potential offered by the Single Market while guaranteeing their rights and competitiveness.

Our Recommendations in Detail

1. Protect SME independence by maintaining a regulatory distinction between their responsibilities and those of online marketplaces

Trust is fundamental to achieving a safe environment for both online marketplaces and their users; the presence of illegal content online undermines that trust, and online platforms therefore combat it vigorously. To that end, they have adopted a variety of content moderation practices based on platform-specific aspects such as the type of content being moderated (e.g. product listings vs. hate speech), their own rules and policies, internal handling processes, and the resources they invest. Because these aspects vary from intermediary to intermediary, the good functioning and efficiency of content moderation processes require that platforms retain flexibility in their moderation, building on these specific aspects, while certain minimum requirements are imposed.

At the same time, there is a need to ensure that small businesses can participate in the online commerce market in true independence and with a fair relationship to the platforms they use to reach customers. This is achieved by maintaining a regulatory distinction between the responsibilities of marketplaces and of the retailers that use them. In any other scenario, marketplaces would need to exert tight directional and physical control over the business operations of small retailers using their services, effectively turning those SMEs into mere suppliers without operational independence. We therefore strongly recommend that the hosting status of online platforms, and the liability regime attached to it, be preserved. This gives platforms the legal certainty to continue efforts to keep the marketplace safe for all users, without being pushed toward over-blocking to avoid liability for goods and services over which they have no direct control (unlike retail platforms).

As a result, we believe the DSA should clarify that:

  1. Internet service providers that take action in good faith in relation to content on their platforms do not necessarily have "actual knowledge" or "control" under Articles 14(1) and (2) of the ECD.
  2. Such good-faith service providers are protected from liability to users for claims based on the removal of content suspected of being illegal.
  3. "Actual knowledge" is triggered when a natural or legal person has made the provider aware of such content, not by a filter or other automated tool.
  4. Actual knowledge should be triggered when an online service provider receives specific and clearly identifiable information, meaning a notification that includes the following (a minimal sketch of such a notice as a data record follows this list):
    • Clear identification of the complainant (name, address), who must also be in a legitimate position to send a notice (e.g. the owner of IP rights in the item, or a market surveillance authority for unsafe products).
    • A notice made in writing with adequate information on the material alleged to be infringing, including its precise location (e.g. a URL to the listing on eBay).
    • Submission via an email address or other secure channel reserved for this purpose by the platform.
    • Details, including the legal grounds, demonstrating the unlawful nature of the content in question.
    • Except in certain high-risk cases, proof that the complainant could not reach the content’s author or editor.
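
As a minimal sketch, assuming hypothetical field names and a hypothetical completeness check rather than any prescribed schema, such a notice could be modeled as follows:

```typescript
// Hypothetical data model for a notice; field names are illustrative
// assumptions, not a DSA-prescribed or eBay-specific schema.
interface Notice {
  complainant: {
    name: string;
    address: string;
    // Legitimate standing to notify, e.g. rights owner or authority.
    standing: "ip_rights_owner" | "market_surveillance_authority";
  };
  targetUrl: string;            // precise location, e.g. a URL to the listing
  receivedVia: "dedicated_email" | "secure_form"; // channel reserved by the platform
  legalGrounds: string;         // details demonstrating the unlawful nature
  contactAttemptProof?: string; // may be omitted in certain high-risk cases
  highRisk: boolean;
}

// Only a complete notice would trigger "actual knowledge".
function isActionable(n: Notice): boolean {
  const hasContactProof = n.highRisk || Boolean(n.contactAttemptProof);
  return (
    n.complainant.name.length > 0 &&
    n.complainant.address.length > 0 &&
    n.targetUrl.length > 0 &&
    n.legalGrounds.length > 0 &&
    hasContactProof
  );
}
```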


2. Preserve platform users’ rights by introducing fair content moderation and appeal procedures that do not mandate the use of automated tools

Marketplaces have a longstanding record of cooperating with regulators, rights owners and other stakeholders to proactively address illegal content and provide a safe environment for customers to trade in. The Digital Services Act should establish a robust horizontal framework that tackles illegal content by acknowledging the responsibility of all actors participating in the hosting ecosystem.

First, the Digital Services Act must focus its rules on the moderation of illegal, rather than harmful, content.

Due to varied interpretations, the same type of content may be considered illegal, legal but harmful, or legal and not harmful across different Member States. A clear definition of illegal content in both EU and national law would allow for more rapid and effective action. Because the management of harmful content or activity requires nuance, a specific focus on the management of illegal content and activity at EU level would help avoid breaches of fundamental rights in more context-specific cases. Finally, the specificity of IP infringements must be recognized, in that they usually require the detailed expertise of a third party, the rights owner, to be detected.

Second, requirements placed on marketplaces must be principles-based and adapted to the nature and scale of each platform.

To ensure fair and equal treatment of all intermediaries from a competition standpoint, as well as greater efficiency in identifying unsafe products, European policymakers must not oblige platforms to use any specific technology or amount of human resources. Rather, marketplaces should be required to follow minimal principles-based requirements, such as proactively consulting information on recalled products available through the Rapid Alert System (RAPEX) and maintaining an internal notice-and-takedown procedure proven to meet certain efficiency standards (see below).
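
Purely as an illustration of what such a proactive recall check might look like, the sketch below matches listing titles against a locally cached set of recall records; the record shapes and the matching rule are assumptions made for illustration and do not reflect the actual RAPEX data format or any eBay system.

```typescript
// Illustrative shapes only; the real RAPEX data format differs.
interface RecallRecord { alertId: string; brand: string; productName: string; }
interface Listing { id: string; title: string; }

// Flag listings whose titles mention both the brand and the product
// named in a recall record.
function flagRecalledListings(
  listings: Listing[],
  recalls: RecallRecord[],
): Map<string, string> {
  const flagged = new Map<string, string>(); // listing id -> alert id
  for (const listing of listings) {
    const title = listing.title.toLowerCase();
    for (const recall of recalls) {
      if (
        title.includes(recall.brand.toLowerCase()) &&
        title.includes(recall.productName.toLowerCase())
      ) {
        flagged.set(listing.id, recall.alertId);
      }
    }
  }
  return flagged;
}
```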

Third, clear deadlines must be set for a platform to act on a notice before it can be held liable.

The Digital Services Act offers an opportunity to set deadlines for acting on a notice that are both precise and reasonable. First, ascertaining the legitimacy of certain notices may require additional information. A platform's action on a notice must therefore not be equated with removal of the content; it should encompass any response the platform gives on the content or provides to the notice issuer. Second, an EU-harmonized approach must be adopted that matches deadlines to categories of infringement according to their level of risk. Existing initiatives should serve as a reference, such as the Product Safety Pledge's two working days for online marketplaces to act against unsafe products.
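
To illustrate how such risk-tiered deadlines might be expressed, the sketch below maps infringement categories to response deadlines in working days. Only the two-working-day figure for unsafe products comes from the Product Safety Pledge; the other categories and values are assumptions made for illustration.

```typescript
// Deadlines in working days per infringement category. Only the
// "unsafe_product" value comes from the Product Safety Pledge; the
// other categories and values are illustrative assumptions.
const responseDeadlines: Record<string, number> = {
  unsafe_product: 2,        // Product Safety Pledge: two working days
  ip_infringement: 5,       // assumed: rights-owner input is often needed
  other_illegal_content: 7, // assumed default tier
};

// "Acting" may mean removal or any substantiated response to the issuer.
function deadlineFor(category: string): number {
  return responseDeadlines[category] ?? responseDeadlines["other_illegal_content"];
}
```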

Fourth, counter-notice rights and procedures must be created.

Counter-notices are essential to safeguard the rights of platform users, as they provide a right of reply to an alleged violation of policies or regulation. Online platforms are often not in a position to decide on the legality of certain content, and the facts are regularly disputed by the parties involved. A provision must therefore be introduced that exempts the online intermediary from liability for a piece of content taken down or kept online when a counter-notice has been issued. The dispute must be resolved first, if necessary with the help of a court decision, before the platform can be held responsible for any action taken with respect to the content.
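
A minimal sketch of how this liability pause could be modeled, assuming hypothetical state names rather than any defined legal terms:

```typescript
// Hypothetical dispute states; the names are not defined legal terms.
type DisputeState =
  | "notice_actioned"  // content taken down or kept online after a notice
  | "counter_noticed"  // the affected user has disputed the decision
  | "resolved";        // settled by the parties or, if necessary, by a court

// While a counter-notice is pending, the platform should not be liable
// for its interim decision; responsibility can attach only once the
// dispute is resolved.
function platformLiableForAction(state: DisputeState): boolean {
  return state === "resolved";
}
```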


3. Reduce fragmentation of the current regulatory framework for online services in Europe

We have observed significant fragmentation among EU Member States in the implementation and enforcement of the e-Commerce Directive and its principles. The European Court of Justice (ECJ) has confirmed in a number of landmark rulings the general principle of protection against liability and the prohibition of monitoring obligations for marketplaces. The Digital Services Act provides an opportunity to reinforce the main aspects of the ECD by codifying the ECJ case law deriving from the Directive, and notably to introduce select amendments clarifying that its Recital 42 does not apply to hosting service providers as designated by Article 14, as long as they have neither “control” over the service recipient nor “actual knowledge” of specific instances of illegal activity or information. In that manner, actions taken in good faith by online intermediaries to fight illegal content, such as filters and customer due diligence, will be further incentivized, as will the provision of an adequate level of information in notices (see above) by their issuers.


4. Ensure harmonized enforcement of the new rules and adoption of sanctions proportionate to harm caused

Any attempt to grant enforcement powers to oversight authorities should ensure that sanctions are imposed only where there is a systemic failure to comply. The concept of "systemic failure" should be well defined, and sanctioning should take into account several aspects, such as:

  • Gravity and nature of the infringement: sanctions should take into account the circumstances of the infringement, such as its nature, the manner in which it occurred and its cause, the number of individuals affected and the damage suffered.
  • Intention: whether the infringement was intentional or the result of negligence.
  • Mitigation: whether actions have been taken by the hosting provider to mitigate the damage.
  • Precautionary measures: the measures undertaken by the hosting provider to comply with technical and organizational requirements.
  • History: any relevant previous infringements which showcase that the hosting provider has a history of systemic non-compliance.
  • Cooperation: the actions undertaken by the hosting provider to cooperate with the relevant authorities to identify and remedy the infringement.
  • Aggravating/mitigating factors: any other issues arising from circumstances of the case, such as misleading notifications that do not include the necessary information for identifying the listing.

In addition, existing national regulatory structures are often not well equipped by Member States. Market surveillance authorities, which are in charge of monitoring and testing the safety of products, have historically been underfunded, limiting their capacity to monitor unsafe products; this undermines the single market for goods and the European economy, as compliance with standards and requirements is not efficiently monitored. These authorities are also limited by national boundaries, and coordination between them has proven minimal. We therefore welcome the Commission's intention to reinforce cross-border cooperation in this regard.


5. Speed up action against certain anti-competitive practices by introducing efficient ex-ante rules for gatekeeper platforms

The emergence of online intermediaries, together with the increased availability of information and data, has led to increased consumer choice, better access to information and new economic opportunities. Many of these recent digital innovations have significantly benefited European citizens.

Perhaps the most momentous recent development in the digital economy is the digital ecosystem platform. An ecosystem company usually operates, around a dominant core platform, a variety of complementary, interlocking businesses and services that often span several markets while being technically (and financially) integrated with the core platform. In light of this, discussions have emerged regarding the need for regulation to complement competition law.

We therefore welcome the opportunity to stress that current competition law and enforcement should be supplemented by new legislation: as the European Commission has rightly identified, the inherent conflict of interest within large digital ecosystem platforms invites practices that suppress fair competition in Europe. The objective of such regulation should be to intervene more rapidly than current (ex post) enforcement allows. The public policy objective is clear: to avoid a future in which a handful of “digital conglomerates” dominate most markets and competition exists only between these ecosystems.
