
Content moderation on online platforms (Nr. 515)

In this discussion paper, we examine the complex process of content moderation on online platforms, considering the incentive structures of platform providers, the limits of automated moderation, and the regulatory context.

In light of the adoption of the Digital Services Act (DSA) in 2022, which establishes a new horizontal legal framework for regulating digital services, this discussion paper analyses the moderation of content on online platforms. Content moderation is a complex and challenging task that is shaped by several factors:

  1. the incentives of the platform providers
  2. the capabilities of the moderation process, including the automated systems and human moderators involved (a sketch of this interplay follows the list), and
  3. the regulatory requirements.
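
Conceptually, the interplay of automated systems and human moderators named in the second factor can be pictured as a threshold-based pipeline: an automated classifier handles the clear cases, and uncertain cases are escalated to human review. The Python sketch below is purely illustrative and not taken from the paper; the classifier interface, field names, and thresholds are assumptions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable


class Decision(Enum):
    KEEP = "keep"
    REMOVE = "remove"
    HUMAN_REVIEW = "human_review"


@dataclass
class ModerationResult:
    decision: Decision
    score: float      # hypothetical probability that the content violates policy
    automated: bool   # True if no human moderator was involved


def moderate(text: str,
             classify: Callable[[str], float],
             remove_above: float = 0.95,
             keep_below: float = 0.10) -> ModerationResult:
    """Route content based on an automated score; escalate uncertain cases.

    `classify` stands in for any automated moderation model; the thresholds
    are illustrative and would in practice reflect the platform's policy.
    """
    score = classify(text)
    if score >= remove_above:
        return ModerationResult(Decision.REMOVE, score, automated=True)
    if score <= keep_below:
        return ModerationResult(Decision.KEEP, score, automated=True)
    # Ambiguous scores are queued for a human moderator.
    return ModerationResult(Decision.HUMAN_REVIEW, score, automated=False)
```

Where the thresholds sit determines how much of the moderation burden is automated and how many errors of each kind slip through, which is exactly where platform incentives and moderation capabilities interact.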

A major problem is that today's moderation processes are neither error-free nor transparent. This combination of error-proneness and opacity makes it difficult to understand moderation decisions and to determine whether the purely technical moderation process, the moderation policy of the platform provider, or a combination of the two, which can be mutually dependent, is responsible for a given decision. This raises the question of whether the resulting risks for society and for individuals need to be mitigated and where the corresponding responsibilities lie. The problem is particularly acute for very large platforms: they reach a large number of consumers, and both false negative decisions (violating content that stays online) and false positive decisions (legitimate content that is removed) can cause serious individual or societal harm.
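
The notions of false positive and false negative decisions can be made concrete with a small, hypothetical calculation. The sketch below assumes parallel lists of automated decisions and ground-truth labels; it illustrates the two error types and is not an evaluation method taken from the paper or the DSA.

```python
def error_rates(decisions: list[bool], ground_truth: list[bool]) -> tuple[float, float]:
    """Compute false positive and false negative rates for a moderation run.

    True means "violating content" in both lists; the labels are hypothetical.
    """
    fp = sum(d and not g for d, g in zip(decisions, ground_truth))  # legitimate content removed
    fn = sum(g and not d for d, g in zip(decisions, ground_truth))  # violating content kept
    negatives = sum(not g for g in ground_truth)
    positives = sum(g for g in ground_truth)
    fpr = fp / negatives if negatives else 0.0
    fnr = fn / positives if positives else 0.0
    return fpr, fnr
```

For a very large platform, even small rates translate into a large absolute number of wrongly removed or wrongly retained items, which is why the scale argument above matters.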

The DSA addresses this in particular through numerous transparency obligations, which can make moderation processes more comprehensible.
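
As one illustration of what such a transparency obligation involves, the sketch below models the kind of information a statement of reasons under Article 17 DSA covers when a platform restricts content. The field names and types are assumptions chosen for readability, not the wording of the legal text.

```python
from dataclasses import dataclass


@dataclass
class StatementOfReasons:
    """Illustrative record of the information a statement of reasons conveys."""
    content_id: str
    restriction: str              # e.g. removal, visibility restriction, account suspension
    facts_and_circumstances: str  # what the decision was based on
    automated_detection: bool     # whether automated means were used to detect the content
    automated_decision: bool      # whether the decision itself was taken by automated means
    ground: str                   # the legal provision or terms-of-service clause relied on
    redress_options: str          # internal complaints, out-of-court dispute settlement, courts
```

Records of this kind are what make it possible, at least in principle, to trace whether a decision stems from the technical process, the platform's policy, or both.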