Publication 3 July 2020

Moderating our (dis)content: renewing the regulatory approach

AUTHOR

  • Claire Pershan, Project Manager, Renaissance Numérique

As online platforms become central to our democracies, the spread of toxic content on them threatens the free flow of information and the enjoyment of fundamental rights. To be effective, the policy response to this phenomenon must not ignore the specificities of the different platform operators, nor the interconnected character of content moderation. Renaissance Numérique urges European regulators and legislators to take the full range of platforms and moderation approaches into account in forthcoming regulation.

Considering moderation beyond the simple removal of toxic content

The French Constitutional Council's decision of 18 June 2020 on the law to combat online hate reminds us of the risks that disproportionate measures pose to our fundamental rights. At the same time, the debate on the Digital Services Act is opening. In this context, we call for regulation that takes a comprehensive approach to moderation processes. The regulation of online content must consider moderation in a broad sense, as a set of decisions and processes operating across all platforms, rather than as the mere removal of individual pieces of content.

In addition, one of the principal issues at hand is to move beyond the concept of user thresholds (that is, the number of users a platform has per country): this metric is ill-suited to the task. By itself, it says nothing about the moderation challenges a platform actually faces. Regulation must therefore include agile indicators that measure platforms' responsiveness to moderation challenges, which are constantly evolving.

Including all platforms in the discussions about moderation processes

Today, co-regulation remains bilateral: the process is largely confined to governments and the major platform operators. Yet future regulation in this area, in particular the European Digital Services Act, should not be shaped solely by and for the dominant platform operators.

If regulation introduces general obligations tailored to the biggest global internet actors, these measures will have negative and disproportionate effects on other actors. Ultimately, this would weaken platform diversity.

RECOMMENDATION

Public authorities must ensure that all platform operators are consulted and taken into account in the drafting of regulation, notably those with fewer resources to invest in this area.

Platform governance must integrate user contributions

For platforms, content moderation requires striking the right balance and defining the right processes, hand in hand with public authorities, civil society and end users. The notion of value co-creation is inherent to platforms that host content generated by their users.

This substantial input from end users should be reflected in platforms' moderation efforts, for example through a collaborative approach to moderation. This, however, requires establishing genuine dialogue between platforms and their users; it cannot be reduced to the outsourcing of moderation tasks.

