Publication 15 September 2022

Age assurance online: working towards a proportionate and European approach

AUTHORS

  • Jessica Galissaire, Studies Manager, Renaissance Numérique

  • Annabelle Richard, Associate Lawyer, Pinsent Masons

The legal provisions related to online age assurance are generally poorly complied with and are the subject of heated debate in France and Europe. Beyond its urgent nature and its omnipresence in the public debate, the issue of age assurance constitutes an interesting example of how public action and the various stakeholders involved deal with the presence of children online. In this publication, we explore the reasons why age assurance is poorly implemented at the moment and propose ways of making it more effective. More generally, this report puts forward recommendations on how to move towards an online childhood policy that respects the balance between the fundamental rights and freedoms of Internet users.

A protective legal framework, which establishes age assurance…

While children’s digital practices offer them immense opportunities to exercise their rights (right to education, information, freedom of expression, etc.), they can also expose them to risks: cyberbullying, online hate, grooming, exposure to illegal or harmful content, incitement to dangerous behaviour, addiction, exploitation of their personal data… For this reason, specific provisions for the digital environment, which introduce the need to check the age of Internet users, have been designed at both a European and national level.

Alexandra Mielle

Head of the "Audience Protection" department, French Regulatory Authority for Audiovisual and Digital Communication (Arcom)

“The media’s view of children’s digital practices generally focuses on the most dangerous aspects. We often analyse the worst of these practices without also exploring the positive aspects. There is therefore a real need for young people and adults (including teachers) to take ownership of these practices, which have multiplied, while developing effective protection methods.”

The General Data Protection Regulation (GDPR), for instance, states that children deserve specific protection with regard to the processing of their personal data. More specifically, Article 8(1) states that “the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child.” In France, Article 45 of the Data Protection Act (Loi informatique et libertés) supplements this provision by setting the age limit at 15 and by introducing the principle of “dual consent”: when the child is under fifteen years of age, the processing is only lawful if consent is given jointly by the child concerned and the person or persons who hold parental authority over the child.

GDPR, Article 8(1)

“The processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child.”

In addition, the revised Audiovisual Media Services Directive (the so-called AVMS Directive) introduces an obligation for Member States to take “appropriate measures to ensure that audiovisual media services […] which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see them” (Article 6a).

Other texts relating to the digital environment which will soon come into force or are being discussed at a European level, such as the Digital Services Act (DSA) and the proposed legislation on artificial intelligence (AI Act), include specific provisions for children, in particular concerning the prohibition of using their personal data for commercial purposes. In France, Article 227-24 of the Penal Code, Article 45 of the Data Protection Act (Loi informatique et libertés), Article 23 of the law aimed at protecting victims of domestic violence, the law on “child influencers”, the law aimed at reinforcing parental control over means of access to the Internet, and the law aimed at combating school harassment complement this international and European legal arsenal.

Michael Murray

Head of Regulatory Strategy, Information Commissioner's Office (ICO)

“There is a concern that age assurance not done well will end up being the next "cookie issue", resulting in friction that could lead to people not using the services.”

…but whose implementation is unsatisfactory

Given the national, European, and international legal framework for the protection of children in the digital environment, cyberspace is not a lawless space for this audience. However, legal provisions related to online age assurance, in particular Article 8(1) of the GDPR and Article 227-24 of the French Penal Code, are generally poorly complied with. The issue at stake here is thus one of insufficient supervision and enforcement.

We identify three major obstacles to the effectiveness of existing measures:

  • the delicate balance between the protection of children online and other rights, such as the right to privacy;
  • certain stakeholders’ economic objectives;
  • and the relative lack of homogeneity of the legal framework in the European Union Member States, which makes compliance difficult.

In addition, some of the technical solutions used for age assurance are particularly intrusive and may lead to an imbalance in the guarantee of fundamental rights and freedoms.

Implementing a common framework of requirements at European level

To overcome these obstacles, we call for the implementation of a common framework of requirements at European level. The concept of proportionality and the accountability of online service providers are at the heart of this approach, which rests on three pillars:

RECOMMENDATIONS

#1 - Implementing a common code of conduct at European level

The minimum conditions that ensure age assurance is done effectively and in a way that is compatible with our fundamental rights must be specified and harmonised at European level. We encourage the European Commission, the Member States and all relevant stakeholders to explore the possibility of a binding code of conduct.

#2 - Conducting impact assessments rather than risk assessments

Providers of online services that may be accessed by minors cannot limit themselves to carrying out risk assessments. Impact assessments have the advantage of encompassing both the opportunities and the risks minors may encounter online, as well as other key variables: the impact of possible measures on other users, how easily the measures under consideration can be circumvented, the costs for the actors that must implement them, etc. This shift must also be embraced by the authorities responsible for supervising these services; there is therefore an urgent need to strengthen the human and financial resources available to them.

#3 - Imposing “strict” age assurance where legal provisions to restrict or prohibit access do exist

We recommend imposing “strict” age assurance (i.e. verification) where legal provisions to restrict or prohibit access do exist. For operators providing such products or services (pornographic content, online betting, sale of alcohol, etc.), the need for age assurance is all the more critical as it determines whether or not an individual has the right to access the product or service in question. However, such a measure requires age verification tools that are effective, not overly intrusive, accessible to all, and respectful of the balance between fundamental rights and freedoms. The work currently underway at French and European level to develop solutions that meet these requirements must therefore be supported.
