News 12 April 2022
Actors’ views: Michael Murray
The Age Appropriate Design Code (or “Children’s Code”) sets out data protection standards that online services must meet in order to protect their young users. The Code’s one-year transition period ended in September 2021, and online services are now expected to conform with it. Is that the case, and how is the ICO overseeing the implementation of the Code?
The Children’s code came into force on 2 September 2021, meaning the Information Commissioner must take the code into account when considering whether an online service has complied with its data protection obligations. The courts must also take the provisions of the code into account, where relevant, from this date.
Since moving into its supervision phase six months ago, the Children’s code has helped companies shape their online services to better protect children’s data. Research conducted by the Information Commissioner’s Office (ICO) shows that 2 in 5 companies have already made changes to how they process children’s data, and how they design their services to meet children’s needs. Our surveys have found that many companies are in the process of reviewing the risks to children, revising and redrafting their privacy information and developing their data protection impact assessments.
In the months after the code entered its supervision phase late last year, we reviewed online service sectors to identify services to prioritise for proactive engagement. Our research included risk assessments and identifying which services were most widely used by children, to establish which were most likely to cause them harm. The sectors identified for proactive supervision were Games, Social Media and Video/Music Streaming platforms.
The ICO contacted 40 online services in our priority sectors to establish what steps they had taken to implement the code, including questions on whether privacy notices had been adjusted for ease of understanding, how community standards were being upheld, and what measures services had put in place to protect children from profiling. Of the initial 40 companies, about a quarter were identified as generally having good practice, about 42% had some areas of concern, and about one in five showed more significant conformance issues. We are still awaiting detailed responses from a few companies. We are continuing to work with companies that have conformance issues to support them in improving their practice. Where especially egregious and persistent issues are identified, the ICO can apply the full range of its regulatory powers.
In addition to this close supervision work, the ICO is offering voluntary audits to companies interested in testing the systems and improvements they have put in place to conform with the code. We have completed a few audits with games companies to date. Further engagement with large social media companies is managed by our Digital Economy team.
The ICO remains receptive to complaints from the public about non-conformance with the Children’s code, and we will continue to examine code conformance where data breaches involve children’s data.
Does the ICO encounter any difficulties in overseeing the implementation of the code?
Our findings so far demonstrate that many organisations have gone some way towards conforming with the Children’s code, but there is room for improvement. We expect organisations to be able to demonstrate that improvement and the steps they are putting in place to protect children’s privacy and data. Our finding that over 60% of organisations needed to do more to improve their Data Protection Act / UK GDPR compliance in light of the code suggests that industry still has some way to go in applying the 15 standards of the code.
We are still early in the supervision stage of the code, but the results of our supervision activity have given us a good understanding of how organisations are adopting the Children’s code into their business models.
It’s important to understand that we won’t be giving any organisation a “gold star” for conformance. The code will continuously need to be referred to and built into the design and management of online services, and the way in which they process children’s data.
Most of our stakeholders in the industry and civil society have been supportive of the code and its intentions. They recognise that children need enhanced protection online. Our main challenges to date are increasing awareness of the code amongst smaller companies, and learning how a wide range of online services process children’s data. This includes how data is shared across often complex and opaque online ecosystems.
The main lessons learned are that some actors in the industry don’t fully grasp that they fall within the scope of the code under its “likely to be accessed by children” guidance, that some companies have been slow to prepare for conformance, and that companies are having issues with age appropriate design and age assurance. Still, we are seeing good progress, and this is a big change for many in the industry. We have already seen some significant changes by “Big Tech” in response to the code.
In the social media sector, responses have shown that some security measures are in place to better address risks to children. Community standards and Terms of Service tend to be transparent, controls are in place to avoid potential physical harm to children, and reporting tools are available and prominent. However, there are areas the sector could improve on, including age assurance measures to enforce terms of service, the ongoing use of profiling for content and monetisation, and third-party data sharing that is not off by default.
In the streaming sector, the organisations we contacted demonstrated good practice in their adoption of privacy by design and in making transparent privacy information available to users. However, more work is needed to ensure that children’s profiles are set to private by default and to address the efficacy of parental controls.
The games sector has demonstrated good practice in having prominent and effective reporting tools for inappropriate comments or content, setting children’s profiles to high privacy by default, and ensuring that geolocation is switched off by default. There are, however, areas for improvement, including the efficacy of parental controls, especially for young people in the 13-17 age bracket; age verification, which tends to be a simple self-declaration; and transparency information, which needs to be adapted to suit the reading age of the children using the services.
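To make the defaults mentioned above concrete, here is a minimal sketch of what “high privacy by default” might look like in code. It is an illustration only, not an ICO specification: the ProfileSettings shape, its field names and the createChildProfile function are hypothetical.

// A minimal sketch of "high privacy by default" for a child's profile.
// ProfileSettings and createChildProfile are hypothetical names; real
// services will have their own settings models.
interface ProfileSettings {
  profileVisibility: "private" | "public"; // who can see the profile
  geolocation: boolean;                    // location sharing on/off
  thirdPartyDataSharing: boolean;          // sharing data with third parties
  personalisedAdvertising: boolean;        // profiling for monetisation
}

// The most protective option applies unless a deliberate, age-appropriate
// choice relaxes it.
const childDefaults: ProfileSettings = {
  profileVisibility: "private",
  geolocation: false,
  thirdPartyDataSharing: false,
  personalisedAdvertising: false,
};

function createChildProfile(overrides: Partial<ProfileSettings> = {}): ProfileSettings {
  // Start from the protective defaults; any relaxation must be an explicit
  // choice rather than a silent opt-in.
  return { ...childDefaults, ...overrides };
}

The point of the pattern is that protection is the starting state: a child’s profile is private, geolocation is off and data sharing is off unless someone deliberately changes that.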
In France, part of the debate around the protection of children online focuses on age assurance and age gating, yet no satisfactory way forward seems to have been found so far. What is the ICO’s stance on these issues?
Standard 3 of the Children’s code states that online services likely to be accessed by children should take a risk-based approach to recognising the age of individual users and ensure they effectively apply the standards in this code to child users. This means that online services should either establish the age of their users with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from data processing, or apply the standards in this code to all users instead.
We do not want online services to simply age gate children out of the internet, except where the services offered are of a type that we would not want children to engage with under any circumstances. Where children are likely to access services, and those services can be made suitable for children, the best way to make the service age appropriate is to apply the standards of the code to all users. However, we recognise that this is not always possible, and some form of age assurance may be required to ensure that young people can access services that are appropriate for them, while protecting their data.
Our experience to date shows that the current paradigm is to use self-declaration as the primary tool for age assurance. In some cases, where risks to children are low, self-declaration is an acceptable form of age assurance. Where risks are higher, the ICO expects online services to employ more effective measures. We recognise that age assurance methods may still need further development to improve their efficacy, so we expect online services to identify in their data protection impact assessment (DPIA) what the risks to children’s data are and how they mitigate those risks. This is an area where we would like to see improvements in practice, so we are working with key stakeholders to improve our understanding of what is possible with current technology so that we can apply that learning to the supervision of the code. The ICO has published a Commissioner’s Opinion on Age Assurance that outlines our views on what online services should do to conform with the code.
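As an illustration of this risk-based approach, the sketch below maps a DPIA risk level to a proportionate age assurance method. The RiskLevel and AssuranceMethod types and the chooseAgeAssurance function are hypothetical names for this example; a real service would derive such a mapping from its own DPIA rather than hard-coding it.

// A minimal sketch of risk-based age assurance selection. All names are
// hypothetical; the mapping itself should come from the service's DPIA.
type RiskLevel = "low" | "medium" | "high";

type AssuranceMethod =
  | "apply-code-to-all"   // apply the code's standards to every user
  | "self-declaration"    // acceptable where risks to children are low
  | "age-estimation"      // stronger signals, e.g. estimation techniques
  | "hard-verification";  // verified age for the highest-risk processing

function chooseAgeAssurance(
  dpiaRisk: RiskLevel,
  canBeMadeSuitableForChildren: boolean,
): AssuranceMethod {
  // If the service can be made suitable for children, the simplest
  // conforming route is to apply the code's standards to all users.
  if (canBeMadeSuitableForChildren) {
    return "apply-code-to-all";
  }
  // Otherwise, the strength of the age assurance measure should be
  // proportionate to the risks identified in the DPIA.
  switch (dpiaRisk) {
    case "low":
      return "self-declaration";
    case "medium":
      return "age-estimation";
    case "high":
      return "hard-verification";
  }
}

The design point is proportionality: the measure only needs to be as strong as the risks the processing creates, which is why a hard-coded one-size-fits-all answer would miss the intent of the standard.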
The ICO recognises that age assurance systems are a developing technology, and that many organisations will want to adopt a global approach, rather than country specific approaches. We understand that initiatives such as the euCONSENT project and developing international standards will inform this work.
What do we know so far about the future UK data protection reform and how it may affect the work of the ICO?
It is right for the UK government to keep the legislative framework for personal data under review, considering how it can be improved while maintaining high standards. As the independent regulator, we are working closely with the government to offer insight and advice on how reforms could protect the information rights the UK public values while supporting economic growth, businesses and public services.
More detail can be found in the ICO’s response to the consultation launched by the Department for Digital, Culture, Media & Sport (DCMS). The government is currently analysing the feedback to the consultation, and we expect more detail on next steps in due course.