What is the age-appropriate design code – and how is it changing the internet?
Data drives the digital world. Every time you open an app, buy from a website or use a search engine, data is gathered about who you are, where you are and what you’re doing. Platforms and services use this data to shape users’ experiences online – from the notifications you are sent to the adverts you see. But what about children’s data? How should this be collected and used – if at all?
New UK guidance will require a sea change in how children's data is handled and protected online.
The age-appropriate design code, informally known as the Children’s Code, will require “information society services” such as websites, apps, online games and online video platforms to “put the best interests of the child first” when designing and developing their services.
What is the code, and why is it needed?
Children need specific safeguards online, just as they do in other areas of their lives. But according to the Information Commissioner’s Office (ICO), children are “using an internet that was not designed for them”.
To this end, the code introduces 15 “standards” that companies should adhere to in order to create “a safe space for [children] to learn, explore and play… not by seeking to protect children from the digital world, but by protecting them within it.”
The “standards” include things like:
- Switching off geolocation services by default.
- Ensuring children are aware when they are being monitored by built-in parental controls.
- Conducting data protection impact assessments on any data processing, accounting for differing ages and developmental needs.
- Not using nudge techniques to encourage children to provide unnecessary personal data, or weaken or turn off their privacy settings.
Companies that do not conform to the code could be fined under GDPR legislation.
Who – and what – will it apply to?
The code applies to anything online that may be accessed by someone under the age of 18 – even if not specifically designed for them, and even if the business or platform is based outside of the UK. It comes into force in September 2021 after a 12-month transition period.
How will it be implemented?
Tracking, collecting and selling users’ data is fundamental to the way the majority of tech businesses operate – but this model will no longer be acceptable for under-18s.
The code does not set out rules on how companies should implement its standards.
Businesses could, for example, decide to block under-18s from accessing certain content. But locking young people out of large parts of the internet is not necessarily the answer. This prevents them from learning through exploration, developing their understanding of risk and building their critical awareness of what is safe and unsafe online – in other words, building digital resilience.
Another potential approach is self-certification: asking users to declare their own age. For example, many social media platforms comply with COPPA in the US by stating that their services are for users aged 13 and above. We know these age limits are open to abuse – young people can, and do, find ways around them. And there is a risk that, driven off the adult internet, children and young people will find ways of getting online that are less safe.
Businesses may decide to introduce technical methods to verify users’ age – but this could actually involve collecting more sensitive data, rather than less. Users would need to provide information such as a real-time photograph, a passport or other form of personal identification to prove their age. Safety tech is a rapidly developing field and may prove effective, but it's largely untried – and parents may object to this kind of data being collected from their children.
What happens next?
We support the code’s broad aim of safeguarding children's data. Facebook, YouTube and TikTok have already shared updates prompted by the code, covering everything from default private accounts to limits on advertising and time cut-offs for push notifications. However, until questions around age verification are answered, much remains to be resolved as to how the code will work in practice.
The ICO says the code “will lead to changes that will help empower both adults and children”. But locking children out of online spaces is not the best way to empower them online.
Relying too heavily on the code to regulate children’s online experiences could begin to supplant parental involvement, encouragement and support. Similarly, parents may feel undermined by settings that automatically notify children when parental controls are turned on, or when their movements are being tracked.
“Safety tech” may sound promising, but will need to be sophisticated to succeed, not least because every child is different and sweeping categories based on age and developmental stage can be crude in practice.
Increasing the responsibility on companies to safeguard children online – while well-intentioned – must not undermine parental involvement. Children need to develop the skills and critical thinking to identify and avoid inappropriate content or potentially risky situations, and know how to respond if they do. For parents, the priority should be to talk to their child regularly about what they are doing online, and to let them know they can always come to them if something concerns them.