5. Responsible
Comply with legal frameworks, provide remedies as needed and conduct a Child Rights Impact Assessment
Responsible digital innovation means businesses should keep up with ethical, rights-based and legal frameworks and guidance so that children’s digital lives are enabled and empowered by design. The principle of responsibility emphasises that relevant stakeholders (or, in child rights language, ‘duty bearers’):
- Know of and comply with laws, regulations, industry standards and other measures to ensure the realisation of children’s rights.
- Provide children with accessible and safe pathways to meaningful remedies if things go wrong.
“You and your parents [have the responsibility to keep children safe]. But also, the creators of the app, like the creators of YouTube, know that kids are going to watch that now… Then, the app maker should have another setting for kids.”
Child aged 9-10, Greater London
Relevant legal frameworks and guidance
There are various laws, regulations and standards that digital innovators need to consider, including data protection and privacy laws, non-discrimination, product safety and consumer protection laws, among many others.
For example, digital providers operating in the UK must ensure their processing of personal data complies with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018. Compliance with these laws will also support compliance with the Privacy and Electronic Communications Regulations (PECR), which set out rules ‘on the use of cookies and other technologies which rely on access to user devices, and on electronic marketing messages’. The Age Appropriate Design Code (AADC) is the UK statutory code of practice for designers and developers of digital products and services likely to be accessed by children. It takes a child rights perspective.
“Apps have a responsibility to not leak data or personal information.”
Child aged 12-13, Essex
The UK Equality Act 2010 places a duty on service providers to make their services accessible and inclusive. To comply, digital innovators must take steps to provide equal experiences to all children using their products or services: particular groups of children should not be disadvantaged compared with other children because of their protected characteristics (e.g., learning disabilities). This includes making reasonable adjustments and anticipating unintended discrimination or exclusion (e.g., children being filtered out of a recruitment process). The same considerations apply to data curation and labelling for machine learning (ML) and artificial intelligence (AI) in any product or service.
“Snapchat could make it more child-friendly so random people can’t just add you and can’t say mean things.”
Child aged 13-14, Yorkshire
The UK Online Safety Bill sets out responsibilities for digital providers to keep children safe online. Protection of children against commercial (economic) exploitation (e.g., by providers disguising their promotion of products, services or in-app and in-game purchases) is supported by the UK Consumer Protection from Unfair Trading Regulations 2008, SI 2008/1277 (CPUTR 2008) and the Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013, SI 2013/3134 (CCR 2013). The CCR 2013, the Consumer Rights Act 2015 and the CPUTR 2008 prohibit unfair and misleading commercial practices and coercive sales techniques.
Relevant to some products and services are Ofcom’s On Demand Programme Service Rules and video-sharing platform regulation, and the EU General Product Safety Directive 2001/95/EC.
Multiple voluntary codes of practice, standards and guidelines offer guidance for innovators. For example, the Broadband Commission’s Child Online Safety report calls for ‘built-in protections’ against the 4Cs of online risks, using relevant technologies such as blocking, filtering and web crawlers.
There is also guidance for innovators on how to provide safe and accessible pathways to meaningful remedies. This recommends taking an active role, for instance, by establishing ‘company-based grievance mechanisms’ that include ‘consumer or user complaints processes, terms of service enforcement processes … systems for handling privacy related issues … and for monitoring and enforcing community conduct standards (including content moderation)’. Conduct monitoring is crucial to ensuring user safety in social media service provision:
- Social media providers should maintain a clear and accessible reporting process to enable individuals to notify social media providers of harmful conduct.
For company-based grievance mechanisms to be effective, the UK Government requires (1) a procedure to enable remediation and (2) a meaningful outcome for those seeking redress – in this case, children.
In line with the UN Guiding Principles criteria for effective redress mechanisms, the Australian Safety by Design principles (Principle 2.2) recommend that innovators:
Establish clear protocols and consequences for service violations that serve as meaningful deterrents and reflect the values and expectations of the user-base.
Crucially, redress mechanisms for children need to be:
- Safe: children are free from fear of negative consequences as a result of complaining
- Accessible: easy for children to use
- Accountable: children get effective remedy and redress.
Similarly, the IEEE Standard for an Age Appropriate Digital Services Framework sets out six actions for providing effective moderation and redress:
- Provide prominent, accessible and easy-to-use tools to help children and parents/caregivers seek redress
- Provide children and parents/caregivers access to expert advice and support where needed
- Have clear penalties applied fairly and consistently
- Offer opportunities to appeal decisions and escalate unresolved appeals to third parties or regulators
- Ensure reasonable response times
- Provide children and parents/caregivers with opportunities to correct their digital profile/footprint, as well as termination rights.
Navigating the complex landscape of laws, regulations and standards applicable to digital products and services can be daunting for innovators, especially start-ups. For those that are members, a trade or industry association can be a useful first port of call. For example, the Association for UK Interactive Entertainment (UKIE) provides its members with guidance on the application of relevant regulations.
A Child Rights Impact Assessment (CRIA) is a tool commonly used in policymaking to anticipate the likely impact of a proposed policy, product or service on children. The CRIA sets out eight practical steps for innovators to follow and is now being adapted and applied to digital innovation by a growing number of businesses.