Responsible digital innovation means businesses should keep up with ethical, rights-based and legal frameworks and guidance so that children’s digital lives are enabled and empowered by design. The principle of responsibility emphasises that relevant stakeholders (or, in child rights language, ‘duty bearers’):
Know of and comply with laws, regulations, industry standards and other measures to ensure the realisation of children’s rights.
Provide children with accessible and safe pathways to meaningful remedies if things go wrong.
Relevant legal frameworks and guidance
There are various laws, regulations and standards that digital innovators need to consider, including data protection and privacy laws, non-discrimination, product safety and consumer protection laws, among many others.
For example, digital providers operating in the UK must ensure their processing of personal data complies with the UK General Data Protection Regulation (GDPR) and the Data Protection Act 2018. Compliance with these laws also supports compliance with the Privacy and Electronic Communications Regulations (PECR), which set out rules ‘on the use of cookies and other technologies which rely on access to user devices, and on electronic marketing messages’. The Age Appropriate Design Code (AADC) is the UK statutory code of practice for designers and developers of digital products and services likely to be accessed by children. It takes a child rights perspective.
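To make the PECR cookie rules concrete, here is a minimal sketch of how a web service might gate non-essential cookies behind explicit consent. The cookie categories, the `ConsentState` shape and the helper names are illustrative assumptions, not terms defined by the regulations; strictly necessary cookies are exempt from the consent requirement.

```typescript
// Minimal sketch: set non-essential cookies only after explicit consent,
// in the spirit of PECR. Categories and the consent store are assumptions.

type CookieCategory = "essential" | "analytics" | "marketing";

// Hypothetical consent record, e.g. persisted after a consent-banner choice.
interface ConsentState {
  analytics: boolean;
  marketing: boolean;
}

function setCookie(name: string, value: string, days: number): void {
  const expires = new Date(Date.now() + days * 86_400_000).toUTCString();
  document.cookie =
    `${name}=${encodeURIComponent(value)}; expires=${expires}; path=/; Secure; SameSite=Lax`;
}

function setCookieIfPermitted(
  name: string,
  value: string,
  category: CookieCategory,
  consent: ConsentState,
): void {
  // Strictly necessary cookies are exempt; everything else needs opt-in.
  if (category === "essential" || consent[category]) {
    setCookie(name, value, 180);
  }
  // Otherwise do nothing: no access to the user's device without consent.
}
```

A call such as `setCookieIfPermitted("analytics_id", id, "analytics", consent)` then writes nothing until the user has actively opted in.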
The UK Equality Act 2010 places a duty on service providers to make their services accessible and inclusive. To comply, digital innovators must take steps to provide equal experiences to all children using their products or services. Particular groups of children should not be disadvantaged compared to other children because of their protected characteristics (e.g., learning disabilities). This includes making reasonable adjustments and anticipating unintended discrimination or exclusion (e.g., being filtered out of a recruitment process). It also matters for the curation and labelling of data used in machine learning (ML) and artificial intelligence (AI) in any product or service.
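As a sketch of what anticipating exclusion could look like during data curation, the function below flags groups that are markedly under-represented among positively labelled training examples, so curators can review them before a model is trained. The field names, groupings and threshold are assumptions for illustration; a real fairness audit is considerably more involved.

```typescript
// Illustrative curation check: flag groups whose share of positive labels
// falls well below their share of the dataset. A crude signal, not proof.

interface LabelledExample {
  label: string; // e.g. "suitable_candidate" in a recruitment dataset
  group: string; // protected-characteristic group, recorded for audit only
}

function groupShares(data: LabelledExample[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const ex of data) {
    counts.set(ex.group, (counts.get(ex.group) ?? 0) + 1);
  }
  const shares = new Map<string, number>();
  for (const [group, n] of counts) {
    shares.set(group, n / data.length);
  }
  return shares;
}

function flagSkewedGroups(
  data: LabelledExample[],
  positiveLabel: string,
  tolerance = 0.8, // assumed threshold: flag below 80% of expected share
): string[] {
  const overall = groupShares(data);
  const positives = groupShares(data.filter((ex) => ex.label === positiveLabel));
  const flagged: string[] = [];
  for (const [group, share] of overall) {
    if ((positives.get(group) ?? 0) < share * tolerance) {
      flagged.push(group);
    }
  }
  return flagged;
}
```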
The UK Online Safety Bill sets out responsibilities for digital providers to keep children safe online. Protection of children against commercial (economic) exploitation (e.g., by disguising providers’ promotion of products, services or in-app and in-game purchases) is supported by the UK Consumer Protection from Unfair Trading Regulations 2008, SI 2008/1277 (CPUTR 2008) and the Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013, SI 2013/3134 (CCR 2013). The CCR 2013, Consumer Rights Act 2015 and CPUTR 2008 prohibit unfair and misleading commercial practices and coercive sales techniques.
Depending on the product or service, Ofcom’s On Demand Programme Service Rules, its video-sharing platform regulation and the EU General Product Safety Directive 2001/95/EC may also be relevant.
Multiple voluntary codes of practice, standards and guidelines offer guidance for innovators. For example, the Broadband Commission’s Child Online Safety report calls for ‘built-in protections’ against the 4Cs of online risk (content, contact, conduct and contract) using relevant technologies, such as blocking, filtering and web crawlers.
There is also guidance for innovators on how to provide safe and accessible pathways to meaningful remedies. This guidance recommends that providers take an active role, for instance, by establishing ‘company-based grievance mechanisms’ that include ‘consumer or user complaints processes, terms of service enforcement processes … systems for handling privacy related issues … and for monitoring and enforcing community conduct standards (including content moderation)’. Conduct monitoring is crucial to ensuring user safety in social media service provision:
Social media providers should maintain a clear and accessible reporting process to enable individuals to notify social media providers of harmful conduct.
For company-based grievance mechanisms to be effective, the UK Government requires (1) a procedure to enable remediation and (2) a meaningful outcome for those seeking redress – in this case, children.
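By way of illustration, a company-based grievance mechanism might start from a structured complaint record that captures the category, timestamps and outcome, so that remediation and response times can be audited. The field names and categories below are assumptions, not something the guidance prescribes.

```typescript
// Illustrative complaint record for a company-based grievance mechanism.
// Categories and fields are assumptions chosen to echo the guidance above.

type ComplaintCategory =
  | "harmful_conduct"     // breaches of community conduct standards
  | "privacy"             // privacy-related issues
  | "content_moderation"  // terms-of-service and moderation decisions
  | "purchase_or_billing" // consumer complaints
  | "other";

interface Complaint {
  id: string;
  category: ComplaintCategory;
  description: string;
  submittedAt: Date;
  acknowledgedAt?: Date; // lets the service track its response times
  resolvedAt?: Date;
  outcome?: string; // the remedy offered, recorded so outcomes can be audited
}

function fileComplaint(category: ComplaintCategory, description: string): Complaint {
  return {
    id: crypto.randomUUID(),
    category,
    description,
    submittedAt: new Date(),
  };
}
```

Recording timestamps and outcomes explicitly is what makes the two requirements testable: a procedure (the record exists and moves forward) and a meaningful outcome (the remedy is written down and reviewable).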
In line with the UN Guiding Principles criteria for effective redress mechanisms, the Australian Safety by Design principles (Principle 2.2) recommend that innovators:
Establish clear protocols and consequences for service violations that serve as meaningful deterrents and reflect the values and expectations of the user-base.
Crucially, redress mechanisms for children need to be:
Safe: children are free from fear of negative consequences as a result of complaining
Accessible: easy for children to use
Accountable: children get effective remedy and redress.
Similarly, the IEEE Standard for an Age Appropriate Digital Services Framework sets out six actions for providing effective moderation and redress (sketched in code after this list):
Provide prominent, accessible and easy-to-use tools to help children and parents/caregivers seek redress
Provide children and parents/caregivers access to expert advice and support where needed
Have clear penalties applied fairly and consistently
Offer opportunities to appeal decisions and escalate unresolved appeals to third parties or regulators
Ensure reasonable response times
Provide children and parents/caregivers with opportunities to correct their digital profile/footprint, and with termination rights.
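The sketch below shows how these actions might translate into a complaint lifecycle in code: decisions can be appealed, unresolved appeals escalate to a third party, and response times are tracked. The states and the seven-day target are assumptions for illustration, not figures from the IEEE standard.

```typescript
// Illustrative redress lifecycle echoing the IEEE actions above. States,
// transitions and the response-time target are assumptions, not the standard.

type RedressState =
  | "received"
  | "decided"
  | "appealed"
  | "escalated_to_third_party" // unresolved appeals go to a third party or regulator
  | "closed";

interface RedressCase {
  state: RedressState;
  receivedAt: Date;
}

const RESPONSE_TARGET_DAYS = 7; // assumed service-level target

function isOverdue(c: RedressCase, now: Date = new Date()): boolean {
  if (c.state !== "received") return false;
  const elapsedDays = (now.getTime() - c.receivedAt.getTime()) / 86_400_000;
  return elapsedDays > RESPONSE_TARGET_DAYS;
}

// Permitted transitions: an appeal follows a decision, and an unresolved
// appeal can only move outward to a third party, never silently backward.
const TRANSITIONS: Record<RedressState, RedressState[]> = {
  received: ["decided"],
  decided: ["appealed", "closed"],
  appealed: ["escalated_to_third_party", "closed"],
  escalated_to_third_party: ["closed"],
  closed: [],
};

function advance(c: RedressCase, next: RedressState): RedressCase {
  if (!TRANSITIONS[c.state].includes(next)) {
    throw new Error(`Cannot move a case from "${c.state}" to "${next}"`);
  }
  return { ...c, state: next };
}
```

Making the transitions explicit is a small design choice with a useful payoff: appeal and escalation routes cannot be skipped, and a check like `isOverdue` gives the ‘reasonable response times’ action a measurable meaning.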
Navigating the complex legal, regulatory and standards landscape applicable to digital products and services can be daunting for innovators, especially start-ups. A good first port of call is a trade or industry association, if you are a member. For example, the Association for UK Interactive Entertainment (UKIE) provides guidance on the application of relevant regulations for its members.
A Child Rights Impact Assessment (CRIA) is a tool commonly used in policymaking to anticipate the likely impact of a policy, product or service on children. It sets out eight practical steps for innovators to follow, and a growing number of businesses are now adapting and applying it to digital innovation.
How can you be responsible when designing your digital products and services?
Here are some questions to ask yourself throughout your design process:
Discover
Insight into the problem
What legislation, regulations and industry standards should you comply with?
How do these relate to the features and functionality of your product or service?
Define
Decide what to build
What design features, functions, safety, data protection or privacy mechanisms should you use to comply responsibly with relevant regulations and standards?
Develop
Try potential solutions
Are some potential solutions to legal or policy challenges better supported by evidence or good practice?
How can a CRIA help you evaluate your potential solutions?
Deliver
Solutions that work
Have you involved your legal team to check that your product or service is compliant and responsible?
Can children report problems and get help and redress?
How will you stay up to date with policy developments?