Embed safety-by-design in product development and use.
Safety in digital innovation requires innovators to take preventive measures proportionate to the risks, and to provide remedies, support and care for victims. However, careful consideration is also needed to ensure that protecting children does not come at the cost of their full enjoyment of the digital environment and their other rights.
The principle of safety draws together three sets of children’s rights:
Protection against abuse and neglect: considering how digital technologies can be abused to facilitate violence and harm against children or to recruit children for extremist, terrorist or other violent activities.
Special protection against sexual exploitation and sexual abuse: including the use of digital technologies ‘to solicit children for sexual purposes and to participate in online child sexual abuse’.
Promotion of physical and psychological recovery and social reintegration of child victims: within an environment that encourages good ‘health, self-respect and dignity of the child’.
In the digital environment, risks of harm to children manifest in various forms, including ‘cyberaggression’, ‘cyberattacks’ and ‘information warfare’; digital technologies could also be used to facilitate ‘child trafficking’ and ‘gender-based violence’.
“Safety, meaning that if you are in danger, you can help yourself.”
Child aged 7-8, Greater London
“It could be violent games… somebody mentioned Call of Duty earlier. So if… a 10 year old or something found that on maybe their older brother’s or sister’s console or whatever and then started playing it, they’d see all of the violence and everything. And they’ll be desensitised to it. And then they’d see things like that later on and think, oh, that’s not that bad. And they wouldn’t really be aware of how dangerous it was.”
Child aged 12-13, Essex
Making digital innovation safe by design for children does not mean boxing children into a safe digital corner. Instead, embedding safety into digital innovation by design means thoroughly assessing the risks of both the intended and unintended use of digital technologies, and building proportionate measures to mitigate these risks and provide appropriate support and redress when harms occur.
“If the app hasn’t been fully developed, there could be gaps in its safety system, so there could be piracies, or people could be able to download things on to there that they shouldn’t be able to… If you see those things, … it could be ransomware, so you could have to pay money to be able to get things back. It could just completely take over your computer, and it could get all of your bank account details and everything like that. And then you probably won’t feel safe on the internet again for a very long time.”
Child aged 12-13, Essex
Relevant legal frameworks and guidance
To assess online risks to children and devise proportionate measures, digital innovators need to recognise different categories of risks: content, contact, conduct and contract. The UK Online Safety Bill prescribes ‘children’s risk assessment duties’ to assess and mitigate content, conduct and contact risks to children. For example, mitigating contact risks requires digital innovators to assess the risk of ‘functionalities enabling adults to contact other users (including children) by means of the service.’
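To make this duty concrete, the sketch below (Python, purely illustrative; the feature name, risk descriptions and mitigations are hypothetical) shows one way a team might record a feature-level children's risk assessment against the four Cs and check that every identified risk has a proportionate mitigation.

```python
# Illustrative sketch only: a minimal structure for recording a feature-level
# children's risk assessment against the four Cs (content, contact, conduct, contract).
# Feature names, risk ratings and mitigations are hypothetical examples.
from dataclasses import dataclass, field
from enum import Enum


class RiskCategory(Enum):
    CONTENT = "content"
    CONTACT = "contact"
    CONDUCT = "conduct"
    CONTRACT = "contract"


@dataclass
class FeatureRiskAssessment:
    feature: str                                                        # product functionality under review
    risks: dict[RiskCategory, str] = field(default_factory=dict)        # category -> identified risk
    mitigations: dict[RiskCategory, str] = field(default_factory=dict)  # category -> proportionate measure

    def unmitigated(self) -> list[RiskCategory]:
        """Return risk categories that were identified but have no mitigation recorded."""
        return [category for category in self.risks if category not in self.mitigations]


# Example: direct messaging raises a contact risk (adults contacting children);
# a proportionate mitigation might be to hold message requests from unknown adults by default.
dm_assessment = FeatureRiskAssessment(
    feature="direct_messaging",
    risks={RiskCategory.CONTACT: "adults can initiate contact with child users"},
    mitigations={RiskCategory.CONTACT: "message requests from unknown adults held by default"},
)
assert dm_assessment.unmitigated() == []
```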
“Tiktok, Youtube, Instagram can be bad because of bullying.”
Child aged 11-12, Yorkshire
“Instagram has this thing where someone DMs you, messages you, you have the option to say, no I don’t want them to message me, or yes, I do, which I think is good.”
Child aged 13-14, Essex
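The opt-in control this child describes can be sketched as a simple message-request gate: a first message from an unknown sender is held until the recipient accepts or declines. The example below is a minimal illustration under that assumption, not any platform's actual implementation.

```python
# Illustrative sketch, not any platform's actual implementation: a message-request
# gate in which a sender's first messages are held until the recipient accepts or
# declines, mirroring the opt-in control described in the quote above.
from dataclasses import dataclass, field


@dataclass
class Inbox:
    accepted_senders: set[str] = field(default_factory=set)
    pending_requests: dict[str, list[str]] = field(default_factory=dict)
    messages: list[tuple[str, str]] = field(default_factory=list)  # (sender, text)

    def receive(self, sender: str, text: str) -> None:
        """Deliver messages only from accepted senders; hold everything else as a request."""
        if sender in self.accepted_senders:
            self.messages.append((sender, text))
        else:
            self.pending_requests.setdefault(sender, []).append(text)

    def respond_to_request(self, sender: str, accept: bool) -> None:
        """The recipient decides: accepting releases held messages, declining discards them."""
        held = self.pending_requests.pop(sender, [])
        if accept:
            self.accepted_senders.add(sender)
            self.messages.extend((sender, text) for text in held)


inbox = Inbox()
inbox.receive("unknown_user", "hi")                      # held, not delivered
inbox.respond_to_request("unknown_user", accept=False)   # the child says no; nothing is delivered
assert inbox.messages == []
```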
Design cases
While hate, cyberbullying and harmful online content (e.g., 'pro-ana' material promoting eating disorders) are symptoms of wider social ills, digital innovation can form part of the solution. Safety-by-design measures can create safe spaces for users, including children, while helping to ensure that abusive behaviour and harmful content do not reach them.
Notice-and-take-down and moderation systems are standard measures referred to in legislation and voluntary guidelines. Their purpose is to 'detect, surface, and remove illegal and harmful conduct, contact and content with an aim for preventing harms before they occur', supported by 'feedback loops that inform users on the status of their reports [and outcomes]'.
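A minimal sketch of such a flow, assuming a simple received / under review / actioned lifecycle (the status names and notification hook are hypothetical), might look like this:

```python
# Illustrative sketch of a simple notice-and-take-down flow: a user report moves
# through received -> under_review -> actioned/rejected, and the reporter is
# informed at each step (the 'feedback loop'). Status names and the notify
# function are hypothetical.
from dataclasses import dataclass


def notify(reporter_id: str, report_id: int, status: str) -> None:
    # Placeholder for whatever channel the service uses to update the reporter.
    print(f"report {report_id}: status '{status}' sent to {reporter_id}")


@dataclass
class Report:
    report_id: int
    reporter_id: str
    content_id: str
    reason: str
    status: str = "received"

    def set_status(self, status: str) -> None:
        """Record a new status and feed it back to the person who reported the content."""
        self.status = status
        notify(self.reporter_id, self.report_id, status)


report = Report(report_id=1, reporter_id="user_123", content_id="post_456", reason="bullying")
report.set_status("under_review")               # human or automated moderation review
report.set_status("actioned: content removed")  # outcome fed back to the reporter
```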
However, the administration of moderation systems could interfere with children’s other rights, including freedom of expression and information. To address this, the Santa Clara Principles on Transparency and Accountability in Content Moderation emphasise:
Human rights and due process
Understandable rules and policies
Cultural competence
State involvement in content moderation
Integrity and explainability.
Other measures to mitigate online risks are also available. Examples include ‘blocking technologies’, ‘heuristic filtering’, ‘automated CSAM detection’ and ‘web crawlers.’
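As a rough illustration of 'heuristic filtering' only, the sketch below scores text against a small set of weighted terms and flags it for human review rather than removing it automatically; real systems combine many more signals, and the terms and threshold here are hypothetical.

```python
# Illustrative sketch of 'heuristic filtering' only: a crude keyword-based scorer
# that flags text for human review rather than removing it outright. Production
# systems combine many signals (hash matching, classifiers, user reports); the
# terms and threshold below are hypothetical.
FLAGGED_TERMS = {"example_slur": 3, "example_threat": 5}  # hypothetical term weights
REVIEW_THRESHOLD = 4


def heuristic_score(text: str) -> int:
    """Sum the weights of flagged terms appearing in the text."""
    words = text.lower().split()
    return sum(weight for term, weight in FLAGGED_TERMS.items() if term in words)


def needs_review(text: str) -> bool:
    """Flag for moderator review instead of automatic removal, preserving due process."""
    return heuristic_score(text) >= REVIEW_THRESHOLD


assert needs_review("this contains example_threat") is True
assert needs_review("an ordinary message") is False
```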
To build safety into digital innovation by design, digital innovators first need to understand the nature of online risks to children, assess them against the features and functionalities of their products and services, and devise proportionate measures to mitigate them. Innovators also need to support children in ways compatible with their needs and appropriate to the risks.