Privacy-respecting innovation starts with business models that align with lawfulness, fairness, transparency, data minimisation, and purpose and storage limitation. Privacy-by-design manifests through design features that give users meaningful control over the visibility, access and use of personally identifiable data. Privacy also requires security measures to prevent unauthorised access to data.
The principle of privacy-by-design draws on children’s right to the protection of privacy and image, which prescribes responsible handling of personal data, including:
Deployment of appropriate security measures to guard against unauthorised access to personal data
Compliance with data protection principles of lawfulness, fairness, transparency, data minimisation, accuracy, purpose and storage limitation
Respect for children’s agency, dignity and safety in the sharing and use of their data.
Making your product or service privacy-respecting by design does not mean a blanket ban on data sharing, nor does it mean added value cannot be generated from data processing. Baking privacy into digital products and services by design means processing data fairly and in ways children would reasonably expect, collecting only the data necessary to provide your service, being transparent about how you process it, respecting users’ choices, keeping the data safe from unauthorised access, and retaining it only for as long as you need it.
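To make this concrete, here is a minimal sketch of what data minimisation and storage limitation can look like in practice. It is an illustration only: the field names, age bands and 90-day retention window are assumptions for the example, not requirements drawn from any of the frameworks discussed below.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention window - set per your own documented policy.
RETENTION_PERIOD = timedelta(days=90)

@dataclass
class SignupRecord:
    display_name: str   # needed to render the user's profile
    age_band: str       # e.g. "13-15": coarser than an exact birth date
    created_at: datetime
    # Deliberately absent: precise birth date, home address, phone number,
    # contacts list - none are necessary to provide the core service.

def is_expired(record: SignupRecord, now: datetime) -> bool:
    """Storage limitation: data is deleted once it is no longer needed."""
    return now - record.created_at > RETENTION_PERIOD

record = SignupRecord("Sam", "13-15", datetime.now(timezone.utc))
print(is_expired(record, datetime.now(timezone.utc)))  # False until day 90
```

The design point is that minimisation is decided at the schema level, before any data is collected, rather than filtered after the fact.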
Relevant legal frameworks and guidance
Children require specific protection of their personal data as they may be less aware of the risks involved, and less able to claim their rights.
Without transparency and informed consent from the child and their parents or legal guardians, data practices that constitute commercial threats to children’s privacy are likely to breach the UK Data Protection Act 2018, the UK GDPR and the UK AADC, in particular the principles of lawfulness, fairness, transparency, purpose limitation and data minimisation in the UK Data Protection Act 2018 and the UK and EU GDPR. Profiling of data obtained from child users also breaches Standard 12 of the UK AADC, which requires options that use profiling to be switched ‘off’ by default. The Irish Fundamentals for a Child-Oriented Approach to Data Processing (the Fundamentals) clarify digital innovators’ obligations under the EU GDPR and set out 14 actions to enhance the protection of children’s privacy in the digital environment.
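As an illustration of Standard 12’s ‘off by default’ requirement, a settings object might default every profiling-dependent option to disabled until the user makes an explicit, active choice. This is a hedged sketch; the setting names are invented for the example and do not come from the AADC itself.

```python
from dataclasses import dataclass

@dataclass
class ChildPrivacySettings:
    # Standard 12 (UK AADC): options that rely on profiling default to off.
    personalised_recommendations: bool = False
    behavioural_advertising: bool = False
    engagement_profiling: bool = False

    def enable(self, option: str) -> None:
        """Only an explicit, active user choice turns a profiling option on."""
        if not hasattr(self, option):
            raise ValueError(f"Unknown setting: {option}")
        setattr(self, option, True)

settings = ChildPrivacySettings()                # everything starts off
settings.enable("personalised_recommendations")  # explicit opt-in only
```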
Other international voluntary standards provide technical measures to help digital innovators comply with data protection laws: the principles of accuracy, integrity and confidentiality in ISO 27001, the principles of purpose limitation, integrity and confidentiality in ISO 27701, and the requirements for privacy-by-design in ISO 31700. Taking a slightly different focus, the IEEE Standard for an Age Appropriate Digital Services Framework advises against specific design features – ‘inappropriate commercial nudging, profiling or conditioning’.
Specific to digital products and services used in health and social care, the UK Code of conduct for data-driven health and care technology advises the public sector to ‘consider only entering into commercial terms in which the benefits of the partnerships between technology companies and health and care providers are shared fairly’. Irrespective of application domain, digital providers should conduct a child rights-oriented data protection and privacy impact assessment to anticipate and mitigate the adverse impacts of data processing on children and their rights.
Design cases
Threats to children’s right to privacy and data protection in the digital environment manifest in three dimensions: interpersonal, institutional and commercial. The interpersonal dimension manifests through design features such as friend or contact recommendations, which draw on various information sources, including shared contacts, the profile information users enter into the system, and their interaction activity.
Online activity monitoring in parental control apps can also threaten children’s interpersonal privacy and even pose risks to their safety, as these apps request permission to read children’s calendars, contacts and audio recordings, and to track the child’s exact location. Requests for parental consent for children to access ‘preventative or counselling services’ also undermine children’s privacy.
The processing of children’s data by public authorities such as schools, the onward sharing of that data with other public and private stakeholders, and data processing through children’s use of EdTech in schools can also threaten children’s privacy if not handled with care. Commercial threats include profiling users’ data to inform design decisions that prolong user engagement, manipulate users’ behaviour or serve users with targeted marketing or other forms of data monetisation. Data processing, even as part of the ordinary functioning of digital technologies (e.g., health wearables), can have latent effects, resulting, for example, in the denial or adjustment of health insurance (see the case of Fitbit).
Instead, help children and caregivers keep control over children’s data: give them choices about what data to share, and tell them how their data will be used.
Innovators should align their business models with data protection principles to make their data processing lawful, fair, transparent and specific to the original purpose. Remember to collect only the data you need, keep it only as long as needed, and ensure that children and parents/caregivers know your policy and how to seek remedy.
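One hedged way to operationalise ‘keep the data only as long as needed’ is a scheduled purge that deletes records once their declared retention date has passed. The in-memory store below is a hypothetical placeholder standing in for whatever database your service actually uses.

```python
from datetime import datetime, timezone

# Hypothetical store: record id -> (record data, delete-after timestamp).
data_store: dict[str, tuple[dict, datetime]] = {}

def purge_expired(now: datetime | None = None) -> int:
    """Delete every record whose declared retention date has passed.

    Run on a schedule (e.g. a daily job) so the retention promises made
    to children and caregivers are kept automatically, not manually.
    """
    now = now or datetime.now(timezone.utc)
    expired = [key for key, (_, delete_after) in data_store.items()
               if delete_after <= now]
    for key in expired:
        del data_store[key]
    return len(expired)
```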