11. Agency
Support child users’ decision-making and reduce exploitative features and business models that harm their agency.
Having agency means children can decide freely how to engage with the digital environment. This includes being able easily to start and stop using digital products and services of their choice, without feeling they are losing out; knowing and getting precisely what they have signed up for; and not being tempted, manipulated or nudged into doing anything that undermines their safety, privacy, development and wellbeing.
The principle of agency draws together two sets of children’s rights:
- Protection against economic exploitation: the right not to be subjected to unfair exchange.
- Protection against other forms of exploitation: the right not to be subjected to treatment that undermines children’s welfare.
Economic (or commercial) exploitation in the digital environment extends beyond the traditional notion of economic exploitation centred on child labour and manifests in various forms. It includes persuasive design to maximise children’s attention, monetisation of personal data, and dark patterns and other features crafted to manipulate users’ choices. It also includes processing data for commercial purposes such as advertising without considering children’s vulnerabilities, or benefiting unfairly from children or their data.
“I usually search on [Google], and I often trust [Google] a lot… Please make less [adverts]”
Child aged 9-10, Greater London
“One bad thing is that you will end up going on it every day [and] you won’t go out, you would stay at home.”
Child aged 13-14, Yorkshire
Protection against such exploitation does not mean businesses cannot profit from their product or service offerings. But it does mean designing for the fair treatment of child users and fair processing of their data, so that children are not treated unjustly, nor their vulnerabilities taken advantage of. Unfair or unjust treatment does not require that actual detriment or harm has occurred. For example, a loss of opportunity (for the child) or an imbalance of benefits gained (children vs. companies) can be ‘unfair’, making a practice exploitative.
“It [should be] made in a way that it is for me, as well as being they want loads of money.”
Child aged 12-13, Essex
Relevant legal frameworks and guidance
Commercial exploitation is generally prohibited under consumer protection, advertising and gaming laws and frameworks. Specifically, contexts of use, interfaces, designs and structures that disguise a service provider’s intention to promote additional or other products or services, including encouragement of in-game or in-app purchases and any ‘direct exhortations to children to buy’, are likely to breach the UK Consumer Protection from Unfair Trading Regulations (CPUTR) 2008.
Significantly, relevant legal frameworks and guidance can differ by sector. For example, advertising to children in the UK is regulated by a mix of laws and self-regulatory codes: the Code of Non-broadcast Advertising and Direct & Promotional Marketing (the Committee of Advertising Practice (CAP) Code) supplements the relevant laws, such as the CPUTR 2008.
Sector-specific codes also matter when demonstrating compliance with the UK GDPR and the Age Appropriate Design Code (AADC). For example, the ICO advises considering sector-specific guidance and refers to the CAP Code when assessing whether data processing for advertising can harm children. Design features intended to prolong children’s engagement, and the use of data processed from children to profile them for targeted marketing, are in direct breach of Standard 5 (Detrimental use of data) of the AADC. Also relevant is the Office of Fair Trading’s ‘Principles for online and app-based games’.
Design cases
Not all persuasive or behavioural design to improve user engagement, or monetisation of personal or interaction data (for example, for profiling), is problematic. Such design techniques and data uses become exploitative, and encroach on children’s agency and rights, when commercial interests are prioritised over a child’s best interests. This applies, for instance, when a product undermines a child’s choice to leave a situation freely or pushes them towards choices that are not good for them.
Our co-design workshops with designers from both large and small companies surfaced a common challenge for designers: the ‘balancing act’ between engaging interactions and users’ ‘autonomy’. Some reference points exist to determine which design practices constitute exploitation or unfair treatment. For example, Article 25 of the EU Digital Services Act (DSA) bans aspects of dark patterns: interfaces designed to manipulate users’ choices or decisions.
Other guidance also exists to inform innovators about exploitative practices:
- Reduce compulsive features designed to prolong user engagement or cultivate dependency on games, apps or platforms, so children’s immersive play is intrinsically motivated and freely chosen.
- Principle 7 (Prevent the profiling of children) and Principle 8 (Avoid the economic exploitation of children at all times) of the Dutch Code for Children’s Rights (Code voor Kinderrechten) advise against profiling and against design features intended to cultivate children’s dependence on an app or a game, respectively.
“You have the right not to be exploited… Terms and conditions … should be easier to understand … because you’ve got, like, 10 pages through them. Just skip it. But there might be something in there that’s not good. They should just make it a really short sentence that is quickly run through … because people have a lot shorter attention span.”
Child aged 12-13, Essex
To build agency into digital products and services, innovators should avoid manipulating children’s decisions. Instead, they should prioritise design features that put children in charge of their digital experiences, and be upfront about their commercial intentions.