Introducing a comprehensive approach to privacy online
While a substantial amount of research focuses on children’s interpersonal privacy, much less attention is paid to institutional and commercial privacy, even though the evidence demonstrates that children struggle to fully comprehend and manage the commercial use of their personal data. A more comprehensive approach, one that tackles all dimensions of privacy when developing awareness and capabilities, is needed to address these gaps.
A balance of protection and autonomy
A healthy balance between children’s independence and protection can foster their development, agency and exploration of the physical, social and virtual worlds. Policy and educational measures to ensure children’s privacy and safety online should also facilitate their autonomy, proactive risk management and right to participation.
A child-focused approach
With growing concerns over children’s privacy online and the commercial uses of their data, it is vital that children’s understandings of the digital environment, their digital skills and their capacity to consent are taken into account in designing services, regulation and policy. A child-focused approach can give recognition to children’s voices and facilitate and support their heterogeneous experiences, competencies and capacities. It can also create opportunities for peer-to-peer support and a more inclusive and tolerant online environment.
Banning discrimination or less favourable treatment based on personal data
Access to personal data can result in future discrimination or less favourable treatment (e.g., in relation to education, employment, credit or insurance opportunities). Data provided during childhood can ‘follow’ individuals through their adult lives due to the longevity of the traces left online. Policy attention therefore needs to focus on preventing less favourable treatment and discrimination based on harvesting personal data and using it in ways that go beyond its original purpose, especially when that data is collected from a person under the age of 18. ‘Rights by design’ is vital so that children can check, contest, rectify, erase or edit information about themselves.
Digital skills and privacy education at an early age
Children start facing privacy decisions and risks as soon as they enter the digital environment, long before their media literacy prepares them to make decisions in their own best interests. Some studies demonstrate the effectiveness of interactive learning materials in introducing privacy-related issues (such as protection of personal information, online trust, location sharing, cyberbullying, passwords and digital trails) to children as young as 7. Privacy proficiency tests show significant improvements in children’s privacy knowledge, and privacy-conscious behaviour is retained one week later, highlighting the great potential of digital and privacy skills education at an early age. In addition, media literacy and privacy-related skills need to be enacted by children, rather than taught as external rules, and need to reflect children’s actual concerns and experiences. Children need to be able to make more autonomous decisions about protecting themselves effectively online, to gain experience in coping with unexpected or undesired situations, and to learn from mistakes.
Focus on individual differences and psychological factors
Supporting children by supporting adults
Adults are often left feeling ‘behind’ digital developments and struggling to identify the best ways to support children. A comprehensive system supporting both children and adults around them – parents, educators and child support workers – is a prerequisite for developing effective media literacy. Rather than focusing predominantly on parental mediation, a wider approach which engages children’s support networks in their full breadth can allow children in different circumstances to receive the support they need.
Improving the privacy affordances of the online environment
The available evidence also suggests that children are not fully aware of the threats posed by commercial entities that collect, record and aggregate data on their platforms, nor do they fully understand how their data is used for economic profit through ad targeting or content customisation. Further work is needed to increase the transparency of data collection, improve the navigation of privacy controls, enable granular control over privacy settings to match the elaborate data-harvesting techniques in use, and create better industry standards around user empowerment. Ease of use, ubiquitous functions and user-friendly features of the privacy settings interface may reinforce children’s privacy protection behaviours.
Children cannot be expected to be solely responsible for handling the complex commercial environment. This necessitates changes to the business model that would not only make personal data use more transparent but also enable children to engage more actively and agentically with online platforms, raising their critical awareness. Some possible changes include:
The principle of data minimisation by default is crucial in ensuring that children’s data is gathered only when it is service-critical and is not shared with third parties, reducing nominally ‘voluntary’ data sharing by children.
‘Default’ settings can be improved by switching off data harvesting and profiling, safeguarding children’s personal data more effectively by default and protecting, in particular, children who do not know how to change their settings.
Hidden paid-for activities including in-app purchases are hard for children to identify and can lead to unintended exposure to commercial content, sometimes unsuitable for the child’s age. Transparency and age verification are needed to redress these issues.
Designing age-appropriate content needs to be an ongoing process that takes into account the wider digital ecology and children’s changing knowledge, needs and competences within the dynamic internet environment.
Responsibility should lie with industry, rather than with children, their parents and educators. The focus should fall on the overall design of the online environment and its ecology, rather than on enforcement of regulatory measures.
Close working collaboration between government, industry, educators and child representatives is needed to create a sense of shared ethical responsibility for delivering high-quality services to children.
Better evidence base
The evidence mapping identified substantial gaps in existing knowledge in relation to all dimensions of privacy online, but particularly with reference to institutional and commercial uses of data. More research is needed to improve our understanding of how children’s developmental needs affect privacy risks and related media literacy; what skills are needed to protect online privacy and how best to teach these skills to children; what support strategies are most effective in helping children to take advantage of existing opportunities, avoid harm and foster resilience and self-efficacy; and what policies and regulations are best equipped to mitigate privacy risks and foster a safe online environment for children.