The almost daily news stories on data privacy and data breaches – including those affecting children – raise urgent questions, as everyday activities generate data that are recorded, tracked, collated, analysed and monetised by a range of actors. Rishita Nandagiri, Sonia Livingstone and Mariya Stoilova discuss their systematic mapping of the evidence on how children themselves understand their data and privacy online, conducted as part of ICO-funded research.
Often the digital pioneers, children tend to be the canaries in the coalmine of the digital age, encountering risks before many adults are aware of them or able to develop strategies to mitigate or tackle these risks. The Council of Europe’s Recommendation on children’s rights in the digital environment is a noteworthy advance in recognising children’s specific needs in relation to provision, design and regulation, as is the UK ICO’s recent consultation on an age-appropriate design code (our response is available here).
Alan Westin’s classic definition of privacy from his 1967 book Privacy and Freedom still works in today’s digital environment:
‘the claim of individuals, groups, or institutions to determine themselves when, how and to what extent information about them is communicated to others.’
But there has been a good deal of new research on privacy in the digital environment in recent years, including some focusing specifically on children. Here we highlight 11 studies that we have found valuable for insights regarding how children conceptualise privacy, their (often frustrated and resigned) interactions with digital environments as they negotiate privacy, their understandings of data, and the role of parents in children’s privacy online.
- Nissenbaum (2004) stresses that social contexts and context-relative informational norms are essential to privacy, developing the notion of contextual integrity. For our child-rights approach, this valuably sidesteps the popular charge that children (foolishly) either seek or eschew secrecy. Instead, applying privacy as contextual integrity to children’s judgements of what it is appropriate to share within particular contexts or relationships marks an important turn in digital environments, where the child’s perspective is easily overlooked. Nissenbaum’s work influences our own conceptualisation of children’s privacy online.
- Kumar et al. (2017) build on the contextual integrity approach, showing empirically that children value their privacy and expect a degree of privacy online, in ways which mirror their experiences offline. Children demonstrated a limited but reasonable understanding of privacy online, though younger children (5-7 years old) had key knowledge gaps. As child development theories have long recognised, privacy is crucial for development. These new findings show how children’s digital literacies develop with learning and exposure, but still require scaffolding by parents and carers.
Child development matters in digital environments
- Chaudron et al. (2018) conducted a study across 21 countries in Europe, exploring how children under eight engage with digital technologies. Children’s first contact with digital technologies and screens begins at an early age (below two years old), often through their parents’ devices. Children learn to interact with digital devices by observing the behaviour of adults and older children, or when invited to share family social media accounts, also developing their skills through trial and error. Nonetheless, such young children did not have a clear understanding of privacy, or of how to protect it.
- Livingstone (2014) identifies privacy as a core element of social media literacy, developing as part of children’s wider cognitive and social development. At around 9 to 11 years old, children’s sharing of personal data is guided by parents, but by 11 to 13, children experiment more, enjoying ‘risky opportunities’ and learning how to make decisions about online trust, as well as the consequences of wrong decisions. By 14 to 16 years old, children become critical of their earlier sharing practices, more independent of parental and teacher mediation, more aware of the consequences of online behaviour and more knowledgeable about navigating platforms, audiences and privacy settings to create the desired balance of public and private.
- Marwick and boyd (2014) argue that teenagers’ practices in networked publics are shaped by their interpretation of the social situation, their attitudes to privacy and publicity, their ability to navigate the technological and social environment and their development of strategies to achieve their privacy goals. In interpersonal situations, teenagers think more of what information to protect than what to disclose, with disclosure seen as part of a trade-off in which they may gain something – a new connection or a way to signal trust. For teenagers, privacy is understood in context – who is present, what is socially appropriate, their aim being less a matter of ‘hiding’ and more about asserting control.
- Steeves and Regan (2014) agree that, for children and young people, privacy is at the heart of forming relationships based on mutuality and trust. By contrast, the online relationships they have with school, marketers, potential employers or law enforcement agencies are instrumental, not recognising or protecting their social (or relational) notions of privacy. Information privacy and data protection policies “fail to capture the continuing importance of privacy after one consents to collection and disclosure… Consent in this case is not consent to an ongoing relationship with an organization but consent to that organization taking and using information for its own purposes” (p.306). Lack of privacy, particularly from parents, educators and employers, is resented by young people and seen as surveillance, and when different audiences collide in the online space, young people are left feeling vulnerable.
- Wisniewski (2018) analysed 75 commercial mobile apps on Android Play and found that most feature parental control (monitoring or restriction) rather than active mediation. Many apps, too, are extremely privacy invasive, providing parents granular access to monitor and restrict teenagers’ intimate online interactions with others. Children evaluate the apps much less positively than parents and experience them as restrictive and invasive. Wisniewski challenges the expectation that increased privacy controls will address the risks teenagers are exposed to, arguing that, while restrictive online practices reduce privacy risks, they also reduce the online benefits and do not teach teenagers to protect themselves effectively online.
Children don’t understand fully the ‘datafied’ digital environment
- The Norwegian Consumer Council (2018) describes how companies design environments that manipulate users, nudging them towards more privacy-intrusive options through the use of “dark patterns” – deceptive interface designs crafted to trick or nudge users into sharing more personal data, with the threat of lost functionality if the more privacy-intrusive options are not selected. Such tactics are deliberately designed to push users to share more data than they are comfortable sharing, raising serious questions about users’ right to choose.
- Kidron and Rudkin (2017) assert that children, navigating different online platforms, may be required to make a number of critical assessments and multiple acts of maturity in order to use a platform or site. Taken together, these call on children to consistently make “good” decisions. Despite children of different ages having different levels of maturity, capacity, and understanding, these chronological developmental differences are rarely reflected in online services. Some technological norms and interfaces prove especially problematic – “reward loops” or “priming”, for example, may disrupt or distract children from other tasks or activities, or provoke anxiety for those seeking affirmation. Hence, digital environments should be designed with childhood milestones in mind.
- Selwyn and Pangrazio (2018) observe that data gathering occurs without user awareness via real-time tracking across platforms and locations, and is then used to create user identities (profiling), classify users, and filter online content via algorithms. This has been enabled by the rise of social media and smartphone use. Children in their study considered themselves relatively safe on social media, having little awareness of third parties and no objection to being targeted by advertising. However, children were concerned about their online privacy, developing their own “personal data tactics” yet still feeling insufficiently in control of their privacy.
- Bowler et al. (2017) explored young people’s (11-18 years old) data literacy as part of the Exploring Data Worlds at the Public Library project. Most found it difficult to understand data in concrete, personal terms. Most, too, imagined data as static, held in a single place, though a few described it as spread within a web across digital contexts. Teens had a broad understanding particularly of the start and end of the data lifecycle, but little knowledge of data flows and infrastructure. While aware of security issues related to social media, they had spent little time thinking more broadly about their digital data traces or the implications for their future selves.
These key readings offer a flavour of the breadth of empirical research currently available on children’s privacy online and related topics. Our systematic mapping of the evidence (to be published in January 2019) hints at how differences among children (developmental, socio-economic, skill-related, and gender- or vulnerability-based) influence their engagement with privacy online. Taken together, the findings pose pressing challenges for media literacy education and for data protection regulation. They also show how both kinds of policy response would benefit from greater attention to children’s voices and their heterogeneous experiences, competencies and rights.
This article gives the views of the authors and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science. Images credit: Office of the Privacy Commissioner of Canada.