
Sonia Livingstone

February 21st, 2018

More clarity brings more confusion: what the European General Data Protection Regulations mean for UK children


Estimated reading time: 10 minutes


The debate around the upcoming General Data Protection Regulation (GDPR) is leaving many questions unanswered. In this post, Sonia Livingstone and DaYoung Yoo reflect on the discussions surrounding the forthcoming regulation and what it will mean for children. Sonia is Professor of Social Psychology at LSE’s Department of Media and Communications and has more than 25 years of experience in media research with a particular focus on children and young people. She is the lead investigator of the Parenting for a Digital Future research project. DaYoung is an LSE MSc Media Communications Governance student. [Header image credit: P. Rogozhin, CC BY 2.0]

For some policy areas, it seems that the closer you look, the more confusion results. So it seems with the General Data Protection Regulation, coming into force in May – especially as it applies to children.

In our recent roundtable discussion, we invited experts from the Information Commissioner’s Office to present their thinking to a multi-stakeholder audience, in advance of a public consultation process (now open). Building on an earlier roundtable and new report, and some subsequent lively discussion, we hoped this new roundtable would replace confusion with clarity.

As set out in our new report, some things are now clearer, and others are not. Let’s start with the former:

First, in case it wasn’t already obvious, key human rights organisations have reminded everyone concerned that children (like everyone else) have the right to privacy – including in relation to business and the digital environment – see UNICEF’s statement, that of the Children’s Commissioner for England, the recent report by the House of Lords, and also deliberations at the Council of Europe.

Second, Article 29 Working Party guidance is now available online, including its consent guidance and its draft guidelines on profiling. As mentioned earlier, the ICO has published its Children and the GDPR guidance for consultation, which closes on 28 February 2018. The guidance includes new rules about automated decision-making, the right to erasure and consent. In the meantime, the UK government has recently announced that it will support an amendment to the Data Protection Bill to impose a stricter code of practice for protecting children’s privacy online – by design, with a particular focus on provision for 13-17 year olds, who are above the age of consent but still children.

Last, also clearer is the age of consent that many countries are now choosing for children’s access to information society services (as provisionally indicated in the figure below – see here).

Each country gives a different rationale (if it gives one at all), perhaps because, it seems, none has consulted evidence to ground its decision:

  • In France, for instance, the lack of a convincing rationale for reducing the age below 16 was judged sufficient reason to stay with 16, the argument being that this will encourage a productive involvement of parents in teenagers’ online lives (since parental consent will be required for children’s online activities);
  • In the UK, however, the lack of a convincing rationale against reducing the age to 13 (the current norm) was considered equally compelling, the argument being that this will encourage online providers to develop protective tools (since, in effect, responsibility for child protection is being shifted from parents to industry);
  • In the Czech Republic, the rationale for 13 is that teenagers are already using social media, and in any case the risks are not as great as those of activities for which a higher age is set (e.g. driving).

As a consequence, even as each country’s decision reduces uncertainty for families in that country, other uncertainties arise. For instance, in countries where a higher age is chosen than current practice, will teenagers be unceremoniously thrown off services? What will happen to their photos, their data, their connections?

Obviously too, the differences across Europe raise questions both for providers and for children who may access resources across countries (perhaps moving, or even taking a holiday!). As revealed in our roundtable discussion, there are three options for the applicable jurisdiction – the domicile of the information society service, the domicile of the child, or the physical location of the child when she or he is actually using the service – and it is not evident which should apply.

To our surprise as academic observers, aware that many lawyers are now actively advising on all things to do with the GDPR, a lot remains unresolved, with legal interpretations already differing on some key issues relating to children. These include clarity over the legitimate bases for processing data (i.e. when consent is required, and how frequently), and the practicalities of how a parent is to be identified and verified (i.e. linked to the right child). Most concerning, perhaps, is the question of how providers are even to know that a user is a child in the first place, so that child-appropriate protections can apply (age-verify every single user?).

As John Carr comments on the guidance notes from the Article 29 Working Party, the requirement of an impact assessment sounds promising but is currently unclear. He asks whether information society services “will need to consider each discrete and particular data processing activity that is possible on their site or within their service …to ensure they have completed an impact assessment for all of them and have obtained the appropriate permissions.”

Then, Recital 71 states that “profiling should not concern a child”, whereas Article 22 indicates that, provided a decision does not produce a “legal or similarly significant effect” on the child, data controllers are not prohibited from making profiling decisions about children. The Article 29 Working Party guidance says that profiling which affects a child’s choices and behaviour may, depending on its nature, have a “legal or similarly significant effect”. So how will this be weighed, and by whom?

Other issues also remain in question, including the definition of information society services, how to deal appropriately with particularly vulnerable children (rather than the ‘average’ child), underage users’ unauthorised use of services (already a widespread practice), the consequences of the differing ages of consent across countries, and the practicalities of giving parental consent.

Meanwhile, vast amounts of data are being collected from children, with or without consent, with or without knowledge that children are even using the services and, further, with or without adequate security provision, and without parental awareness of many of these issues. Further, while much of the meeting focused on consent-based data processing by the private sector, there are also a host of questions about data being processed – and often shared with third parties – in the public sector (schools being the most obvious case, often with little parental understanding). Data breaches seem an increasingly common occurrence. Will the GDPR solve these problems?

We note that many of the services and products which are under scrutiny when discussing the protection of children’s data also offer them freedom of expression as well as joy and value. While it seems obvious to us that, in consequence, children should not be excluded from the policymaking process, it appears that, unfortunately, children themselves have rarely if ever been consulted on their views about the unfolding policy that will manage both their opportunities and risks, notwithstanding good practice in deliberative policymaking, including with children.

Thus their understanding of their online privacy is not taken into account in these debates. On the bright side, this long-standing complaint is being rectified, for the first author has just been awarded a grant from the ICO to research exactly this – so, watch this space. But much else remains confusing and unresolved, making the ICO’s currently-open consultation all the more important. Please respond!

Notes


This text was originally published on the LSE Media Policy Project blog and has been re-posted with permission.

This post gives the views of the authors and does not represent the position of the LSE Parenting for a Digital Future blog, nor of the London School of Economics and Political Science.

About the author

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks, DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: In the news