
GDPR & Children: Teaching Kids How to Lie on the Internet

Data privacy & security

Written by Aurélie Pols

Published May 18, 2017 · Updated January 22, 2018


Our “Smart” TV no longer has a camera. We deactivated it when first setting up the device, but software updates tended to override that very setting. We then put a cover over the camera. Finally, we exchanged the TV for a model without a camera or microphone; to find such a device, the local vendor had to dig it out from the back of his warehouse!

My children are part of what is dubbed the iGeneration. They are technologically savvy, growing up with iPads and the internet. They play on consoles that connect to the internet and rapidly figured out how to set up their own network.

As they grow up, they are also being taught how to lie: they create and use fake identities every time they want to access something online that requires permission. Time spent on electronic devices is limited and password-protected; selfies are not allowed, nor is taking pictures of body parts or disclosing any information about where we live.

The GDPR establishes in Article 8.1 that my parental obligations regarding data use by my children will apply until they turn 16 years old at the latest:

“… the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorized by the holder of parental responsibility over the child”.

There is a caveat to this: while the GDPR lays out privacy rules for all 28+3 countries (still) in the EEA (European Economic Area), some of its provisions are left up to the Member States.

The above provision is one of them, as Article 8.1 ends with “Member States may provide by law for a lower age for those purposes provided that such lower age is not below 13 years”.
Note that the age of 13 aligns with US Federal law related to children, referred to as COPPA – the Children’s Online Privacy Protection Act. It is expected that most countries will lower the age of consent to 13.
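
For services that need to implement this, here is a minimal sketch of what an age-of-digital-consent check could look like, assuming a hypothetical per-country configuration; the country values are placeholders, not a statement of any national law, since most Member State implementations were still pending at the time of writing.

```python
from datetime import date
from typing import Optional

# GDPR Article 8.1: the default age of digital consent is 16; Member States may
# lower it, but not below 13. The per-country values below are placeholders for
# illustration only, NOT actual national law.
GDPR_DEFAULT_AGE = 16
AGE_OF_DIGITAL_CONSENT = {
    "DE": 16,  # hypothetical: a Member State keeping the default
    "XX": 13,  # hypothetical: a Member State lowering the age to the floor of 13
}

def needs_parental_consent(birth_date: date, country_code: str,
                           today: Optional[date] = None) -> bool:
    """True if processing this child's data requires consent given or
    authorized by the holder of parental responsibility (Article 8.1)."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    threshold = AGE_OF_DIGITAL_CONSENT.get(country_code, GDPR_DEFAULT_AGE)
    return age < threshold

# A 14-year-old in a country applying the default threshold of 16:
print(needs_parental_consent(date(2003, 1, 1), "DE", today=date(2017, 5, 18)))  # True
```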

More than 20 articles in the GDPR give the Member States a measure of discretion. And while the GDPR is directly applicable as of May 2018 in all 31 countries (the UK included, despite Brexit), this piece of legislation still needs to be aligned with national law. The provisions that require further guidance from Member States are Articles 6, 8, 9, 10, 12, 14, 17, 23, 28, 29, 32, 35, 36, 37, 38, 42, 43, 45, 46, 47, and 49.

At the time of writing, only Germany and Slovakia had proposed their “local” flavors of the GDPR. Ireland has since released its version, yet without defining an age of consent for children.


What is sensitive data?

Article 9, paragraph 4 on the processing of special categories of personal data additionally stipulates that “Member States may maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or data concerning health”.
Indeed, special categories of data, also often called sensitive data, have been expanded under the GDPR to include biometric and genetic data, keeping pace with scientific developments while allowing Member States to intervene.

And processing sensitive data, just like processing children’s data, requires consent.

As Article 9, paragraph 1 states:

“Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited”.

Paragraph 2 then lists the exceptions under which paragraph 1 does not apply, the first of 10 being: “the data subject has given explicit consent to the processing of those personal data for one or more specific purposes, …”

Note that while the special categories of data in the GDPR largely overlap with data types generally considered “sensitive” in the US, political affiliation and trade union membership are not among the categories typically used there.

Does this apply to digital advertising?

In case of doubt as to whether such consent obligations are applicable to the online advertising sector, please note that personal data under the GDPR includes online identifiers. The answer is therefore “yes, it does!”

Recital 30 further specifies that “Natural persons may be associated with online identifiers provided by their devices, applications, tools and protocols, such as internet protocol addresses, cookie identifiers or other identifiers such as radio frequency identification tags. This may leave traces which, in particular when combined with unique identifiers and other information received by the servers, may be used to create profiles of the natural persons and to identify them”.

While a minimum segment size of 100 cookies seems to be an accepted practice in the digital advertising sector, there is far less clarity about which data categories should be excluded altogether when handling data that concerns the most vulnerable and can inflict the most harm.
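
The 100-cookie floor is an industry rule of thumb rather than a GDPR requirement, but to make it concrete, here is a minimal sketch of how such a floor could be enforced before segments are released, assuming a hypothetical mapping of segment names to sets of cookie IDs.

```python
MIN_SEGMENT_SIZE = 100  # industry rule of thumb mentioned above, not a legal threshold

def releasable_segments(segments: dict) -> dict:
    """Keep only segments large enough that no single cookie/user is easily
    singled out. `segments` maps a segment name to a set of cookie IDs
    (a hypothetical structure used for illustration)."""
    return {
        name: cookie_ids
        for name, cookie_ids in segments.items()
        if len(cookie_ids) >= MIN_SEGMENT_SIZE
    }

# Example: a segment of 3 cookie IDs is dropped, a segment of 250 is kept.
```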

Even the NAI (Network Advertising Initiative) came out with recommendations in its 2015 Code of Conduct about which categories of data would require opt-in consent from individuals (sic):

Sensitive or Not? Data According to the NAI*

[Image: table classifying data categories as sensitive or not, according to the NAI]

Be careful with predictions

Companies dealing with profiles and segments should be aware that the threshold of acceptability drawn by the definitions of sensitive data categories and children’s data applies not only to the data they collect but also to predictive data.

One might remember the problem Target stumbled into before its huge data breach, when a customer’s pregnancy was deduced from her shopping behavior data.
Moving from shopping behavior categories to a health condition crosses a line drawn in most privacy legislation, precisely because seemingly innocuous pieces of data can, once combined, inflict the most harm.
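
The same lesson can be applied to predictive analytics: before a derived attribute is written to a profile, check whether it would cross into a special category. The sketch below is a minimal illustration under that assumption; the category names and the explicit-consent flag are placeholders, not a legal taxonomy.

```python
# Special categories along the lines of Article 9.1; mapping model outputs to
# these categories is an assumption made for illustration.
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin", "political_opinions", "religious_beliefs",
    "philosophical_beliefs", "trade_union_membership", "genetic_data",
    "biometric_data", "health", "sex_life", "sexual_orientation",
}

def store_predicted_attribute(profile: dict, attribute: str, value,
                              category: str, explicit_consent: bool = False) -> None:
    """Refuse to persist a predicted attribute (e.g. 'pregnant') that would turn
    shopping-behavior data into health data, unless explicit consent along the
    lines of Article 9.2(a) has been recorded."""
    if category in SPECIAL_CATEGORIES and not explicit_consent:
        raise PermissionError(
            f"Predicted attribute '{attribute}' falls into special category "
            f"'{category}' and may not be stored without explicit consent."
        )
    profile[attribute] = value

# Example: this would raise PermissionError instead of silently enriching the profile.
# store_predicted_attribute({}, "pregnant", True, category="health")
```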

GDPR & children: final recommendations

If you are in the business of selling or sharing user preferences or interests, if you are syncing cookies, or if you are exchanging/sharing/selling profiles*, make sure sensitive data categories are either covered by the consent mechanisms set forth in your contracts or not processed at all.
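
To make the “not processed at all” branch concrete, here is a minimal sketch of stripping sensitive categories out of a profile before it is shared or sold; the category keys and the per-subject record of consented categories are assumptions for illustration.

```python
SENSITIVE_CATEGORIES = {
    "health", "racial_or_ethnic_origin", "political_opinions",
    "religious_or_philosophical_beliefs", "trade_union_membership",
    "genetic_data", "biometric_data", "sex_life", "sexual_orientation",
}

def profile_for_sharing(profile: dict, consented_categories: set) -> dict:
    """Return a copy of the profile with sensitive categories removed unless the
    data subject has explicitly consented to that category being processed.
    Keys of `profile` are hypothetical category names; values are the data."""
    return {
        category: value
        for category, value in profile.items()
        if category not in SENSITIVE_CATEGORIES or category in consented_categories
    }

# Example: {"health": "...", "favorite_color": "blue"} shared without any consent
# becomes {"favorite_color": "blue"}.
```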

After all, the GDPR requires that compliance be demonstrated: you are guilty until proven innocent, not the other way around.

* writing this feels very strange for a European privacy expert


Author

Aurélie Pols

DPO at mParticle

Aurélie Pols designs best data privacy practices: documenting data flows, minimizing data use risks, and striving for data quality. Aurélie follows the money to streamline data trails while touching upon security practices and ethical data uses. She leads her own consultancy, serves as DPO for New York-based CDP mParticle, was part of the EDPS' Ethics Advisory Group and now serves the European Commission as an expert in the Observatory of the Online Platform Economy.
