
A Guide to Children’s Data Protection Rights

The UK’s Information Commissioner has identified children’s data as “a regulatory priority”.

GDPR requires data controllers to provide certain information to individuals (“data subjects”) at the point at which personal data is collected from them. For websites that target children – or those which do not, but are still likely to attract them as visitors – the bar is set higher: GDPR requires that the provision of privacy information take into account that children have more limited awareness than adult visitors, and therefore need clearer and more accessible explanations. Despite GDPR and growing scrutiny surrounding the use of children’s personal data, it appears that, in many cases, websites and apps popular with children are ignoring this requirement. This puts those who operate those sites, or who use them to sell or promote their services, at considerable legal and regulatory risk.

Children’s privacy – a fast-moving area

Recent press reports have suggested that children are being “datafied from birth” and tracked by thousands of apps. In February 2019, the UK Children’s Commissioner issued her proposal for a statutory duty of care between “Online Service Providers” and their young users, which was subsequently picked up by the Home Office in its White Paper on Online Harms. The White Paper recommends that a statutory duty of care be introduced, policed by an independent regulator funded by industry. Companies would be required to demonstrate their compliance with this duty of care, including by designing products and services to make them safe for children.

The Council of Europe has issued guidelines recommending that Member States respect, protect and fulfil the rights of the child in the digital environment. The first fundamental principle of these guidelines is that “in all actions concerning children in the digital environment, the best interests of the child shall be a primary consideration”. This principle has been echoed by the UK’s Information Commissioner in her proposed Age Appropriate Design Code.

The UK’s Information Commissioner’s Office (the “ICO”) is working on its proposed “Age Appropriate Design Code”, currently in draft form (the “Draft Code”), and is consulting with parents, carers and children to finalise it. The Draft Code will provide practical guidance on the design standards the ICO will expect providers of online “Information Society Services”, which process personal data and are likely to be accessed by children, to meet. An “Information Society Service” is defined as “any service normally provided for remuneration at a distance, by means of electronic equipment for the processing (including digital compression) and storage of data, and at the individual request of a recipient of the service” (Electronic Commerce (EC Directive) Regulations 2002). Examples include online shops, apps, social media platforms and streaming and content services. The ICO considers that this definition covers most online services, even where the “remuneration” or funding of the service does not come directly from the end user. The Draft Code applies wherever children are likely to access a particular service, even if they represent only a small proportion of the overall user base.

The Draft Code contains 16 cumulative and interdependent standards[1] of age appropriate design, including that settings should be “high privacy” by default and any parental monitoring controls should be made clear to the child. All standards must be implemented to demonstrate compliance with the Draft Code.

The first of the 16 standards is that the best interests of the child should be a primary consideration and that “it is unlikely…that the commercial interests of an organisation will outweigh a child’s right to privacy”. This is a bold and radical statement that data controllers should note, and it is perhaps indicative of how seriously the ICO intends to take children’s privacy. Failing to meet the data controller obligations towards children for fear of jeopardising commercial interests, or because it is too difficult to open up the black box of processing activities, is unlikely to be an acceptable justification.

The Draft Code (a requirement of the Data Protection Act 2018) is expected to be published in final form by the end of 2019. Once it is in force, the Information Commissioner must, when exercising her regulatory functions, take account of any of its provisions which she considers to be relevant. The code may also be submitted as evidence in court proceedings, and the courts must consider it wherever relevant. Data controllers that ignore their obligations towards young data subjects may ultimately invite regulatory action by the ICO.

The UK Children’s Commissioner’s “Who knows what about me?” report found that children between 11 and 16 years old post on social media, on average, 26 times a day – if they continue at that rate, that amounts to nearly 70,000 posts by age 18. This is a huge amount of data that children are potentially giving up unwittingly on social media. Data controllers must therefore have systems and processes in place that allow them to update their young data subjects regularly on the data processing activities taking place, and to allow those data subjects to change, or even erase, their digital footprint.
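
For illustration, the arithmetic behind the “nearly 70,000” figure is simple – a rough estimate that ignores leap years:

```python
# Rough arithmetic behind the "nearly 70,000 posts by age 18" figure.
POSTS_PER_DAY = 26   # average posting rate reported for 11- to 16-year-olds
DAYS_PER_YEAR = 365  # leap years ignored for a back-of-envelope estimate
YEARS = 18 - 11      # the rate sustained from age 11 to age 18

total_posts = POSTS_PER_DAY * DAYS_PER_YEAR * YEARS
print(total_posts)   # 66430 -- i.e. "nearly 70,000"
```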

If we have children’s consent – are we ok?

The UK’s Data Protection Act 2018 provides that, if consent is being relied upon in relation to offering Information Society Services directly to a child, the child must be 13 or older for their consent to be valid. GDPR sets a default age of 16 for this purpose but permits member states to set a lower age, provided it is not below 13, so the applicable age differs across European member states.
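
As a minimal sketch of how a service might encode these thresholds – the mapping and function below are hypothetical, and the non-UK ages are illustrative values that would need to be verified against current national law – the check might look like this:

```python
# Hypothetical sketch of per-country digital consent ages under GDPR Article 8.
# GDPR defaults to 16; member states may set a lower age, but not below 13.
DIGITAL_CONSENT_AGE = {
    "UK": 13,  # Data Protection Act 2018
    "FR": 15,  # illustrative -- verify against current national law
    "DE": 16,  # illustrative -- verify against current national law
    "IE": 16,  # illustrative -- verify against current national law
}
GDPR_DEFAULT_AGE = 16  # fallback where no national derogation is recorded

def child_can_consent(age: int, country: str) -> bool:
    """Whether a child of this age can give valid consent to an
    Information Society Service offered directly to them in this country."""
    return age >= DIGITAL_CONSENT_AGE.get(country, GDPR_DEFAULT_AGE)

print(child_can_consent(13, "UK"))  # True
print(child_can_consent(13, "FR"))  # False -- parental consent needed
```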

Where a child provides consent, the competence of the child (i.e. whether the child has the capacity to understand the scope of the data processing and the implications of the data collection and processing) must be considered. Arguably, consent is unlikely to be effective where complex information is presented to children, or where they cannot withhold consent and still receive the requested service (i.e. where consent is a condition of access).

Companies must take appropriate steps to verify the age of the child consenting. The exact age does not need to be confirmed, only that the child is old enough to provide his/her own consent. The Draft Code specifically states that asking a child to self-declare his/her age or age range will not constitute a robust age-verification mechanism, and companies must be able to demonstrate that children cannot easily circumvent age checks. Companies will therefore need to implement sophisticated technological solutions to verify children’s ages with sufficient accuracy.

The ICO recommends that companies consider the degree of risk that the collection or use of the personal data poses to the child or others. Where the risk is low, and minimal information is being collected from the child – e.g. an email address to register for a newsletter – then asking the child to tick a box to confirm parental consent or his/her age may be sufficient. However, if more intrusive data processing activities are taking place – e.g. the child is posting personal data in an unmonitored chatroom – the ICO states that it will be necessary to verify that the child is old enough to provide his/her own consent or to check the identity of the person claiming parental responsibility and confirm the relationship between this person and the child.
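
That risk-tiered approach could be sketched along the following hypothetical lines (the tiers, function name and return values are illustrative, not prescribed by the ICO):

```python
from enum import Enum

class Risk(Enum):
    LOW = 1   # e.g. an email address collected for a newsletter
    HIGH = 2  # e.g. posting personal data in an unmonitored chatroom

def required_check(risk: Risk, claimed_age: int, consent_age: int) -> str:
    """Suggest a verification step for a given risk level and claimed age."""
    if risk is Risk.LOW:
        # Low risk, minimal data: a self-declaration tick box may suffice.
        return "tick box confirming age or parental consent"
    if claimed_age >= consent_age:
        # Higher risk: robustly verify the child is old enough to consent.
        return "robust age verification of the child"
    # Child too young to consent alone: verify the identity of the person
    # claiming parental responsibility and their relationship to the child.
    return "verify parental-responsibility holder's identity and relationship"

print(required_check(Risk.HIGH, 12, 13))
```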

Additionally, controllers must be able to show that consent to the specific processing operation taking place was “freely given, specific, informed and unambiguous”. The challenge for data controllers offering services to children is, therefore, to communicate the extent of their processing activities clearly and simply, in a way that children can understand. Children must be able to understand what data is being collected and how it will be used, and to appreciate the consequences of providing their consent, including any associated risks of the processing.

Even where controllers do not rely on data subjects’ consent for their data processing activities, but rely on another lawful basis for processing, they must still provide certain prescribed information to data subjects. This information must be provided in line with Article 12 GDPR, which requires controllers to provide information to users in a “concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child.” Of course, this is not a wholly new requirement – the predecessor data protection regime also mandated it, although in less explicit terms. Recital 58 GDPR provides further guidance: “any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand”. The information provided should help children or their parents make properly informed decisions about whether to provide the information required to access the service and to continue to use it.

In all circumstances, regardless of whether data controllers are relying on children’s consent for the data processing activities, controllers must communicate their intended use of the children’s data in a clear and easily understandable manner and must not exploit any imbalance of power between them and the children. The Draft Code recommends that controllers carry out user testing to ensure that information is provided sufficiently clearly for the age range in question.

In addition, the Home Office’s White Paper on Online Harms recommends that the proposed independent regulator introduce a code of practice that sets out, amongst other things, guidance on how to ensure that terms of use are adequate and understood by users when they sign up to the service. It is therefore clear that the transparency requirement will remain a key obligation with which data controllers must comply.

Source: https://www.mishcon.com/news/childrens-data-protection-rights-a-data-protection