
€405 million fine sends a strong message, emphasising a child-oriented approach to data processing

The Irish Data Protection Commission (DPC) has recently imposed a record €405 million fine on Meta’s Instagram, following an investigation into breaches of children’s data privacy rights on the social media platform.

In this article, Clare Daly, Child Law Solicitor at CKT, looks at some of the changes that have already come into force this year and the details of the recent decision concerning Meta’s Instagram.

Increased emphasis on protecting children’s data

Globally, lawmakers have shown an increased appetite both for the protection of children’s data and for the protection of children’s safety online. These two distinct yet parallel legislative intentions prioritise the safety and wellbeing of child users online. The Irish Online Safety and Media Regulation Bill was debated by Dáil Éireann in July 2022, where amendments were discussed. In the UK, similar legislation has been paused; the earlier part of this year saw much debate around child safety and calls for increased responsibilities, particularly on social media companies, to enhance child protection online.

While the internet was designed for adults, the presence of children online is undeniable.

The recent Cyber Safe Kids Annual Report shows that:

  • over 95% of Irish children aged 8-12 own their own smart device;
  • 87% of the 8-12 year olds surveyed use social media or messaging apps; and
  • 47% of children aged 8-12 are on TikTok.

In December 2021, the DPC launched the Fundamentals for a Child-Oriented Approach to Data Processing after an extensive consultation period, introducing 14 child-specific data protection principles to enhance the level of protection afforded to children against the data processing risks posed to them both online and offline. Read our previous article here.

In the UK, the Information Commissioner’s Office (ICO) Age-Appropriate Design Code came into force in September 2021. The Code contains 15 standards that online services must follow to comply with data protection law and to protect children’s data online. Services covered by the Code include games, apps and any other online service that children are likely to access.

Highly sensitive category of personal data

The DPC’s investigation into Instagram (Meta) commenced in September 2020, and its decision was recently reviewed by the European Data Protection Board (EDPB). The DPC’s significant decision has recently been published, and it appears that the inquiry focused on two main issues:

  • Meta permitting children aged between 13 and 17 to operate business accounts on Instagram, which required and facilitated the publication of a child’s phone number and/or email address. In light of the risks such information could pose to children, e.g. communication from dangerous individuals, the DPC considered that the contact information of child users of Instagram business accounts is a highly sensitive category of personal data in the context of this processing.
  • Children’s personal Instagram accounts being set to public by default.

The decision also considered the €405 million fine in the context of Meta’s annual turnover. Notably, the combined turnover of the Meta group of companies for the year ending December 2021 was approximately $117.929 billion.

It appears the decision will be appealed. Meta had recently launched enhanced child safety features on Instagram, including screen time limits and a ‘Take a Break’ feature.

More to Come

This is the largest fine ever imposed by the DPC, and it sends a strong message to those processing children’s data. The DPC has previously indicated its determination to drive transformation in how children’s personal data is handled. That determination is certainly highlighted by the recent record fine, which is required to be effective, proportionate and dissuasive. Hot on the heels of this decision, the DPC has announced a preliminary decision in its investigation into TikTok’s processing of children’s data. That investigation commenced in September 2021 and concerns TikTok’s settings and transparency as regards the processing of children’s personal data. The decision is expected imminently.
