
Children’s Online Safety in Ireland

Online safety presents daily as a topical issue as a result of the expanding role of social media in modern Ireland. This year, the unsuccessful High Court challenge taken by X (formerly known as Twitter) against Coimisiún na Meán represented the media giant’s direct challenge to Ireland’s Online Safety Framework, alleging “regulatory overreach”. In refusing the reliefs sought by X in July 2025, the Irish High Court found that the Code was aligned with EU legislation and regulations seeking to safeguard the public in general, and children in particular, from harmful online content.

Read on as Denise Kirwan, Partner, explores the role of Coimisiún na Meán and the provisions contained in Ireland’s Online Safety Code, specifically those in place to protect children.

The Role of Coimisiún na Meán

Coimisiún na Meán is the regulatory body overseeing all digital services in Ireland. It has a supervisory, investigatory and enforcement role and must ensure compliance with Ireland’s Online Safety Framework.

Ireland’s Online Safety Framework is made up of the Digital Services Act (DSA) (the EU legislative provision), the Online Safety and Media Regulation Act 2022 (OSMR) and the EU Terrorist Content Online Regulation (TCOR).

Coimisiún na Meán highlights that the aim of the Online Safety Framework ‘… is to reduce the risk of people (especially young people) being exposed to illegal or harmful content online.’

Ireland’s Online Safety Code

The Code is based on the OSMR Act 2022 and comprises two parts. Part A imposes general obligations on platforms ‘… to, as appropriate, include and apply in the service’s terms and conditions requirements to provide certain protections to the general public and children’.

Part B establishes specific measures to protect children and the general public online. Platforms based in Ireland are obliged to implement all of the requirements specified in Part B of the Code.

Regarding Part B, Coimisiún na Meán states it ‘…contains requirements in relation to terms and conditions to address the uploading or sharing of restricted video content, restricted indissociable user-generated content, and adult-only video content…’

Part B also contains ‘…requirements relating to terms and conditions to address the access of children to certain services and how users comply with age assurance measures. Part B provides that these requirements shall not preclude the uploading or sharing of content as a contribution to civic discourse, provided certain protections are in place.’

The Code was adopted in October 2024 and came into effect on 21 July 2025. This nine-month lead-in period was intended to give platforms the time necessary to make the systems changes required to comply with the Code, and to engage proactively with Coimisiún na Meán in doing so.

The Online Safety Code is now fully enforceable by Coimisiún na Meán, and any platform in breach may be fined up to €20 million or 10% of its annual turnover, whichever is greater.
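By way of illustration (the figures here are hypothetical): for a platform with an annual turnover of €500 million, 10% of turnover would be €50 million, which exceeds €20 million, so €50 million would be the maximum fine. For a platform with a turnover of €100 million, 10% would be only €10 million, so the €20 million figure would apply as the greater amount.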

Ireland’s Online Safety Code & Children

Content impairing the physical, mental or moral development of children

Part A of the Code requires platforms to protect children or minors ‘…from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental, or moral development’.

Coimisiún na Meán has listed the following prohibited content as likely to fall into this category:

  1. Where a person bullies or humiliates another person;
  2. Promoting or encouraging behaviour that characterises an eating disorder;
  3. Promoting, encouraging or teaching children methods of self-harm or suicide;
  4. Material that would impact the safety of a child such as dangerous challenges.

Platforms are also given discretion to establish further protections for children against other material they deem harmful to a child’s development. However, these additional measures must not impact the rights and legitimate interests of children using the service.

Age Verification

Parts A and B impose age verification obligations on platforms to prevent children from accessing material that would impair their physical, mental or moral development.

Part A imposes a general duty to have age verification systems in place for material deemed to impair the development of a child.

Part B places a specific requirement where adult-only content is shared to have ‘effective age assurance measures’ that ensure such content cannot generally be accessed by children.

The Code highlights that self-declaration of age alone is insufficient to meet the obligations under Parts A and B.

Age assurance techniques recognised under the Code include:

  • Self-declaration;
  • Hard identifiers, e.g. a passport;
  • Credit cards;
  • Self-sovereign identity;
  • Account holder confirmation;
  • Cross-platform authentication;
  • Age estimation;
  • Behavioural profiling;
  • Capacity testing.

Parental Controls

Part A imposes a general obligation on platforms to provide parental controls, under the control of users, in respect of material likely to impair the development of a child.

Part B specifically requires platforms that allow users under the age of 16 to have parental controls in place for video content and audiovisual commercial communications which may impair the physical, mental or moral development of children. Part B also contains requirements as to the purpose, function and provision of parental control systems.

Finally, the Code states that the personal information of children collected for the purpose of putting parental controls in place shall not be used for commercial purposes.

Conclusion

Ireland’s Online Safety Code represents a significant step toward safeguarding children and the wider public from harmful online content. As the enforcement powers of Coimisiún na Meán now include substantial financial penalties, this should encourage social media platforms to act proactively to meet their obligations.

Useful resources:

https://hotline.ie

Online-Safety-Guidance-Materials.pdf

Online Safety Framework – Coimisiún na Meán