
New Online Safety Legislation Signed Into Law

The Online Safety and Media Regulation Act 2022 (the “OSMR Act”) has been enacted. In this article, Clare Daly, Child Law Solicitor at CKT, gives an overview of the Act.

Online Safety

The new regulatory framework for online safety will be overseen by an Online Safety Commissioner (OSC), who will create binding Online Safety Codes to hold designated online service providers (“Providers”) to account for how they tackle harmful online content.

The OSC is also empowered under the Act to introduce an individual complaints mechanism on a phased basis, focusing initially on children, and to order the removal or limitation of harmful online content, either on foot of a complaint or on its own initiative.

What is Harmful Online Content?

Harmful content is set out in Part 11 of the new Act and falls into two main categories: 1) offence-specific categories and 2) material that is otherwise harmful.

1: Offence-Specific Categories: the Act sets out 42 different offences. A large proportion of these are offences against children, or provisions protecting against the identification of child victims or child offenders. Notably, the Act appears to be silent on identifying a child who is the subject of an order or proceedings under the Child Care Act 1991.

2: Other Categories of Harmful Online Content are subject to a two-tier test:

(a) The Online Content must be content which: bullies or humiliates another person; promotes or encourages behaviour that characterises a feeding or eating disorder; promotes or encourages self-harm or suicide; or makes available knowledge of methods of self-harm or suicide.

(b) The Online Content must also meet the risk test, which is satisfied where the content gives rise to (i) any risk to a person’s life, or (ii) a risk of significant harm to a person’s physical or mental health, where the harm is reasonably foreseeable.

Notably, while this part of the Act deals with age-inappropriate content, the Act does not provide for any age-verification measures, which is unfortunate. Earlier drafts of the Act sought to introduce robust measures requiring verification that account holders were at least 15 years old. This provision did not survive to enactment.

Removal of Content

The Act provides for an independent complaints mechanism. The complainant must first have complained to the Provider at least two days beforehand, and must engage with the Provider’s online safety code for handling complaints. However, the OSC may have regard to the rights of complainants, to the interests of any child complainant, and to the level of risk of harm, in particular harm to a child. Consequently, it appears that while the ideal position is that the internal complaints mechanism is followed first, in some cases expediency will require an immediate take-down remedy.

Fines and Sanctions

The OSC has wide-ranging powers under the Act, including the ability to:

  • audit the designated service,
  • conduct investigations and inquiries on its own initiative or on foot of complaints,
  • compel a designated service to remove or disable access to harmful content,
  • seek leave of the High Court to compel internet access providers to block access to a designated online service in the State, and
  • impose fines of up to €20 million or 10% of ‘relevant’ turnover for the previous financial year, whichever is greater.

Where the OSC imposes a fine, the Provider can appeal ‘any financial administrative sanction’ which does not exceed €75,000 to the Circuit Court; anything higher must be appealed to the High Court.

Conclusion

This is much-welcomed legislation that puts Ireland at the forefront of the global movement to regulate online services, heralding the end of an era of self-regulation.

If you have a question relating to Data Protection/GDPR, or the new legislation, please contact a member of our Data Protection and GDPR team.


