Clare Daly: Independent complaints mechanism a welcome advance for online safety
Clare Daly, child law solicitor with Comyn Kelleher Tobin (CKT), discusses Ireland’s evolving online safety legislation and the decision from the expert review group that an independent complaints mechanism within the new regulator, Coimisiún na Meán, is “feasible”.
An inquest has commenced in the UK into the death of Molly, a 14-year-old child. It is alleged that Molly died by suicide after viewing graphic online content, and the inquest takes place against a backdrop of stalled online safety legislation in the UK. Sadly, such incidents are becoming all too common.
Globally, there has been an increased appetite among lawmakers both for the protection of children’s data and for the protection of children’s safety online. This week saw ground-breaking legislation in the US, where California’s lawmakers enacted the California Age-Appropriate Design Code, modelled on a similar code in the UK. The US law includes content-based regulation of ‘harmful’ and ‘detrimental’ materials for children.
The proliferation of online safety legislation reflects a broader movement to address the risks all users face online, from longstanding social issues such as bullying to contemporary harms such as image-based abuse.
Radical overhauls of online safety legislation across the globe appear to share a common theme: holding online service providers responsible and accountable for permitting harmful content to remain on their platforms. There is a growing trend towards obliging online services to respond to complaints in a robust and timely manner.
The necessity for an individual complaints mechanism is very apparent. While ‘report buttons’ are available to users, the time-frames in which social media companies respond to complaints are unclear. And while online services profess to remove content that offends their internal community standards and rules, those standards often offer little protection where an individual is being targeted or harassed online, or where posts amount to criminal content yet do not fall foul of the standards.
Social media companies cannot reveal the identity of an anonymous poster without a court order. Where content is posted anonymously, a High Court application is required to unveil the IP address of the errant poster, which comes with significant cost and delay for the victim. Meanwhile, the individual, often a child, is left without recourse to seek removal of the post.
Online Safety and Media Regulation Bill
Ireland’s Online Safety and Media Regulation Bill was published earlier this year after many false starts in recent years. The bill proposes, inter alia, to establish a regulatory framework for online safety, providing a risk-based definition of ‘harmful online content’ and essentially placing the onus and responsibility on ‘designated’ online services to tackle harmful online content via a series of binding Online Safety Codes.
Harmful content is defined to include material amounting to criminal offences. Non-criminal material, such as cyberbullying material, promotion of self-harm or suicide, or pro-eating disorder content, which gives rise to a risk to a person’s life or a reasonably foreseeable risk to a person’s mental or physical health, could also amount to harmful content.
The new regulatory framework for online safety, via the establishment of a new independent regulator, Coimisiún na Meán, has been heralded as a radical overhaul of Ireland’s online safety regime. Within this new framework, the appointment of an Online Safety Commissioner, similar to the eSafety Commissioner in Australia, is proposed.
The bill seeks to regulate ‘relevant’ online services: those facilitating access to user-generated content, which could include social media companies, messaging services and gaming services. A ‘relevant’ online service may be deemed a ‘designated’ online service under the bill following a risk assessment by the new regulator.
The enforcement provisions of the proposed legislation include a modern suite of regulatory powers, such as the creation of legally binding Online Safety Codes. These codes will provide for a number of key areas, including content moderation and complaints handling. The bill arms Coimisiún na Meán with powerful sanctions in this respect, including the ability to levy fines on ‘designated’ online services of up to 10 per cent of annual turnover or up to €20 million for failure to abide by the Online Safety Codes.
Independent complaints mechanism (ICM)
However, despite these radical reforms, a major component of the bill — the establishment of an Independent Complaints Mechanism (ICM) — was missing from the initial draft bill in early 2022.
The ICM was the cornerstone of the Law Reform Commission’s Report on Harmful Communications and Digital Safety in 2016 which formed the impetus for this legislation. The 2016 report highlighted gaps that had evolved in the law through increased digitalisation of Irish society, which left online users exposed and without recourse to legal remedy in many scenarios.
In early 2022, on publication of the bill, the minister established an expert working group to decide upon a number of technical issues, including whether the provision of an ICM in the legislation would be viable. This expert group has now concluded that a mechanism allowing an individual to submit a complaint directly to Coimisiún na Meán is feasible.
It is envisaged that the mechanism will be operational by 2024. In the meantime, it is recommended that Coimisiún na Meán introduce the ICM on a phased basis, prioritising complaints where the online content relates to a child.
Complainants will be obliged to exhaust the internal complaints procedures of the online service before seeking recourse to the remedies afforded under the proposed legislation.
The minister is quoted as saying that where complaints are upheld, the result could be a take-down process or content limitation, “and if that is not done, then it could be viewed as a criminal offence”. It appears, however, that the new regulator’s capacity to fine online services will not extend to this aspect of the legislation.
However, the capacity to impose significant sanctions on online services for systemic failures to respond to users’ complaints clearly falls within the competencies of Coimisiún na Meán, as complaints handling is specifically to be provided for within the binding Online Safety Codes.
The enforcement powers of the new regulator, and the time-frames and processes by which this mechanism will operate, remain unclear. However, the regime represents a hugely positive step forward for online safety, most particularly in providing a remedy for the vast numbers of child users online. While the internet was designed for adults, the presence of children online is undeniable. The recent Cyber Safe Kids Annual Report shows that:
- over 95 per cent of Irish children aged 8-12 own their own smart device, and
- 87 per cent of the 8-12 year olds surveyed use social media or messaging apps, while 47 per cent of children aged between 8 and 12 are on TikTok.
The new ICM introduces an ambitious, ground-breaking remedy into the Irish legal landscape, with the hope of a viable, accessible and timely remedy for those exposed to harmful content online, in particular child users.
The new legislation represents a sea change in Ireland’s online safety regime, making Ireland one of the first jurisdictions to end the era of self-regulation by powerful online services.
Sadly, cases like Molly’s in the UK bring into sharp focus the repercussions of harmful online content, which go far beyond a screen and can have devastating consequences for children in real life.