Alex Cooney is co-founder and CEO of CyberSafeKids (www.cybersafekids.ie).

Online regulation: We must make sure that accountability is put in all the right places


In recent months, a six-hour Facebook outage provided an interesting example of our complex relationship with social media platforms and the hugely powerful companies behind them.

Users the world over were up in arms, and their frustrations over the lack of access (and what might have caused it) made global headline news. One thing was clear: we spend far more time posting, checking and scrolling on these platforms than most of us would like to admit.

Social media and gaming platforms form a vastly powerful industry, and we, their users, are heavily invested in them in terms of our time and attention. They are a source of entertainment and fun, and an important element of our socialising.

The Child Mind Institute’s 2019 Children’s Mental Health Report found that 81 per cent of teens surveyed said that social media makes them feel more connected to their friends. That was before Covid lockdowns significantly increased our reliance on it.

But there is a darker side to all of this, as the recent Facebook whistleblower revelations about negative impacts on children’s wellbeing highlighted.

These revelations followed the publication of joint research carried out by the 5Rights Foundation, which showed how the design of these platforms is putting children at risk. It provided evidence that design features are deliberately developed to increase time spent online, to target child accounts with harmful content (such as pornography and pro-anorexia material) and to deliver unsolicited messages and requests from strangers to those accounts.

Who is accountable when things go wrong for a child, or indeed any user, on an online platform? Is it the parents who let their children use these services without adequate supervision and sometimes before they are of the right age? The Government? The big tech companies that own them?

Well, the responsibility lies with all of the above. But one of those groups is profiting far more from children being online than the others.

Big tech

As it stands, all of the big tech companies have ‘community standards’ in place, which are essentially codes of acceptable behaviour on their platforms. That’s great in theory, but the policing of such standards has proven to be inconsistent, and existing safeguards are possible to circumvent.

To my mind, this means that these companies essentially dance to their own tune. That has got to change. It’s at least two years since our own Government said that the “era of self-regulation is over” (2019) and yet, that model is the one that persists.

Change, however, is on the horizon. We have the eagerly anticipated Online Safety and Media Regulation Bill coming down the tracks. Having completed its extensive pre-legislative scrutiny, the Joint Oireachtas Committee has published its report and the findings are encouraging.

Significant changes

As members of the Children’s Rights Alliance 123 Online Safety Campaign, we have been lobbying for significant changes that will substantially strengthen what is essentially a weak bill. Crucially, the recommendations include the setting up of an individual complaints mechanism, which will provide an important safety net for users who have experienced harm online, specifically where the online services have failed to provide that support themselves.

As part of its online safety campaign, the Children's Rights Alliance asked the public what they expect from this upcoming Bill, and the results were encouraging for those of us who want to see more responsibility placed on the shoulders of the online platforms for the harmful content they host.

Survey results

A thousand adults were surveyed and the vast majority (91 per cent) felt that big tech companies use their power and influence in a way that benefits themselves (and this research was gathered before the Wall Street Journal got its scoop from Frances Haugen). Some 69 per cent felt that these companies should deliberately block children from accessing content that could harm them.

Some 72 per cent felt that penalties should be introduced to encourage social media companies to stop harmful or illegal content on their platforms, and 70 per cent felt that the Government should introduce laws holding big tech and social media companies responsible for the content they allow on their platforms.

And on the important question of whether the proposed Online Safety Commissioner should have the power to investigate complaints made by the public when social media companies fail to uphold the rights of the individual, there was also clear support: 77 per cent of those surveyed were in favour, and 78 per cent favoured financial penalties for companies that fail to comply.

As host to a number of the EMEA headquarters of larger platforms like Google, Facebook, TikTok and Twitter, Ireland should be leading the way. Regulation is coming and is clearly much needed, but we must make sure that accountability is put in all the right places, because only then will it become the vital safety net it needs to be.