
CIVIL liberties campaigners say Green Party leader Eamon Ryan hit the nail on the head in his resignation speech last week.

The Environment Minister cited abuse on social media and warned of the dangers to democracy in an “algorithm driven and polarising online world”.

Olga Cronin, Senior Policy Officer in Surveillance and Human Rights, laments that we had the chance to deal with the issue but we blew it. Credit: PR Handout

Eamon Ryan stepped down as Green Party leader last week. Credit: © 2024 PA Media, All Rights Reserved

The Irish Council for Civil Liberties (ICCL) says those unseen algorithms and “recommender systems” play a key role in ramping up Ireland’s combative social media exchanges.

And the council’s Olga Cronin, Senior Policy Officer in Surveillance and Human Rights, laments that we had the chance to deal with the issue but we blew it.

EAMON Ryan's concerns about the future of democracy in a world driven by polarising algorithms are well founded.

These recommender systems are designed to feed off our emotions, boost engagement, and keep eyeballs hooked.


They find emotive videos and posts and expose them to large audiences to maximise viewership.

This creates dangerous feedback loops and fuels political polarisation - all of which lead to greater profits for the platforms.

Without that artificial amplification, dangerous or inciteful material from a small core group would not be widely seen.

Manufactured virality is what’s fuelling hate and hysteria.
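The amplification mechanism described above can be illustrated with a toy sketch. This is purely hypothetical code, not the ranking logic of any real platform: it simply shows how a feed that weights strong emotional reactions more heavily than plain views will push divisive posts above widely viewed but calmer ones.

```python
# Illustrative sketch only: a toy engagement-weighted recommender,
# not the actual ranking code of any platform.

def rank_feed(posts):
    """Order posts by a score that rewards emotional engagement.

    Each post is a dict with hypothetical fields: 'views', 'shares'
    and 'angry_reactions'. Weighting strong reactions far above
    plain views means the most emotive content rises to the top.
    """
    def score(post):
        return (post["views"] * 1
                + post["shares"] * 10
                + post["angry_reactions"] * 25)  # emotive signals dominate
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "calm_news",     "views": 5000, "shares": 20,  "angry_reactions": 5},
    {"id": "divisive_post", "views": 1000, "shares": 300, "angry_reactions": 400},
]

# The divisive post outranks the far more widely viewed calm one.
print([p["id"] for p in rank_feed(posts)])
# → ['divisive_post', 'calm_news']
```

The choice of weights is arbitrary, but the structural point holds for any scoring rule that pays more for outrage than for attention: the system hands its widest distribution to whatever provokes the strongest reaction.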

It’s fair to say recommender systems and algorithms may even play a role in influencing how people vote.


Since at least as early as 2016, digital platforms have understood that their recommender systems are amplifying this toxicity.

Internal Meta research from that year, later disclosed by whistle-blower Frances Haugen, concluded that: “64 per cent of all extremist group joins are due to our recommendation tools. Our recommendation systems grow the problem.”

The researchers concluded: “Our algorithms exploit the human brain’s attraction to divisiveness.”

In 2019 an internal Meta document discussed “hate speech, divisive political speech, and misinformation” and noted that “the mechanics of our platform are not neutral”.

Again in 2019 an internal Meta document concluded that content moderation is impossible at large scale, and the focus should be on avoiding algorithmic amplification.

It said: “We are never going to remove everything harmful from a communications medium used by so many, but we can at least stop magnifying harmful content by giving it unnatural distribution.”

So what should we do about it? It’s ICCL’s position that acting against Big Tech’s business model of algorithmic amplification, rather than trying to identify and remove harmful content in whack-a-mole fashion, is likely to be more effective. It also avoids intruding on the right to freedom of expression.

The truth is that algorithmic recommender systems are neither legally nor technically essential components of digital platforms.

And yet they are currently processing data about a user’s politics, sexuality, religion, ethnicity, and health.

This should not be happening, not least because digital platforms are required by Article 9 of the General Data Protection Regulation to obtain a person’s “explicit consent” to process personal data.

It is for these reasons that ICCL believes systems feeding off such data should be turned off by default.

There are numerous examples of the role these systems have played in bullying, in promoting male supremacist influencers, and even in pushing self-harm and suicide-related content here in Ireland.

But we have missed an opportunity. ICCL, together with more than 60 other organisations across Ireland, previously wrote to Coimisiún na Meán/the Media Commission to highlight the need for robust measures to address the harms caused by recommender system algorithms.

An initial draft code that followed was commended by ICCL for including measures that would ensure recommender algorithms based on profiling would be turned off by default.

This was a world-leading decision that would have gone to the heart of many issues we’re seeing today with Big Tech media.

But these measures didn’t survive the process and were last month removed from the Online Safety Code. This was a disappointment. It was a significant missed opportunity for Ireland to lead.

We were about to take a stand against Big Tech’s surveillance machine, one that profits from injecting toxic content into social media.

It’s because Ireland didn’t take that stand that Eamon Ryan’s concerns remain unresolved.

Eamon Ryan's concerns about the future of democracy in a world driven by polarising algorithms are well founded. Credit: Getty Images - Getty