The Irish Council for Civil Liberties sent a report to the European Commission on Thursday (14 December), asking it to follow the Irish media regulator’s lead in turning off Big Tech’s algorithmic recommender systems.
Recommender systems are the digital platforms’ inner workings that determine what content users see. In most cases, these systems use algorithms that tailor the user experience based on personal data.
Major social media platforms use data such as the user’s search history or past purchases, but also their location, age, and the device used, to build a profile that is often monetised by selling advertising space to third-party advertisers.
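In broad terms, a profiling-based recommender scores content against interests inferred from such personal data. The sketch below is purely illustrative, assuming hypothetical profile fields and weights; it is not any platform’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Hypothetical profile fields of the kind described above.
    search_history: list
    location: str
    age: int
    device: str
    interests: dict = field(default_factory=dict)  # topic -> affinity score

def score_item(profile: UserProfile, item_topics: list) -> float:
    # Score an item by how strongly its topics match the profile's inferred interests.
    return sum(profile.interests.get(topic, 0.0) for topic in item_topics)

def recommend(profile: UserProfile, items: dict) -> list:
    # items: item_id -> list of topics; return ids sorted by descending match score.
    return sorted(items, key=lambda i: score_item(profile, items[i]), reverse=True)

profile = UserProfile(
    search_history=["football results"],
    location="Dublin",
    age=13,
    device="phone",
    interests={"sport": 0.9, "news": 0.2},
)
items = {"a": ["sport"], "b": ["news"], "c": ["sport", "news"]}
print(recommend(profile, items))  # items best matching the profile come first
```

Because ranking follows inferred affinity rather than user choice, whatever maximises the affinity signal rises to the top of the feed.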
Ireland’s new broadcasting and online regulator, Coimisiún na Meán, has published a draft binding code requiring video-sharing platforms like TikTok and YouTube to stop automatically using recommender systems based on extensive user profiling.
Moreover, these companies would have to stop building profiles of users who are children or whose age is unconfirmed.
“Coimisiún na Meán is leading the world by forcing Big Tech to turn off its toxic algorithms. People – not Big Tech’s algorithms – should decide what they see and share online”, said Johnny Ryan, a Senior Fellow of the Irish Council for Civil Liberties (ICCL).
“The European Commission should learn from Coimisiún na Meán’s example and give everyone in Europe the freedom to decide,” he added.
The main rationale of the Irish regulator’s draft Online Safety Code is that recommender systems rank not only products but also online content. The fact that the proposal comes from Ireland is particularly relevant, as most Big Tech companies have their European headquarters there.
As the platforms profit from users’ attention via advertising, recommender systems tend to maximise engagement, which can promote extremist or harmful content and, in turn, fuel real-world violence, as was the case with the Dublin riots three weeks ago.
According to Amnesty International’s findings from November, when their researchers mimicked a 13-year-old user’s profile on TikTok and viewed mental health struggle-related videos, “multiple recommended videos in a single hour romanticising, normalising or encouraging suicide” appeared on their feed.
Other examples include extremist groups joining Meta’s platforms because of its recommender system, a European Commission report finding that Big Tech recommender systems fed into Russia’s disinformation about the war in Ukraine, and a Mozilla study showing that problematic content on YouTube was surfaced mostly by its recommender system.
According to Coimisiún na Meán’s draft code, video-sharing platform service providers must follow several measures when preparing a recommender system safety plan.
Measures include ensuring “that recommender algorithms based on profiling are turned off by default” and measures to ensure that “algorithms that engage explicitly or implicitly with special category data such as political views, sexuality, religion, ethnicity or health should have these aspects turned off by default.”
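The default-off requirement can be pictured as a settings object whose personalisation switches start disabled. This is a minimal sketch under assumed names; the draft Code specifies no implementation, and all fields here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RecommenderSettings:
    # Both switches default to OFF, mirroring the draft Code's
    # default-off requirement; users would have to opt in explicitly.
    profiling_enabled: bool = False           # profiling-based recommendations
    special_category_signals: bool = False    # political views, sexuality, religion, ethnicity, health

def effective_feed_mode(settings: RecommenderSettings) -> str:
    # With profiling off, the platform falls back to a
    # non-personalised feed (e.g. chronological or editorial).
    return "personalised" if settings.profiling_enabled else "non-personalised"

print(effective_feed_mode(RecommenderSettings()))  # default: non-personalised
```

The point of defaulting to off, rather than asking users to opt out, is exactly the behaviour discussed next: most people never change default settings.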
This is important because, while iOS users, for example, can decide which apps can access their data via Apple’s App Tracking Transparency, most users rarely, if ever, change their default settings.
Coimisiún na Meán published a “Consultation on binding rules for video-sharing platforms to keep adults and children safe online” last Friday (8 December) that will run until 19 January 2024.
The Code aims to protect children online, not only by banning profiling but also, for example, by tackling cyberbullying.
Irish Online Safety Commissioner Niamh Hodnett said they “will be seeking approval from the European Commission to implement the code”. The Commission will have to review the Irish law to see if there is any incompatibility with EU legislation.
Once the Code is finalised and binding, the Irish regulator will be able to impose fines of up to €20 million for breaches.
Euractiv reached out to Meta, YouTube, and TikTok for comment. TikTok declined to respond, while Meta and YouTube had not answered by the time of publication.
[Edited by Luca Bertuzzi/Zoran Radosavljevic]