
Commission probes Meta over protection of minors under EU digital rulebook


The European Commission initiated an investigation on Thursday (16 May) into whether Meta, the parent company of Facebook and Instagram, has violated the Digital Services Act (DSA) concerning the protection of minors.

The Commission is concerned that the two platforms “may stimulate behavioural addiction” and that age verification methods are inadequate, said Executive Vice-President Margrethe Vestager.

Facebook and Instagram were classified as Very Large Online Platforms (VLOPs) in April 2023 under the DSA, meaning they have to abide by strict rules on how to deal with illegal and harmful content.

The Commission is worried that the platforms and their algorithms could encourage addictive behaviours in children and steer them into rabbit holes of harmful content, ultimately exposing them to inappropriate material.

A Meta spokesperson told Euractiv the company has “spent a decade developing more than 50 tools and policies” to protect minors and “looks forward” to sharing details on its work with the Commission.

This is the second investigation into Meta under the DSA in recent months. In April, the Commission announced a probe into whether the company had done enough to curb disinformation on its platforms.

The protection of children online is a DSA priority, the Commissioner for the Internal Market, Thierry Breton, has repeatedly said.

In June last year, Breton said that Meta’s voluntary code on child protection is not working, in the wake of revelations that Instagram’s algorithms had facilitated and promoted child sexual abuse material networks.

The Commissioner posted on X on Thursday that “we are not convinced that Meta has done enough to comply with the DSA obligations — to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.

These failures, if confirmed, could be deemed DSA violations.

The Commission and member states are working on guidelines on the protection of minors as well as leveraging the EU identity wallet for age verification, an official for the EU executive told a briefing on Thursday.

The announcement of formal proceedings follows an analysis of Meta’s risk assessment report from September 2023, its responses to the Commission’s inquiries, publicly available reports, and the Commission’s own examination.

Formal investigations such as this one give the Commission the authority to take enforcement action, including interim measures.

“The law gives us flexibility on the timetable of these investigations. [In the case of TikTok], we’ve been able to act quite fast when it came, for example, to the prohibition of TikTok Lite,” a Commission official said on Thursday.

Other proceedings

In February, the Commission opened formal proceedings against TikTok under the DSA, due to possible breaches in several areas, including children’s protection.

In April, the EU executive initiated a second round of formal proceedings against TikTok following the launch of TikTok Lite in Spain and France, signalling its intention to suspend the app's rewards programme.

Just two days later, TikTok “voluntarily” suspended the rewards functions.

In December 2023, the European Parliament adopted, by a broad majority, an initiative to make digital platforms less addictive at its plenary session in Strasbourg.

[Edited by Eliza Gkritsi/Zoran Radosavljevic]
