The European Commission is pushing for better enforcement tools for the Digital Services Act (DSA), the EU’s landmark content moderation law, in two documents released on Monday (29 July) that cover the protection of minors, the regulation of influencers, and addictive design.
The Commission’s Directorate General for Communication Networks, Content and Technology (DG CNECT) is looking for companies to build enforcement and monitoring tools in a €12 million tender published on Monday.
Separately, DG CNECT released a report looking back on disinformation during June’s EU elections, calling for the voluntary code of practice on disinformation to become a formal Code of Conduct under the DSA.
The report was written by the European Board for Digital Services, a body of national content regulators chaired by the European Commission.
The tender
Under the 36-month contract, the company that wins the DG CNECT tender will set up an early warning system that monitors technological developments in real time, along with the emergence of new systemic risks or digital threats stemming from platforms.
The contractor is also to create tools for monitoring and preventing recurring risks that do not fall under the first lot, and for monitoring the compliance of online marketplaces and “online advertising provisions.”
“The contractor will be required to […] carry out [an analysis on] the risk of addiction or compulsive use to these social media services,” read the technical specifications of the Commission’s tender.
The contractor is also to analyse “the role of influencers in the sale and advertisement of illegal products” and examine “different components of the experience that minors have online”.
The protection of minors online and addictive design are top of mind for regulators and EU lawmakers. A Digital Fairness Fitness Check, expected in 2024, will review three consumer protection directives to see whether new regulation is needed.
Both topics were mentioned among re-elected Commission President Ursula von der Leyen’s priorities and are expected to be part of the fitness check.
Commission’s report
According to the conclusions of the post-election report, “the Commission and the [European] Board [for Digital Services] urge signatories to proceed swiftly with requesting the conversion of the Code of Practice on Disinformation into a Code of Conduct under the DSA.”
Established in 2018, the Code of Practice on Disinformation is a tool for self-regulation, setting industry standards in the EU with supervision by the Commission.
The Code’s signatories include Meta, Microsoft, Google, TikTok, and Twitch, as well as organisations like the World Federation of Advertisers and fact-checkers such as Les Surligneurs.
The Commission has been discussing integrating the revised version of the Code into the broader content moderation regulatory framework since March.
For now, the Code of Practice is a non-binding voluntary agreement. Once converted into a Code of Conduct, however, its measures could lead to platform audits to determine how companies are implementing them.
Moreover, if a company is sanctioned by the Commission, the fact that it is not a signatory to the Code or does not follow its guidelines could be an aggravating factor.
Elon Musk’s X left the Code of Practice in May 2023. The Commission found the company in breach of DSA rules in preliminary findings released in early July.
[Edited by Eliza Gkritsi/Zoran Radosavljevic]