
Audio communications excluded in latest draft of child sexual abuse material law

A new compromise text of the draft law on online child sexual abuse material (CSAM), dated 28 May and seen by Euractiv, excludes audio communications from the scope and tries to strike a new balance between encryption and fighting CSAM.

The regulation aims to create a system for detecting and reporting online child sexual abuse material (CSAM). It faced criticism for potentially allowing judicial authorities to request the scanning of private messages on platforms like WhatsApp or Gmail.

The compromise text was sent by the Belgian Presidency of the EU Council to the Law Enforcement Working Party (LEWP), responsible for legislative and operational issues related to cross-border policing.

The Belgian presidency also sent the text for approval to the Committee of Permanent Representatives (COREPER), gathering 27 EU ambassadors, sources close to the file told Euractiv. The next LEWP meeting is scheduled for 4 June.

France might lend its support to the new compromise, the people familiar with the file said, potentially breaking the blocking minority and clearing the way for progress on the file, which Paris and Berlin have so far held up in the Council.

Audio excluded

The new text completely excludes audio communications from foreseen detection orders, leaving visual content (images and videos) and URLs to be combed through.

In previous versions of the text, only real-time audio communications were excluded.

Detection orders mandate service providers to actively search for and report instances of CSAM.

Despite this limitation, grooming (manipulative and malevolent practices aimed at children) should still be “identified to some extent through the detection of visual material exchanged,” according to the text.

Encryption and technology

The new text says CSAM should remain detectable across all interpersonal communication services, including those with end-to-end encryption.

However, users must consent to this detection under the providers’ terms and conditions, specifically agreeing to enable this functionality in the service. Those who do not consent can still use the parts of the service that do not involve sending visual content and URLs.

End-to-end encryption (E2EE) only allows the sender and the receiver of a message to read it, keeping it private even from the platform or service provider carrying the communication, such as WhatsApp or Signal.
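
As a rough illustration of that property, the sketch below uses the open-source PyNaCl library to show that the relaying service only ever handles ciphertext, and that only the holder of the recipient's private key can read the message. The library choice, key names and message are illustrative assumptions, not anything drawn from the draft law or from how WhatsApp or Signal are actually implemented.

    # Minimal sketch of the end-to-end property: the relaying service only
    # ever sees ciphertext; decryption requires the recipient's private key.
    # Assumes the PyNaCl library (pip install pynacl); names are illustrative.
    from nacl.public import PrivateKey, Box

    sender_key = PrivateKey.generate()        # sender's key pair
    recipient_key = PrivateKey.generate()     # recipient's key pair

    # The sender encrypts with their private key and the recipient's public key.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"hello")

    # This ciphertext is all the platform relays; it cannot read the plaintext.
    # Only the recipient, holding the matching private key, can decrypt it.
    plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
    assert plaintext == b"hello"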

The technology sparked debate in the context of the file, with some arguing it is vital for data privacy and others contending that it hampers CSAM detection. The document also recognises E2EE’s importance for fundamental rights and digital security but warns against its misuse as a safe haven for CSAM sharing.

According to the new document, the provider must limit certain functions of the service to prevent the transmission of visual content and URLs unless the user gives consent.

To implement the regulation, providers of interpersonal communication services are required to use technologies to detect and prevent the spread of CSAM before reporting it to the relevant authorities.

Detection orders

The new document specifies that detection orders do not apply to accounts used by the state for national security, law enforcement, or military purposes.

Member states can allow the national Coordinating Authority to issue detection orders, contingent upon prior approval from judicial or independent administrative authorities.

Competent authorities are national judicial authorities, while the Coordinating Authority in each EU country oversees risk assessments and mitigation measures, as well as efforts to detect, report, and remove CSAM.

Reporting process

Providers must flag potential CSAM without gaining access to or control over the information, retaining data on hits for at least twelve months, or for the period specified in the detection order, whichever is longer.

In addition, they can only send reports to the EU Centre, a planned new centralised hub to help fight CSAM, after verifying the content as CSAM.

Moreover, personal data should be stored separately and reports sent to the EU Centre must be pseudonymised to protect individuals’ identities.

The new text removes a section that would have required children to be automatically notified, without notifying the provider, whenever potential new CSAM or solicitation attempts involving them were detected.

[Edited by Eliza Gkritsi/Zoran Radosavljevic]
