Europol’s declaration against end-to-end encryption reignites debate, sparks privacy concerns

Europol’s recent joint declaration with European police chiefs urges action against end-to-end encryption, citing concerns of possible justice obstruction, amid an ongoing debate about balancing data privacy with combating crime.

At the end of April, Europol published a joint declaration with the European police chiefs calling for industry and governments to take action against end-to-end encryption roll-out, saying that this technology will stop law enforcement from obtaining and using evidence against criminals.

End-to-end encryption (E2EE) ensures that only the sender and the receiver of a message can read it, keeping it private even from the platform or service provider carrying the communication, such as WhatsApp or Signal.
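The principle can be illustrated with a toy key exchange: the two endpoints derive a shared secret from publicly exchanged values, so the server relaying the messages only ever sees ciphertext. This is a hypothetical sketch with deliberately insecure toy parameters and a simple XOR keystream; real E2EE messengers like WhatsApp and Signal use the authenticated elliptic-curve Signal protocol instead.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only.
# 2**127 - 1 is a Mersenne prime; real systems use vetted 2048-bit+ groups.
P = 2**127 - 1
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # Both endpoints compute G^(a*b) mod P; the relaying platform,
    # which sees only the public values, cannot derive it.
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def xor_stream(key, data):
    # Symmetric: applying it twice with the same key restores the input.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob exchange only public keys; the server relays ciphertext.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
ciphertext = xor_stream(shared_key(a_priv, b_pub), b"hello")
plaintext = xor_stream(shared_key(b_priv, a_pub), ciphertext)
```

The point of the sketch is the trust boundary: everything the intermediary handles (public keys, ciphertext) is useless without a private key held only on the endpoints, which is precisely what law enforcement access proposals run up against.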

In the document, European police chiefs stress the importance of cooperation between law enforcement and technology companies to ensure public safety, particularly in combating crimes like terrorism and child sexual abuse.

They expressed concern that the implementation of end-to-end encryption could hinder their ability to access crucial data and identify illegal activity and advocated for a balanced approach that prioritises cybersecurity.

Their statement also called on democratic governments to establish frameworks providing necessary information for public safety.

Combating child sexual abuse

Encryption has also been at the heart of the controversy of an EU draft law aiming to detect and remove online child sexual abuse material (CSAM), with some agreeing with Europol’s views and others seeing encryption as a measure to support data privacy.

“One of the guiding principles of the Regulation is technological neutrality, thus the Regulation does not prohibit or prefer any specific tool or technology for the providers to fulfil their obligations under the Regulation, as long as these technologies and tools comply with certain safeguards,” Javier Zarzalejos, the European Parliament’s rapporteur for the CSAM file, told Euractiv last September.

Susie Hargreaves, CEO of the UK’s Internet Watch Foundation, a hotline service for people to report potentially criminal CSAM, told Euractiv that “criminals must not be given a safe place to sexually abuse children and research suggests companies can prevent the spread of child sexual abuse in end-to-end environments without compromising privacy”.

“In August 2021, one of the largest technology companies in the world, Apple, said it was possible to detect child sexual abuse material in a privacy-preserving way,” she added.

The bad guys

While this is true, Apple announced at the end of 2022 that it was scrapping the photo-scanning tool. Then, last September, the US tech giant cited privacy concerns in refusing to detect CSAM.

At the time, Erik Neuenschwander, director of user privacy and child safety at Apple, wrote in a letter, published by Wired, that scanning for one type of content “opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories”.

Chloé Berthélémy, senior policy adviser at European Digital Rights (EDRi), a nonprofit advocating for digital rights, told Euractiv it was “impossible to guarantee that encryption backdoors will be exploited solely by legitimate authorities, for legitimate reasons”, echoing Neuenschwander’s comment from last year.

Carmela Troncoso, a telecommunication engineer and researcher specialising in privacy issues, told Euractiv that we currently lack tools “to only remove encryption ‘for the bad guys'”.

“Undermining encryption is putting the whole society at risk, including those they claim to need the most protection. If the rollout of encryption is stopped, criminals will move their own platforms and develop their own protections, as they do now. Only the greater public will be left to be observed, with the bad consequences this has,” she explained.

Will Cathcart, head of WhatsApp, also highlighted the data protection benefits of the technology in an article for The Economist in December; Meta, WhatsApp’s owner, is among the latest companies to introduce E2EE.

While WhatsApp already used the technology, it was introduced to Facebook Messenger and Instagram Direct Messages only later.

Yet a post from 2018 by Gail Kent, global public policy lead on security at Meta, highlighted the benefits of E2EE while acknowledging challenges for law enforcement.

Kent stressed the impracticality of backdoors and advocated for collaboration with governments within legal boundaries, concluding with a call for user education on encryption’s strengths and limitations.

The controversy

Emily Slifer, director of policy at Thorn, an NGO focused on using technology to combat child sexual abuse, told Euractiv last September that we need to “stop pitting user privacy and child safety against each other because, with tools like the ones created by Thorn and others alongside an adequate framework with robust safeguards, we can have both”.

Thorn has its own software, Safer, to detect CSAM.
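The general technique behind detection tools of this kind is hash matching: an uploaded file's fingerprint is compared against a database of fingerprints of known material supplied by hotlines. The sketch below is hypothetical and not Safer's actual implementation; production systems use perceptual hashes (such as PhotoDNA) that survive re-encoding and resizing, whereas the plain SHA-256 shown here matches exact bytes only.

```python
import hashlib

# Hypothetical hash list; in practice these come from organisations
# such as NCMEC or the Internet Watch Foundation, never from raw images.
known_hashes = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the file's fingerprint matches a known-content hash.

    Only the hash is compared, which is why proponents argue this style of
    matching can run without inspecting or storing the content itself.
    """
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"example-known-image-bytes"))  # True: exact match
print(is_flagged(b"harmless-photo"))             # False: unknown content
```

The encryption debate turns on where such a check runs: server-side matching is impossible on E2EE traffic, which is why proposals shift to client-side scanning before encryption, the approach Apple announced and later abandoned.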

These tools, she added, “have been reliably used for years” and are “constantly being improved, getting better day by day”.

Detecting CSAM is indeed already happening, on a voluntary basis.

In January, European governments submitted a document illustrating how they are implementing the temporary regulation aimed at preventing CSAM, detailing how the rules are already being used to detect suspected content.

The cases, originating from various sources including Europol and online service providers, showcased successful detection and prosecution efforts across EU member states, aided by cooperation with international organisations like the National Center for Missing & Exploited Children (NCMEC).

The interim regulation was meant as a temporary solution, with a view to adopting a permanent law to fight CSAM.

Following successive delays in adopting the permanent regulation, the Commission, the Parliament, and the Permanent Representatives Committee (COREPER) each proposed different extensions for interim rules.

Finally, in April, the European Parliament supported extending an exception to EU privacy regulations until 3 April 2026.

Euractiv reached out to the Commission for a comment about the joint declaration but did not receive a response by the time of publication.

[Edited by Zoran Radosavljevic]
