
Paris Olympic Games are a test for AI video surveillance


Artificial intelligence (AI)-powered video surveillance has been authorised on an experimental basis for the Summer Olympic Games in Paris, stoking fears among organisations that defend citizens’ digital rights.

In a decree issued on Friday (19 July), the Paris police prefecture authorised the experimental use of algorithmic video surveillance (VSA in French) tools in 46 Paris subway stations, in line with the Olympic and Paralympic Games law of 19 May 2023, which sets the legal framework for the experiment until the end of March 2025.

The games in the French capital will take place from 26 July to 11 August.

“The use of VSA in 46 Paris subway stations, even those where no sporting events will be taking place, raises questions about the purpose for which they are being used,” Noémie Levain, head of legal and political analysis at the rights and freedoms association La Quadrature du Net, told Euractiv.

“Will the cameras installed at République subway station, for example, be used to monitor people going to demonstrations?” she asked.

According to the Ministry of the Interior, AI will only be used within public transport networks to detect certain situations, such as “crowd movements,” moments when the “density of people is too high,” or the “presence of abandoned objects.”

However, these use cases entail risks and may in time lead to abuse, as their definitions are sometimes too broad, Katia Roux, a specialist in technology and human rights at Amnesty International France, told Euractiv.

The events AI software has been trained to detect range from “the presence or use of weapons” to “fire outbreaks,” says the Interior Ministry website.

While these use cases are not problematic in themselves, others, such as “failure to follow the direction of traffic,” raise concerns among experts that such events could be interpreted too broadly.

Respecting the legal framework

Testing AI solutions is taking place “within a clear legal framework that safeguards fundamental and individual freedoms,” says the Interior Ministry’s website.

The tools deployed do not allow facial recognition, and the AI algorithms have only been trained to detect eight high-risk situations, it specifies.

The legal framework for the use of VSA was also approved by the Constitutional Council in May 2023.

The EU’s AI Act, which will apply uniformly across the EU as of 1 August, prohibits the use of real-time biometric identification, which includes facial recognition, in public places, except for certain law enforcement purposes, such as searching for missing children or preventing imminent terrorist attacks.

However, “algorithmic video surveillance can analyse biometric data (body data, behavioural data, gait), which is protected personal data,” Roux warned.

In her view, VSA is a “highly intrusive technology which, like facial recognition, represents a threat to fundamental rights.”

The next step

The Ministry asserts that VSA makes it possible to “detect unusual situations, without ever overriding the decision [taken by humans]”.

However, “it’s a political decision to want this technology,” explained Levain.

“The aim is to validate the belief that the tool can be used while sidestepping its problems,” she continued, referring to the issues of mass surveillance, the biases of artificial intelligence algorithms and the dangers of reinforcing discrimination.

Organisations that protect fundamental freedoms online warn of a slippery slope and fear that French legislators will extend the experiment to new use cases, or perpetuate the use of VSA in France beyond March 2025. This extension was already suggested in a Senate report.

“Technologies that enable mass surveillance […] are quite simply incompatible with human rights” according to international law, said Roux.

A technology that is not yet “mature”

The tools deployed during the Olympic Games are not mature, Senator Agnès Canayer (Les Républicains, EPP) told a press conference in April on the fact-finding mission into the application of the Olympic and Paralympic Games (JOP) law.

“The Olympic Games will not be the end goal of algorithmic video surveillance, but the opportunity to test the usefulness of this technology,” she said.

The JOP law originally provided for the experimental use of remote, real-time analysis of images captured by drones, but the tools deployed will only analyse images from fixed cameras.

[Edited by Zoran Radosavljevic]
