
AI Act: MEPs mull narrow facial recognition technology uses in exchange for other bans


The European Parliament might be on the verge of agreeing to some narrow conditions for using remote biometric identification technologies in real-time as part of a package deal extending the list of prohibited practices.

Remote biometric identification (RBI) has been a critical contention point in the negotiations of the AI Act, a draft EU law intended to regulate Artificial Intelligence systems based on their potential to cause harm.

The AI draft law is in the last phase of the EU legislative process, the so-called trilogues, whereby the EU Parliament, Council and Commission hash out the final provisions.

On Friday (3 November), the offices of the European Parliament’s co-rapporteurs Dragoș Tudorache and Brando Benifei circulated a compromise text dropping the complete ban on real-time RBI in exchange for concessions on other parts of the file.

A similar version of the package deal was circulated by the Spanish presidency of the EU Council of Ministers on Sunday. Euractiv has seen both documents but has withheld some details so as not to compromise its sources.

Remote biometric identification

In the original proposal, the Commission suggested allowing RBI technology in real-time only in specific cases, such as tracking down a missing person, preventing a terrorist attack or locating the suspect of a serious crime.

In the European Parliament, a cross-party majority voted in favour of a total ban on using these systems in real-time over fears of mass surveillance. By contrast, EU governments were adamant about leaving some leeway for law enforcement to use this technology.

However, in the latest compromise, the exceptional circumstances are back with some modifications. For tracking down a suspect, the text now specifies that the criminal offence must fall under a new list and be punishable by a maximum sentence of at least five years.

These severe offences are listed in a new annex and include terrorism; human, drugs and weapons trafficking; child sexual exploitation; murder; kidnapping; crimes within the jurisdiction of the International Criminal Court; hostage-taking; and rape.

Additionally, law enforcement agencies can only make use of the authorised real-time RBI applications if they have registered the system in the EU public database and completed a fundamental rights impact assessment.

A judicial authority should usually validate the real-time usage of RBI systems. However, in exceptional cases, the authorisation can be requested ex-post within 48 hours.

Both the Parliament’s and Council’s texts mention a role for national authorities in overseeing the use of RBI by law enforcement authorities, with the Commission empowered to launch infringement procedures against EU countries where the provisions are not respected.


Prohibited practices

In exchange for the concessions on remote biometric identification, the EU Parliament obtained an extension of the list of banned AI applications.

Following the notorious Clearview AI example, wording was added to prohibit “AI systems that create or expand facial recognition databases through the untargeted and wide-scale scraping of facial images from the internet or CCTV footage.”

Similarly, MEPs introduced a ban on systems using biometric categorisation technologies to infer sensitive information about people, such as political orientation or sexual preferences. Emotion recognition technology is also forbidden in workplaces and educational settings.

Diverging views remain on the use of AI for predictive policing, which MEPs want banned but member states want to keep as a high-risk application.

Law enforcement exemptions

The EU Council introduced several significant exemptions for law enforcement agencies.

Users must monitor the operation of high-risk AI models and inform the distributors whenever they identify a serious incident. EU countries introduced wording stating that this obligation does not cover sensitive operational data from law enforcement authorities, a specification the Parliament seems to have accepted.

Moreover, public bodies should not use high-risk systems that are not registered in the EU database. Here, the compromise seems to be to delete the exemption for law enforcement and border control authorities.

Parliamentarians also appear to have accepted that, in exceptional circumstances, law enforcement and civil protection agencies can use a high-risk AI application that has not yet undergone the conformity assessment.

Provisions aimed at preventing the disclosure of sensitive operational data also seem to have been agreed, for instance regarding the amount of information that must be provided to the EU database and during investigations by market surveillance authorities.

National security exemption

Under pressure from France, the Council adopted an extensive exemption from the scope of the AI Act for systems used or made available for military, defence or national security purposes by any entity.

In a non-paper from last month, the Commission stated that this wording goes against the EU Treaty. At the time, the EU executive also proposed a compromise text, largely taken up in the latest document but with two significant modifications.

The sentence: “This Regulation shall not apply to AI systems developed or used exclusively for military purposes” was added.

At the same time, the wording stating that “this Regulation is without prejudice to the competences of the member states with regard to their activities concerning military, defence, or national security” was maintained. Still, the reference that this must be in line with EU law was removed.

Alternative wording might be sought in line with the national security exemption of the Data Act.

[Edited by Nathalie Weatherald]
