25th Feb 2024


The AI Act — a breach of EU fundamental rights charter?

  • The use of particularly intrusive AI systems, such as the ones that (claim to) infer the emotions of persons from their biometric data, affects the fundamental right to privacy, autonomy and dignity of the person concerned (Photo: Hitesh Choudhary)

The AI Act, which is now set to be finally adopted by MEPs in April, provides some specific rules for the use of emotion recognition systems (ERS) for law enforcement. For instance, police authorities deploying ERS are not required to inform people when they are exposed to these systems.

The use of AI systems that claim to infer emotions from biometrics (such as the face and voice) is only prohibited "in the areas of workplace and education institutions" (subject to an unclear 'safety' exception), not in contexts such as law enforcement and migration.


The scientific validity of ERS raises serious concerns, notably because the expression of emotions varies considerably across cultures and situations, and even within a single person — making such systems not only inaccurate but also inherently discriminatory.

The scientific basis of facial-emotion recognition systems has been called into question, with critics equating its assumptions with pseudo-scientific theories such as phrenology or physiognomy.

It is about systems such as iBorderCtrl, where a virtual policeman uses a webcam to scan your face and eye movements for signs of lying. At the end of the interview, the system provides you with a QR code that you have to show to a guard when you arrive at the border. The guard scans the code using a handheld tablet, takes your fingerprints, and reviews the facial image captured by the avatar to check that it matches your passport. The guard's tablet displays a score out of 100, indicating whether the machine has judged you to be truthful or not.

In addition to the 'snake-oil AI' issue, there is more:

First, the intrusive nature of these systems will certainly increase the imbalance of power between the person concerned and the public authority.

Second, there is the possible 'black box' nature of the AI system: once a person is made subject to ERS, even 'just' to assist decision-making, the actions of the law enforcement or migration authorities become 'unpredictable': what will be the impact of any of my voluntary or involuntary micro-expressions on my reliability score?

The EU Court of Justice has previously argued that AI producing adverse outcomes which are neither traceable nor contestable is incompatible with the protection of fundamental rights. And fundamental rights cannot be limited by an AI which is neither fully understandable nor fully controlled in its learning.

Third, the use of such AI entails the objectification of the person and a systemic disregard for bodily sovereignty. The use of particularly intrusive AI systems, such as the ones that (claim to) infer the emotions of persons from their biometric data, affects the fundamental right to privacy, autonomy and dignity of the person concerned.


The concept of dignity featured prominently in the rejection by the top court of certain examination methods in the context of migration that intrude into the personal sphere of asylum seekers. In a previous case, the court ruled that sexual orientation 'tests', conducted by national authorities in the assessment of fear of persecution on grounds of sexual orientation, would by their nature infringe human dignity.

In my view, the same would apply — and even more so — to the use of ERS AI in the assessment of an applicant for asylum.

Notwithstanding the unacceptable risks for the rights and freedoms of the person affected by the use of these systems, notably by public authorities, ERS and other AI 'tools' such as the polygraph (the lie-detector) are merely classified as high-risk AI, rather than prohibited.

But these systems would be 'CE-marked' and enjoy free movement within the internal market.

The fundamental rights impact assessment, to be done by the users of the AI before its deployment, can hardly be considered a satisfactory remedy.

First, it would not solve the 'foreseeability' issue (a person may or may not ultimately be subject to ERS AI, depending on the outcome of the 'case by case' assessment). Second, and most importantly, the issue at stake is not about 'risks'. The harm to mental integrity suffered by the asylum seeker is not a risk: it is a certainty.

The right to be free from statistical inferences on our state of mind is indeed a right, and not a matter of circumstances. Morphing rights into risks is the unprecedented regulatory shift of the AI Act, which in many cases in my view does not stand the test of legality.

The risk-regulation approach can be at odds with the 'right to have rights'. The harm to dignity stemming from physiognomic AI — which is per se unfair and deceptive — is not a matter of procedural safeguards, such as 'human in the loop' or 'notice and consent', or of technical fixes.

If we consider that in most cases the requirements for the management of risks are entrusted to technical standards — entering areas of public policy, such as fundamental rights — the shaky legal foundations of the AI Act become even more apparent.

In advocating for an AI regulation that puts human rights before other considerations, I focussed on the 'datafication' of the human body. Differential treatment of persons based on inferences ('biometric categorisations', 'recognised' thoughts, inclinations, beliefs, intentions) drawn from the body (the face, the voice, the way we walk) by a machine is not just a matter of 'snake oil' AI: it is a line no good society should cross.

Other aspects of the AI Act — namely the regulation of the use by police of real-time and retrospective facial recognition of persons in publicly accessible spaces; the 'categorising of biometric data in the area of law enforcement'; and predictive policing based on profiling — might also be assessed by the EU court as failing to meet the requirements of foreseeability, necessity and proportionality.

For these reasons, I hope MEPs will not approve the AI Act in its current form. The EU is a guiding light on fundamental rights in a challenging world.

The normalisation of arbitrary 'algorithmic' intrusions into our inner life via our physical appearance (be it at job interviews, when walking the street, or via chat or video bots at the borders) would leave a legacy of disregard for human dignity.

Decision-making based on our putative emotions, inferred by AI from our face or voice ('what the body would say'), threatens our fundamental rights heritage: the dream of our parents, which somehow became reality after the Second World War, and which we should carefully entrust to our children.

The trilogue agreement on the AI Act will be voted on by the parliament's internal market committee on Tuesday (13 February) and by the plenary in April, as the final steps to adopt the text.

Author bio

This op-ed is written by an EU civil servant who wishes to remain anonymous to avoid any attribution of this personal opinion to his or her institution. Their real identity is known to the editors of EUobserver.


The views expressed in this opinion piece are the author's, not those of EUobserver.

