
Opinion

Why EU needs to be wary that AI will increase racial profiling

Amnesty found that 78 percent of people on the UK's Gangs Matrix in 2017 were black, despite only 29 percent of 'gang-related' crimes being committed by black people (Photo: Tony Gonzalez)

Whether it be 'predictive' algorithmic systems used in the Netherlands to profile Eastern Europeans and Roma for 'pick-pocketing', or secretive lists of suspected criminals such as the UK's Gangs Matrix, there is a growing reliance on criminal risk-scoring technologies by police forces across Europe.

Police claim these systems can predict where crime will happen and who will commit it.


These predictions and judgments can have very real impacts on the people subject to them, increasing the frequency of interactions with police.

In some cases, these risk-scores are heavily intertwined with social welfare systems, child protection watch-lists, and the classification of certain neighbourhoods as 'ghettos'.

Central to predictive policing systems is the notion that risk and crime can be objectively and accurately forecasted. Not only is this presumption flawed, it demonstrates a growing commitment to the idea that data can and should be used to quantify, track and predict human behaviour.

The increased use of such systems is part of a growing ideology that social issues can be solved by allocating more power, resources - and now technologies - to police.

Data might be able to indicate that something may occur in the near future, but not why. As such, these technologies are about controlling behaviour that police have deemed 'unacceptable'.

Fundamentally determined by historical practice and socio-economic biases, the data underlying law enforcement systems is inherently racialised and classed.

The data reflects patterns of policing, not crime.

Based on police presumptions

Discriminatory outcomes from predictive policing are therefore not an accident. They are the result of a process which seeks to "optimise" existing policing practice.

For example, Amnesty found that 78 percent of people on the UK's Gangs Matrix in 2017 were black, despite only 29 percent of 'gang-related' crimes being committed by black people.

Using indicators such as previous stop-and-search records, association with other 'gang nominals', social media activity, and even the types of music listened to, the Matrix purports to predict who will commit crimes in the future.

The entire system is based on highly discriminatory police presumptions.

These tools tend to be deployed on crimes such as burglary, theft and other 'anti-social behaviour'. Prioritising pre-emptive police intervention on activities that are constructed to be associated with working class, migrant and racialised communities hardwires value judgments about which behaviours should be prioritised by law enforcement and which should not.

In the process this further criminalises racialised and poor communities.

In separate proposed legislation, the European Commission is attempting to enhance the capacity of Europol (the EU policing agency) to make use of big data – both in its operations and in the context of 'research and innovation'.

As argued by Statewatch, the European Council is already heavily subscribed to the idea that police should make more use of artificial intelligence.

However, the EU using data "hoovered up from the member states" for research or operations presents a major concern of reinforcing historic patterns of racist policing as well as deepening police surveillance.

The EU's tech-deterministic worldview and vested interest in AI-driven policing cast serious doubt on whether it will take the decisions necessary to address the harms at stake.

It raises the question – will the upcoming regulation really regulate "high risk" AI or will it enable it?

Everything will depend on the obligations towards institutions that deploy these technologies. Categorising harmful uses of AI as 'high risk' with relatively weak procedural requirements is likely to provide a clearer roadmap for developers and states to more easily deploy technologies like predictive policing rather than deter them.

As set out in its Digital Compass, the EU will be striving to meet the goal that "three out of four companies should use cloud computing services, big data and Artificial Intelligence" by 2030.

By design, the EU's approach is likely to enable a market of unjust AI.

At the recent EU Anti-racism summit, the commissioner for equality, Helena Dalli, argued that the EU must strive to become an 'anti-racist' union.

In the field of digital policy, this would mean developing a regulatory model firmly based on human rights and social justice. Issues of non-discrimination and fundamental rights would have to be at the core of the approach, rather than considered after competition and industrial policy.

A truly people-centred AI regulation would take a step back and acknowledge the inherent harms AI will perpetuate if deployed for certain purposes.

As outlined by 62 human rights organisations, the EU needs to set clear limits or 'red lines' on the most harmful uses of AI, including predictive policing, biometric mass surveillance and applications that exacerbate structural discrimination.

The EU needs to break from its enabling approach to AI. It must commit to encouraging only those applications that demonstrably benefit people, not just public authorities and companies.

Author bio

Sarah Chander is a senior policy advisor at European Digital Rights. Fieke Jansen is a fellow of the Mozilla Foundation.

Disclaimer

The views expressed in this opinion piece are the authors', not those of EUobserver.
