Computer says no? The EU AI Act will close some gaps in the rules, but many will remain, leaving everyday banking, insurance and investment decisions in the hands of opaque models with too little oversight (Photo: Popicinio-01)

Opinion

How AI is already hurting European consumers in retail finance

In recent months, AI has dominated headlines, with Donald Trump’s AI plan, the phased rollout of the EU’s AI Act, and a fresh wave of corporate lobbying to stall new rules.

Competitiveness, growth, red tape, risk, innovation, productivity. Buzzwords swirled around Brussels in a bid to define the future of AI regulation.

When August arrived, the political noise faded into the summer recess, but AI is continuing to shape real-world decisions, and the consequences are already here. 

AI isn't just about potential growth, innovation or future risk. In retail finance, AI risks have become a reality. The sector is among the most active adopters of AI in the EU, and these systems are already deciding who gets a loan, how much people pay for insurance, and whether they have access to a bank account.

And while the AI Act will close some gaps in the rules, many will remain, leaving everyday banking, insurance and investment decisions in the hands of opaque models with too little oversight.

Account closures

Across Europe, consumers are being cut off from basic banking, not because of fraud or wrongdoing, but because algorithms flag them as risky.

Many of these closures are triggered by semi-automated anti-money laundering tools that rely on incomplete or inaccurate data.

Once flagged, customers often face account freezes with no explanation and no clear way to appeal. These closures are not trivial. Customers report being abruptly cut off, unable to receive salaries, pay rent, or access public services.

Credit scoring

Credit scoring tells a similar story.

AI systems often assess creditworthiness using large volumes of personal and behavioural data, from spending habits to digital activity, but the logic behind these assessments is opaque, and the outcomes can be deeply unfair.

A borrower might be denied a loan not because they can’t repay it, but because the AI black box flags their digital footprint as “risky”.

And these systems are prone to produce false positives that hit vulnerable consumers hardest, such as migrants and people with low incomes.

AI-driven profiling practices also shape decisions in insurance, where 50 percent of non-life insurers in the EU already use AI.

The current rulebook is inadequate.

Layered sectoral laws predate the widespread use of AI and fail to capture the harm caused by opaque decision-making.

The EU’s flagship AI Act is meant to do what sectoral laws cannot — apply rules tailored to the specific risks and challenges posed by AI.

It rightly classifies financial service use cases like credit scoring and life insurance pricing as “high-risk”, triggering requirements for risk management, transparency, human oversight and data governance.

But even here, critical gaps remain. 

Bank account access, home and motor insurance, and retail investment advice fall outside the high-risk list, despite their role in everyday life. As AI adoption accelerates across the financial sector, more consumers will be exposed to opaque systems that make consequential decisions without explanation or recourse.

And in its current form, the AI Act leaves this reality unaddressed.

The risks of AI are not speculative. It’s not a question of whether AI causes harm; it already does.

The real question is whether the promise of 'competitiveness' will serve as justification for an incomplete rulebook, or whether the EU will designate financial services uses as high-risk under the AI Act and plug the gaps.

Consumers are already falling through.


Disclaimer

The views expressed in this opinion piece are the author’s, not those of EUobserver

Author Bio

Peter Norwood is senior research and advocacy officer at Finance Watch, where he leads work on retail finance, consumer protection and financial inclusion. He previously worked at the UK Financial Conduct Authority and the European Commission. Max Kretschmer is press officer at Finance Watch
