A screenshot of the ad DG HOME did not want to show X users in certain sensitive ad segments

‘Nazi’, ‘kosher’, ‘Green party’: how EU institutions and firms misused ad-targeting on X


What do “Marine Le Pen”, “Nazi”, “FEMYSO”, “#EvaKaili”, “communism”, “Viktor Orban” and “Christianity” have in common?

At first glance, not much, which makes it all the more puzzling why X users who’ve shown an interest in these topics or figures were purposefully excluded from seeing ads run by the Directorate-General for Migration and Home Affairs (DG HOME) at the European Commission between September 2023 and May 2025. 

This is one of many findings in a new report by AI Forensics, an organisation that describes itself as “digital detectives who shine a light on hidden algorithmic injustices, and work to bring accountability and transparency to the tech industry.”

The report, titled When Personal becomes Profitable, was published on Wednesday (18 June). It examined ad-targeting data from dozens of organisations, drawn from the publicly available repository that X is required to maintain under the Digital Services Act (DSA).

Paul Bouchaud, the researcher who authored the report, told EUobserver that they became interested in the topic after noticing that they were being targeted by ads on X based on criteria that, under the EU’s privacy regulation, are known as ‘sensitive categories’.

Under the General Data Protection Regulation (GDPR), these are defined as “special categories of personal data”, and include data revealing racial or ethnic origin, political opinions and religious or philosophical beliefs, as well as data concerning health, sex life or sexual orientation.

The GDPR forbids processing these types of personal data by default, allowing only narrow exceptions such as explicit consent. The DSA goes further, banning online platforms from targeting ads based on them at all.

“‘Special category’ data enjoy particularly strong protections in the GDPR because they relate to the intimate life of a person. Companies are not permitted to use these data by default,” concurs Dr Johnny Ryan, senior fellow at the Irish Council for Civil Liberties and an expert in digital surveillance, data rights, competition and antitrust, and privacy.

And in December 2024, the European Data Protection Supervisor issued a decision against the European Commission for illegally targeting advertising at citizens using “sensitive” personal data on their political views, over the very campaign for the controversial chat control regulation that DG HOME was advertising.

And yet the report found that DG HOME was not alone in excluding users segmented by political and religious interest. It also found examples of the European Health and Digital Executive Agency running ads for Horizon Europe in France that excluded politically sensitive segments such as "fascist", "ultra nationalist" and "communist". The agency also excluded, for whatever reason, the segment "mass murder".

The report also found evidence that companies across Europe were excluding user segments. TotalEnergies, for example, chose not to show ads to French users with an interest in Green party politicians, or to those who had been segmented as interested in "kosher".

Dell Technologies in Germany chose not to show ads to users segmented as interested in "Nazismus" [Nazism], but also excluded those interested in "#lesbisch" [lesbian]. Pharma company Merck Healthcare did not want its ads shown to segments labelled "Jesus", "church" and "bible".

In all, the report covers around 30 organisations.

Tip of the iceberg

However, Bouchaud said they would have liked to examine 10,000 brands but, because X’s ads repository is so buggy that it only works intermittently, they had to settle for far fewer. “I built something that tried automatically, a small script that just tried until one worked. That's why there are only 30 brands in the report and not 10,000. There's a huge limitation on the website.”
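
Bouchaud has not published the script itself, but the retry approach they describe can be illustrated with a minimal sketch. The function name fetch_ad_targeting and the retry parameters below are hypothetical placeholders, not X's actual repository interface:

```python
import time

def fetch_with_retry(fetch_ad_targeting, max_attempts=20, delay_seconds=30):
    """Keep calling a flaky data source until one request succeeds.

    fetch_ad_targeting is a hypothetical callable standing in for whatever
    query the researcher ran against X's ads repository.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            # The call succeeds only intermittently, per Bouchaud's description
            return fetch_ad_targeting()
        except Exception as error:
            print(f"Attempt {attempt} failed: {error}")
            time.sleep(delay_seconds)  # wait before trying again
    raise RuntimeError("No successful response after all attempts")
```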

X, for its part, writes on its ads help center that “when you use X to follow, post, search, view, or interact with posts or X accounts, we may use these actions to customise X Ads for you.” 

Bouchaud explains that X then uses this information to offer advertisers an extensive menu of keyword-based segments, which advertisers can either target their ads towards or exclude from seeing them.
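
As a purely hypothetical illustration of how such include/exclude targeting is typically structured (the field names below are invented and do not reflect X's real advertising interface), a campaign configuration might look like this:

```python
# Hypothetical campaign configuration, for illustration only
campaign = {
    "name": "Example awareness campaign",
    "target_keywords": ["technology", "innovation"],   # show ads to these segments
    "excluded_keywords": ["Green party", "kosher"],    # never show ads to these segments
}

def should_show_ad(user_interests, campaign):
    """Return True if a user matches the targeting and none of the exclusions."""
    interests = set(user_interests)
    if interests & set(campaign["excluded_keywords"]):
        return False
    return bool(interests & set(campaign["target_keywords"]))

print(should_show_ad(["technology", "kosher"], campaign))  # False: user is in an excluded segment
print(should_show_ad(["technology"], campaign))            # True
```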

They also said that X offers ways to broadly exclude ads from appearing around controversial content or conversations, so excluding specific categories is not necessary to ensure what advertisers call "brand safety". In other words, “the argument saying we need to be targeting could be valid, but you can do it in a more inclusive way,” Bouchaud says.

In fact, X itself stipulates in its advertising guidelines that — despite offering them — “targeting customers based on sensitive categories is a violation of our Ads policies,” but also that “you are responsible for all your promoted content and targeting on X. This includes complying with applicable laws and regulations regarding online advertisements.” 

Ryan, however, says he thinks “the problem is that the company in question is offering this product as a normal thing. This problem is a result of enforcement failure by X's lead data protection supervisory authority (Ireland) and the DG Just's failure to ensure that the GDPR is fully applied by Ireland. So, while it is true that advertisers who use digital services should be careful, the essential intervention is to enforce the law against the companies that offer the service so that no advertisers can fall into this trap. X should not be processing these data at all, except in the rare circumstance where a person trusts them to do so.”

And indeed, the DSA’s Article 26(3) states that providers of online platforms “shall not present advertisements to recipients of the service based on profiling … using special categories of personal data referred to in Article 9(1)” of the GDPR. 

Bouchaud agrees. “The report has a goal to question the regulation as it is. This practice is illegal but if we collectively think it's okay, then the regulation should bend it, and if not, it should just be applied and forbidden.”

As part of the report, AI Forensics also created a website where X users can upload their data to check whether they were targeted under sensitive categories.

DG HOME had not replied to a request for comment at the time of publication.

This article was updated on 20 June at 9:00AM to correct Paul Bouchaud's last name and pronouns.




Author Bio

Alejandro Tauber is Publisher of EUobserver. He is Ecuadorian, German, and American, but lives in Amsterdam. His background is in tech and science reporting; he was previously an editor at VICE's Motherboard and publisher of TNW.


