The Berlin-based NGO Democracy Reporting International (DRI) and our legal partner Gesellschaft für Freiheitsrechte (GFF) are suing Elon Musk's X to obtain access to the platform's publicly available data in order to conduct research on online public discourse during the German election campaign.
Within just three days of the filing, the Regional Court of Berlin issued the injunction requested by DRI and GFF, ordering X to comply with Article 40(12) of the Digital Services Act (DSA), which compels platforms to provide access to publicly available data to researchers "without undue delay".
X challenged the decision, and the court accepted one of its arguments: that the judge who issued the initial injunction was potentially biased, owing to a previous traineeship at GFF. The court dismissed the bias claims against the other two judges.
The case is now returning to the court, which will decide on the preliminary injunction.
The ruling will arrive too late for DRI to conduct research on the German electoral discourse on X. Still, the decision will set an important precedent for European digital researchers at a time when US tech giants are pushing back against EU digital regulations.
Let’s break down what happened.
Article 40(12) of the DSA mandates that major online platforms, such as X, provide access to publicly available data to eligible researchers “without undue delay”.
Since last year, all major platforms have rolled out mechanisms to comply with Art. 40(12) of the DSA. This provision sought to formalise what platforms like Meta and Twitter had already been doing voluntarily: offering platform data to researchers through APIs in a timely, straightforward, non-bureaucratic manner.
As part of our research agenda, we gained access to TikTok’s Virtual Compute Environment (a secure data room designed for civil society organisations in lieu of its API) and Meta’s Content Library.
X, however, was not cooperative.
After an eight-month process that we started before the 2024 European Parliament elections — marked by two requests for additional clarification (which we promptly addressed), long stretches of silence, and repeated follow-ups — our request was ultimately rejected. Notably, this rejection came only after we asked for an update following four months of silence.
In January, we applied again for data access, this time exclusively with the German parliamentary elections in mind, emphasising the urgency of our request given the election date of 23 February.
Despite this, we received no response.
Given the ongoing delays, lack of communication and prohibitively high costs of the X Commercial API, we decided to bring the matter to court.
The DSA was designed on a crucial premise: digital platforms wield immense power, providing the most important fora for political debate.
With access to platform data, researchers can understand trends on social media: how misinformation spreads, how extremist content is amplified, and how algorithms shape user behaviour, all of which have real-world consequences. Without such evidence, debates remain theoretical and regulation pointless.
Art. 40(12) of the DSA was not a radical new imposition — it merely formalised what was once common practice. Until just a few years ago, most major platforms routinely provided data to researchers, journalists, civil society, and even university students.
Twitter was once among the platforms most beloved by researchers, providing easy access to large amounts of its data. This openness fuelled extensive research on platform dynamics and their societal impact.
The DSA simply ensures that such access is not left to the whims of corporate goodwill but is instead enshrined in a clear, enforceable framework.
The ruling will carry significant legal implications, establishing case law on key expressions in Article 40(12) of the DSA that remain undefined but are crucial to its enforcement.
Take “undue delay”, for example. Until now, platforms have enjoyed broad discretion in determining how long they can take to respond to data access requests. This ruling may help establish a benchmark for what constitutes a reasonable timeframe. It will also confirm what type of research qualifies as identifying “systemic risks within the EU”, particularly regarding negative effects on civic discourse and democratic processes, a key eligibility requirement for researchers under Article 40(12).
If the current decision is upheld, it could establish jurisdiction for courts in the country where the researcher resides.
This would be a crucial precedent: instead of having to file lawsuits in Ireland, where X Corp's European headquarters are located and legal costs are prohibitively high, affected research institutions and individual researchers could sue platforms in their own countries. This would make enforcement more accessible and effective across the EU.
The decision may also strengthen the European Commission's leverage in enforcement cases, as a judicial authority (in this case, a German court) will have already established a benchmark for compliance. It may also clarify the position of researchers and civil society by opening a key, though challenging, pathway to accessing platform data, even when companies refuse to cooperate.
Last but not least, the case carries symbolic weight, setting a powerful precedent for how the EU can engage with US tech giants. By affirming the enforceability of the DSA, the court could signal to other jurisdictions in the EU that robust oversight of big tech is achievable.
In its challenge to the ruling, X argued that our application, practically identical to those sent to Meta and TikTok, did not meet the criteria of researching “systemic risks within the EU”.
The company also issued a press release through one of its official public policy accounts, framing the case as a threat to freedom of expression. But this case has nothing to do with freedom of expression; it is about platform transparency.
The final decision will take a few weeks.
Meanwhile, the European Commission itself has asked X to hand over internal documents about its algorithms as part of an investigation into content moderation. X has been under investigation under the DSA since December 2023 over how it tackles the spread of illegal content and information manipulation.
With the German election now over, this means in practical terms that DRI had no opportunity to examine the platform's role in this crucial vote. This isn’t just a setback for us; it’s a missed opportunity for the public to gain further insight into how these platforms shape the information environment. And DRI is far from alone in being denied data access.
We remain committed to seeing the judicial process through, hearing X’s arguments, and respecting the final court decision.
Daniela Alvarado Rincón is policy officer for digital democracy and Ognjan Denkovski is research coordinator at the Berlin-based Democracy Reporting International NGO, where EUobserver columnist Michael Meyer-Resende is director.