ECJ ruling: store passenger data only to a limited extent

Status: 06/21/2022, 1:01 p.m.

The European Court of Justice has restricted the storage of air passenger data. The retention period is shortened, there must be concrete suspicion, and artificial intelligence may be used only to a limited extent for evaluation.

By Gigi Deppe, ARD legal department

The judges of the highest EU court have not fundamentally stopped the storage of passenger data. Airlines in the EU still have to pass on a great deal of information about passengers to the authorities, so they will continue to report who flew, when, how and with whom.

Significant Restrictions

However, the ECJ has restricted the practice on several points, and it will have to change to some extent. For example, the police may only evaluate flights within Europe if there is a genuine terrorist threat or very specific indications that something serious is happening on certain flights or at certain airports.

The judges also made it clear that the authorities may only access the data if the trip is somehow linked to criminal activity: either the trip serves to commit a serious crime, or the person concerned is trying to fly away from the scene of a crime. It would not be permissible, for example, to search for the data of a criminal who simply went on vacation. Nor may the data be used for other purposes, such as detecting illegal entries.

The Luxembourg judges limited the existing rules. (Image: dpa)

Don’t just let AI browse data

There are also limits on searching the huge amounts of data for possible, as yet unknown, offenders. Because this can hardly be done by hand, the police scan the stored data with software: Who behaved suspiciously, perhaps took a long-haul flight with little luggage or paid for the ticket in cash? Using this software remains possible, but, according to the court, it must be clearly defined in advance which features the algorithm searches for.

It is not acceptable for self-learning artificial intelligence to take over, with the machine continually deriving new characteristics that are deemed suspicious. Here, for the first time, the ECJ is limiting the use of artificial intelligence; for that reason alone the judgment is significant, since it could serve as a model for future decisions on entirely different issues.


The Belgian human rights organization Ligue des droits humains (Human Rights League) challenged Belgium's implementation of the rules. Among other things, it argues that the right to respect for private life and the protection of personal data are being violated. (Case number C-817/19)

Six months instead of five years

The Court of Justice also stressed one further point on passenger data: travelers who, for example, are not even allowed to board a plane because of suspicious data must be able to defend themselves. Many mistakes happen, the judges noted, so a court must be able to review the matter. After the fact, once the trip is long over, the data may only be used if new circumstances arise and it is clear that this serves the fight against terrorism and serious crime.

Another problem for the judges was how long the data was stored, namely five years. Six months is acceptable, they say, provided it has not emerged in that time that the trip was related to dangerous crimes. For Germany, this means, for example, that the Federal Criminal Police Office must sift through the database within the first six months and may retain beyond that period only data that provides objective evidence of serious crime.

Incidentally, the Court of Justice is quite clear about what is not permitted at all: storing the data of train or bus passengers on a large scale. European law does not allow this.
