The European Data Protection Board this year published a comprehensive case digest analysing how legitimate interest under Article 6(1)(f) of the General Data Protection Regulation has been applied - and frequently misapplied - across 62 One-Stop-Shop decisions and five EDPB binding decisions issued between December 2018 and June 2025. Authored by Dr. TJ McIntyre under the EDPB's Support Pool of Experts Programme and submitted in December 2025, the 29-page report cuts through years of regulatory decisions to surface patterns that have direct consequences for any organisation processing personal data in the European Economic Area.
The report is not a guideline or a binding instrument. It is an analysis. But its findings are uncomfortably specific, and the picture it paints is of controllers who systematically underestimate what the balancing test requires, who treat legitimate interest as a flexible fallback rather than a carefully documented legal basis, and who routinely fail at the most basic procedural level: conducting the assessment before the processing begins.
Three cumulative conditions: all must hold
Article 6(1)(f) GDPR establishes a three-part test that controllers must satisfy in sequence. First, the controller or a third party must pursue a legitimate interest. Second, processing must be necessary to achieve that interest - and no less intrusive alternative may exist. Third, the interests or fundamental rights and freedoms of data subjects must not override the controller's interest. According to the report, this third condition - the balancing test - is the stage at which the majority of controllers in the dataset stumbled.
The report draws on EDPB Guidelines 1/2024, adopted on 8 October 2024, as the primary regulatory framework. Those guidelines clarify an important conceptual distinction that many controllers appear to overlook: the difference between a "purpose" and an "interest." According to the EDPB, a purpose is "the specific reason why the data are processed," while an interest is "the broader stake or benefit that a controller or third party may have in engaging in a specific processing activity." Conflating these two concepts was a recurring failure in the decisions reviewed.
In practice, the difference matters enormously. A controller may have an interest in promoting its products - and may advance that interest by processing personal data for direct marketing. But that formulation requires precision. Decision EDPBI:SE:OSS:D:2025:1738 illustrates what happens without it. An online media firm, relying on advice from its consent management platform provider, stated in a cookie banner that it relied on legitimate interest to profile users and process their precise geodata. When the Swedish supervisory authority asked the company to specify its legitimate interest, it could not do so, and could not demonstrate that any balancing test had been carried out. The LSA concluded that "a controller cannot disclaim the responsibility to ensure that there is a legal basis for the company's personal data processing by referring to a supplier's recommendations." The case is a sharp warning to the programmatic advertising industry, where reliance on third-party consent management platforms is widespread.
The IAB Europe Transparency and Consent Framework has been at the centre of related disputes for years. Decision EDPBI:BE:OSS:D:2022:325, which concerned the TCF and real-time bidding, found that the processing purposes were "described in general terms, with the result that it was not easy for users to assess to what extent the collection, dissemination and processing of their personal data are necessary for the intended purposes." Phrases such as "measure content performance" and "apply market research to generate audience insights" were found to lack the specificity required by the GDPR - providing, in the LSA's assessment, "little or no insight into the scope of the processing, the nature of the personal data processed or for how long the personal data processed will be retained."
What qualifies as 'legitimate'?
The GDPR does not define the word "legitimate," and the report traces how the Court of Justice of the European Union has filled that gap. In Case C-621/22, Koninklijke Nederlandse Lawn Tennisbond, decided in 2024, the CJEU addressed whether a national sports federation could sell personal data of members - names, addresses, telephone numbers and emails - to sponsors. The court rejected the proposition that a legitimate interest must have a positive legal basis to be "provided for by law," and accepted that a purely commercial interest could qualify. But the court also held that a legitimate interest must be "lawful" in the sense of not being "contrary to the law."
This second limb has practical teeth. The report cites the example of shadow blocking - the practice of reducing the visibility of users' posts without their knowledge. Decision EDPBI:LT:OSS:D:2024:1361 concerned an online second-hand clothing marketplace that restricted user visibility without disclosure. The LSA accepted that preventing abusive users could in principle constitute a legitimate interest. But the Digital Services Act, which came into force after the underlying events, now explicitly prohibits shadow blocking under Article 17. The report notes that this illustrates a situation where the interest pursued "would now be 'contrary to the law' and incapable of constituting a legitimate interest before even reaching the necessity and balancing tests."
The same decision was subsequently upheld by the Lithuanian Regional Administrative Court, which held that "the essence of shadow blocking, i.e. the deliberate non-disclosure of information to the user, goes contrary to the principles of GDPR, in particular the principle of lawfulness."
The range of interests that supervisory authorities have accepted as legitimate in principle is broad. GDPR Recitals 47 to 49 provide a non-exhaustive list including fraud prevention, direct marketing, intra-group data transfers, and network security. More recently, EDPB Opinion 28/2024 on AI models, adopted on 17 December 2024, stated that controllers may have a legitimate interest in developing AI systems to assist users, detect fraudulent content or behaviour, or improve threat detection in information systems. The EDPB's AI opinion was significant for the marketing technology industry because it addressed, for the first time in formal guidance, how the three-part legitimate interest test applies to AI model development - a question of immediate relevance to companies building targeting, optimisation, and measurement tools.
Legitimacy in principle, however, does not automatically translate to legitimacy in fact.
The necessity test is where many controllers fail
The report's second major finding is that even controllers who establish a legitimate interest in principle frequently fail the necessity test. The standard is demanding: the controller must demonstrate that the legitimate interests pursued cannot reasonably be achieved just as effectively by other means less restrictive of data subjects' fundamental rights.
Several decisions illustrate this in concrete terms. The EDPB's Urgent Binding Decision 01/2023 against Meta Platforms Ireland found that there are "realistic, less intrusive alternatives to online behavioural advertising, making the processing at stake not necessary." That decision, adopted on 27 October 2023, was the culmination of a lengthy enforcement sequence. The ban on Meta's behavioural advertising on the basis of legitimate interest and contract across the entire European Economic Area marked a significant turning point for the programmatic advertising industry.
Decision EDPBI:ES:OSS:D:2021:338 found that a hotel's use of guest photographs to prevent fraud was not strictly necessary, because alternative measures - checking surnames, room numbers, or requiring signatures - could achieve the same purpose. Decision EDPBI:DEBE:OSS:D:2022:477 found that forcing customers to provide a phone number for customer service was not necessary, because email was an equally effective and less intrusive alternative. In both cases, the controller's chosen technical approach determined the outcome, and in both cases, a different technical approach would have satisfied the test.
The Worldcoin case, EDPBI:DEBY:OSS:D:2024:1594, contains the report's most technically detailed necessity analysis. The Worldcoin Foundation sought to use iris scans as the basis for an internet-wide identification system and sought to retain biometric iris codes even after account closure, in part to prevent banned users from re-registering under a new identity. The Bavarian LSA accepted the principle: online services have a legitimate interest in "protecting the integrity of their online spaces." But the implementation failed the necessity test because Worldcoin's approach placed "every user under general suspicion of being blocked without the actual existence of such a block." The LSA identified a less intrusive alternative: contacting connected services to verify whether a block existed for a particular user, rather than retaining the iris codes of all users who closed their accounts. The decision attracted significant attention in Spain, where the AEPD issued a formal preventive warning to Tools for Humanity GmbH in February 2026 as the company prepared to relaunch iris-scanning activities in Barcelona.
The Worldcoin decision is notable for another reason. The LSA discussed at length the concept of a "right to lie" - the proposition that biometric identification systems deprive data subjects of the ability to conceal information in response to unjustified or illegal demands. The LSA cited the example of German labour law, under which employees have the right to lie in response to questions about pregnancy, illness, trade union membership, or religious affiliation that are unrelated to work. Biometric data, the LSA concluded, removes this option entirely and thereby affects informational self-determination under Articles 1 and 2(1) of the Basic Law of the Federal Republic of Germany.
The balancing test and reasonable expectations
The third condition - the balancing test - requires controllers to assess whether data subjects' interests, rights, and freedoms override the controller's legitimate interest. According to EDPB Guidelines 1/2024, the test must consider: the data subjects' interests, fundamental rights and freedoms; the impact of the processing; the reasonable expectations of data subjects; and the result of the final balancing, including any mitigating measures.
Reasonable expectations emerged as the most commonly cited failure mode in the balancing test across the decisions reviewed. Failure to meet transparency requirements under Articles 13 and 14 GDPR frequently resulted in a finding that data subjects could not reasonably have expected the processing in question.
Decision EDPBI:FR:OSS:D:2024:1257 involved a chain of mobile phone stores that purchased consumer contact details from data brokers to make promotional calls and send SMS messages. The French supervisory authority found that the data brokers had not indicated to data subjects at the time of collection with whom their data could be shared. As a result, "data subjects could not reasonably expect to receive commercial prospecting offers from this company," and legitimate interest failed as a legal basis.
The decision has a direct parallel to a common practice in digital advertising: the purchase of audience data from third-party data brokers for targeting purposes. The question of whether data subjects can reasonably expect their data to be used for advertising by companies with whom they have no relationship is precisely the kind of issue that supervisory authorities are applying the balancing test to examine. The Belgian Market Court's confirmation in May 2025 that IAB Europe violated multiple GDPR provisions in the operation of the TCF - including failures around legal basis and transparency - is part of the same regulatory current.
One decision cuts against the general trend. In EDPBI:SE:OSS:D:2022:506, a company forwarding order details to a third-party fraud prevention service did not disclose the specific provider in its privacy notice, referring only to unspecified "external resources." The Swedish LSA found this insufficient under Article 13(1)(e) GDPR. Yet it accepted that credit-based purchasing was a context in which data subjects could reasonably expect such processing to occur, describing the transparency failure as a "minor deficiency." This illustrates a point made in EDPB Guidelines 1/2024 that "reasonable expectations do not necessarily depend on the information provided to data subjects" - a qualification that the report notes has been applied inconsistently across Member States.
Decision EDPBI:CZ:OSS:D:2022:1278 involved an antivirus software provider that shared pseudonymised information on approximately 100 million users, including web browsing histories, with another company in its corporate group for statistical analysis and onward sale. The Czech LSA found that users could not have expected this. Users acquire antivirus software to protect their data and privacy. The controller marketed its products on these grounds. Public outcry after the data sharing emerged was itself treated as evidence that users were surprised. The case highlights the extent to which the reasonable expectations test is anchored in the controller-data subject relationship and the representations the controller has made to users - a consideration with clear implications for any data company marketing itself on privacy grounds.
Recurring themes: retroactive reliance and ePrivacy overlap
Two structural issues recur across the dataset. The first is the question of whether controllers can retroactively change their legal basis to legitimate interest when a supervisory authority rejects the original basis. The dominant position in the decisions is that they cannot. Decision EDPBI:ES:OSS:D:2021:338 states the reasoning clearly: without information about the balancing test, "the data subject is deprived of his or her right to know what those legitimate interests alleged by the controller or of a third party would justify the processing without his/her consent being taken into account."
There is an outlier. Decision EDPBI:EE:OSS:D:2025:1791 involved a ride-hailing company that had relied on Article 6(1)(b) GDPR - performance of a contract - to record driver ratings of passengers. The Estonian DPA permitted the company to change its legal basis to legitimate interest retroactively, on the basis that the prior terms had referred in a general way to legitimate interest for safety and security purposes, and because the controller reworked its practices extensively, with LSA input, to address the deficiencies. The revised system introduced a detailed explanation of the rating process, a right to challenge ratings, in-app features informing passengers of rating consequences, restrictions on which employees could view ratings, and human review of automated account suspensions. The Estonian DPA concluded that these measures were sufficient to justify the retroactive basis change.
The second structural issue is the overlap between the GDPR and the ePrivacy Directive. The ePrivacy Directive generally requires informed consent for cookie use, excluding legitimate interest as a legal basis for cookie placement. Yet the GDPR's one-stop-shop mechanism does not extend to ePrivacy enforcement, which is handled by different national regulators in many Member States. Decision EDPBI:SE:OSS:D:2025:1738 illustrates the resulting complexity: the Swedish LSA found it could not assess the legality of cookie storage because that was reserved to the telecommunications regulator, but took the view that the ePrivacy consent requirement should be factored into the subsequent GDPR balancing test. The TCF's ongoing compliance pressures - including a doubling of vendor enforcement procedures to 587 in 2025 - reflect precisely this complexity, as the framework sits at the intersection of both regimes.
Consumer finance and sector-specific patterns
Consumer finance occupied a disproportionate share of the OSS decisions, with recurring patterns around credit checks, reporting to default registries, public identification of debtors, and debt collection tactics.
Several decisions involving the online retailer Zalando established that retailers have a legitimate interest in conducting credit checks before concluding a transaction on invoice. But the decisions imposed strict conditions. According to EDPBI:DEBE:OSS:D:2024:1280, a credit check was acceptable only after a customer had "placed goods in the basket, entered his delivery and invoice address, selected in the checkout process 'Purchase on invoice' and confirmed this input by clicking on the 'further' button." Safeguards against accidental selection of a credit payment option were mandatory. Decision EDPBI:DEBE:OSS:D:2024:1279 treated a requirement to enter a social security number prior to completing a credit transaction as an appropriate safeguard.
Reporting to credit default registries was addressed in several decisions that emphasised the need for case-by-case assessment. A blanket policy of referring all unpaid debts to a credit default registry was found incompatible with Article 6(1)(f) GDPR. Public identification of debtors online produced divergent results. Decision EDPBI:CZ:OSS:D:2019:56, involving a company that published debtors' partial names and amounts owed on its website and Facebook profile, rejected the practice on necessity and proportionality grounds. "In countries where the rule of law applies," the Czech LSA stated, debt collection "must be carried out in a way foreseen by law and not by public denunciation of the debtors." By contrast, the Estonian and Polish authorities reached different conclusions on comparable facts in EDPBI:EE:OSS:D:2023:885, reflecting underlying differences in national law and practice.
Antivirus data, rental scooters, and flight tracking
The decisions span a remarkably wide range of factual contexts beyond consumer finance. In the rental vehicle sector, the French supervisory authority addressed a car rental company collecting geolocation data at 500-metre intervals whenever the engine was turned on or off, or whenever a door was opened. This data was transmitted in real time and stored for the entire duration of the commercial relationship plus three years. The LSA found this excessive. A separate decision concerning an electronic scooter rental company that collected location data from each scooter every 30 seconds, stored for 24 months, reached the same conclusion.
The scooter decisions also produced one of the more unusual findings in the dataset. Decision EDPBI:EE:OSS:D:2023:785 concerned a scooter that logged the weight of riders on each trip, sending an alert if the detected weight exceeded 1.4 times the median weight recorded for that user on previous trips. The controller relied on the legitimate interest of promoting rider safety by deterring tandem use. The Estonian LSA accepted this, finding that weight monitoring was less invasive than alternatives such as video surveillance, and noting that the alert was a warning only - it did not stop the scooter or restrict the user.
In aviation, decision EDPBI:SE:OSS:D:2025:1825 addressed Flightradar, which tracks aircraft worldwide in real time and historically. The Swedish LSA accepted that Flightradar could rely on third-party legitimate interests in monitoring global air traffic, partly because aviation industry research, media reporting, and national authority use of the data gave the service a quasi-public dimension. But the LSA declined to extend this to the fact that police have used the data for criminal investigations, citing Case C-252/21, Meta Platforms v Bundeskartellamt for the proposition that "a controller that primarily pursues an economic interest cannot, as a general rule, rely on a legitimate interest in processing personal data for the purposes of preventing, detecting or prosecuting criminal offences, when this is unrelated to its commercial activities."
What the digest means for marketing and advertising professionals
For marketing professionals operating across the EEA, the digest has several direct implications. The consistent finding that vague statements of legitimate interest - including language characteristic of standard vendor contracts and cookie banners - fail the specificity requirement means that any organisation relying on legitimate interest for advertising-related processing should audit its legitimate interests assessments against the specificity standard established by EDPB Guidelines 1/2024.
The decisions on behavioural advertising and data broker purchasing confirm that this category of processing faces a high bar under the necessity and balancing tests. The EDPB's 2024-2025 work programme had identified legitimate interest as a specific topic for further guidance, and the case digest now provides the most detailed empirical picture yet of how supervisory authorities have applied the concept in practice.
The TCF v2.3 migration passed its mandatory deadline on 1 March 2026, with Google confirming that non-compliant publishers now face ad requests defaulting to limited ads. The digest's findings on cookie processing, the ePrivacy overlap, and the risks of delegating compliance responsibility to consent management platform providers are directly relevant to publishers and vendors navigating the post-deadline landscape. The DMA-GDPR joint guidelines under consultation - with over 100 submissions published on 13 March 2026 - add a further layer, with the draft guidelines making clear that legitimate interest cannot serve as the legal basis for cross-service data combination by gatekeepers.
The digest closes with two observations that are likely to occupy regulators and practitioners for some time. One is the choice of law problem in cross-border cases: the OSS mechanism identifies the lead supervisory authority but does not prescribe which Member State's law applies when national standards on matters such as debt collection differ. The other is the practical harm caused by the split between GDPR and ePrivacy enforcement - a division the report suggests should be resolved by bringing ePrivacy enforcement within the GDPR cooperation and consistency mechanism.
Timeline
- 21 February 2014: Polish Supreme Administrative Court decision on debtor data publication (ref. I OSK 2463/12), influencing later cross-border OSS decisions.
- 9 April 2014: Article 29 Working Party Opinion 6/2014 on legitimate interest published.
- December 2018: Earliest decisions included in the dataset are adopted.
- 2 February 2022: Belgian DPA finds TCF non-compliant with Article 6 GDPR, imposing €250,000 fine on IAB Europe.
- 28 July 2022: EDPB Binding Decision 2/2022 on Meta Instagram child users published.
- 27 October 2023: EDPB Urgent Binding Decision 01/2023 instructs Irish DPA to ban Meta's behavioural advertising on basis of legitimate interest and contract across EEA.
- 8 October 2024: EDPB Guidelines 1/2024 on legitimate interest under Article 6(1)(f) GDPR adopted.
- October 2024: CJEU decides Case C-621/22, Koninklijke Nederlandse Lawn Tennisbond, accepting purely commercial interests can qualify under Article 6(1)(f) GDPR.
- 17 December 2024: EDPB Opinion 28/2024 on AI models published, identifying legitimate interests in AI development.
- 16 October 2025: Cut-off date for inclusion of decisions in the digest.
- 3 November 2025: Google mandates TCF v2.3 migration, with the deadline set for 1 March 2026.
- December 2025: EDPB case digest submitted under Support Pool of Experts Programme.
- 13 February 2026: AEPD issues preventive warning to Tools for Humanity ahead of Barcelona iris-scanning relaunch.
- 1 March 2026: TCF v2.3 mandatory deadline passes; Google confirms non-compliant publishers face limited ads.
- 13 March 2026: European Commission and EDPB publish over 100 submissions on draft DMA-GDPR joint guidelines.
Summary
Who: The European Data Protection Board (EDPB), through its Support Pool of Experts Programme, commissioned the report from Dr. TJ McIntyre. The decisions reviewed were issued by national supervisory authorities from across the EEA acting as lead supervisory authorities under the GDPR's One-Stop-Shop mechanism, and by the EDPB itself under Articles 65 and 66 GDPR.
What: A 29-page case digest analysing 62 OSS decisions and five EDPB binding decisions related to legitimate interest under Article 6(1)(f) GDPR, covering the three-part test, sector-specific patterns in consumer finance, anti-fraud, vehicle monitoring, and aviation data, and two novel structural issues: retroactive reliance on legitimate interest and the overlap with the ePrivacy Directive.
When: The decisions covered were adopted between December 2018 and June 2025. The cut-off date for inclusion was 16 October 2025. The report was submitted in December 2025 and published today, 29 March 2026.
Where: The decisions span the European Economic Area, with lead supervisory authorities from Estonia, Sweden, Spain, Belgium, France, Czech Republic, Malta, Lithuania, Germany, Norway, Poland, and other states. The structural issues identified have relevance across all 27 EU Member States and the three additional EEA states.
Why: The digest was commissioned to provide supervisory authorities and practitioners with a consolidated view of how Article 6(1)(f) GDPR has been applied in cross-border cases, identifying common failure modes, novel legal issues, and tensions between the GDPR and ePrivacy Directive. For the marketing and advertising industry, the digest is significant because many common data processing practices - including behavioural targeting, data broker purchasing, cookie-based profiling, and fraud prevention - have been directly addressed in the decisions analysed.