Spain's data protection authority this month published a resolution imposing a €500,000 administrative fine on Fútbol Club Barcelona for failing to conduct a legally compliant data protection impact assessment (DPIA) before processing biometric data from its roughly 143,000 members during a 2023 digital census update campaign. The fine, set out in file EXP202305134 of the Agencia Española de Protección de Datos (AEPD), marks one of the most detailed public rulings in Spain on what constitutes a deficient DPIA under the General Data Protection Regulation.

The sanction covers a violation of Article 35 of the GDPR, classified as serious under Article 73(t) of Spain's national data protection law, the LOPDGDD. A separate count - an alleged breach of Article 9 on special category data - was archived, meaning the authority concluded that proceedings on that point should not continue.

What the census update involved

FCB launched its digital census renewal on 21 March 2023. The club described the campaign as mandatory under its statutes and promoted it via email to all members, with a link to an online process. According to the resolution, the digital workflow took members through a sequence of steps: log in with their personal code, select their country and identity document type, scan their identity document with the device camera, take a facial selfie, record their voice for five seconds, fill in a data form, and verify their profile.

The facial comparison step required members to fit their face into a frame and move their head slowly in directions indicated by on-screen arrows - a liveness detection procedure, not merely a static photograph. The system then compared a biometric vector extracted from the selfie against a vector derived from the photo on the identity document. This comparison was carried out using software provided by Veridas Digital Authentication Solutions S.L., a Spanish company specialising in biometric identity verification.
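The comparison step can be pictured as a similarity test between two embedding vectors. The sketch below is purely illustrative: the resolution does not disclose the metric, threshold, or vector format used by the Veridas software, so cosine similarity and the 0.6 threshold are assumptions, not details from the case.

```python
import math

# Hypothetical illustration only: the resolution does not disclose the
# distance metric or decision threshold used in the actual system.
MATCH_THRESHOLD = 0.6  # assumed threshold on cosine similarity

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(selfie_vec, document_vec):
    """Declare a match when the selfie and ID-photo vectors are similar enough."""
    return cosine_similarity(selfie_vec, document_vec) >= MATCH_THRESHOLD
```

The point of the vector approach, as described in the resolution, is that the system compares derived numerical representations rather than the raw images themselves.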

Biometric vectors generated during the comparison were encrypted and retained for seven days on servers belonging to SIA, a third-party processor, located within the European Economic Area. Veridas itself stored no biometric data on its own servers. After the seven-day window, the vectors were deleted automatically.

A total of 124,864 members updated their census records during the campaign. Of these, 96,343 completed the process digitally and 10,138 did so in person at club offices. Among those who completed the digital process, 72,187 accepted creation of a digital voice profile. In the in-person channel, only 160 out of 12,249 opted to create a voice record.
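Expressed as rates, the resolution's figures show how differently the two channels performed on voice capture; the variable names below are ours, not the resolution's.

```python
# Voice-profile acceptance rates, using the figures cited in the resolution.
digital_accepted, digital_total = 72_187, 96_343
in_person_accepted, in_person_total = 160, 12_249

digital_rate = digital_accepted / digital_total        # roughly 75%
in_person_rate = in_person_accepted / in_person_total  # roughly 1.3%
```

The gap between roughly 75% uptake in the digital flow and roughly 1.3% when members chose in person suggests how much the workflow design shaped the outcome.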

Three complainants, one investigation

The AEPD received three separate complaints about the census process, all filed in 2023. According to the resolution, the first complaint arrived on 21 April 2023. The complainant, a club member, described initiating the process expecting only a simple selfie but finding that the system required head movements consistent with biometric capture - movements for which, the member argued, no explicit consent had been requested. The process could not be completed without providing facial biometric data, the complainant stated.

The AEPD notified FCB of the first complaint on the same day it was received, 21 April 2023. The club responded on 22 May 2023. Following an internal review, FCB suspended the biometric process before the AEPD formally transferred the complaint - a step the club later cited in its defence as evidence of good faith.

The DPIA Barcelona submitted - and why it failed

The central question in the resolution was whether FCB had conducted a valid DPIA before launching the census update. FCB argued that it had. The club described performing a three-level risk analysis prior to deployment. At the first level, it checked the processing against the lists that supervisory authorities publish under Articles 35.4 and 35.5 of the GDPR, which identify operations that do and do not require a DPIA. At the second level, it assessed whether the processing involved large-scale systematic monitoring. At the third level, it applied a questionnaire covering the nature, scope, context and purposes of the processing, the technologies involved, data transfers, and third-party providers. FCB concluded that the facial comparison system "does not present significant risks," with 14 out of 17 risk categories rated as low risk and three rated as negligible.

The AEPD examined the document FCB submitted as its DPIA. The resolution notes the document lacked a title. More substantially, the authority found that it did not meet the formal and substantive requirements of Article 35.7 of the GDPR, which specifies that a DPIA must contain at minimum: a systematic description of the processing and its purposes; an assessment of the necessity and proportionality of operations relative to those purposes; an assessment of the risks to individuals' rights and freedoms; and the measures envisaged to address those risks.
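The four minimum elements of Article 35.7 lend themselves to a simple completeness check. This sketch is illustrative only; the section names are hypothetical labels for the statutory elements, not headings from FCB's document.

```python
# Minimal completeness check mirroring the four elements Article 35.7
# GDPR requires a DPIA to contain. Section names are hypothetical.
REQUIRED_SECTIONS = {
    "systematic_description",     # Art. 35.7(a): processing and purposes
    "necessity_proportionality",  # Art. 35.7(b): necessity and proportionality
    "risk_assessment",            # Art. 35.7(c): risks to rights and freedoms
    "mitigation_measures",        # Art. 35.7(d): measures to address risks
}

def missing_sections(dpia_sections):
    """Return the Article 35.7 elements absent from a draft DPIA."""
    return REQUIRED_SECTIONS - set(dpia_sections)
```

As the AEPD's analysis makes clear, passing a structural check like this is necessary but nowhere near sufficient: FCB's document nominally covered these headings and still failed on substance.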

According to the AEPD's analysis, the document submitted by FCB failed on several of these counts. In the section intended to satisfy Article 35.7(a) - systematic description of processing - the document listed data categories as "name, surname, ID number, member number, image (photograph)" without explicitly identifying facial biometric data derived from the selfie or from the identity document image. The AEPD found this omission significant: a DPIA covering biometric facial processing that does not clearly identify the facial biometric data in its data inventory cannot be considered adequate.

The proportionality analysis required under Article 35.7(b) was found to be insufficient. Rather than evaluating whether the biometric approach was the least privacy-intrusive method available for verifying member identity, the club's document treated biometric comparison as already established and then assessed proportionality within that assumption. The AEPD considered this circular. The authority noted that alternatives to facial biometrics were available - such as in-person verification, which some members used - and that a legally adequate DPIA must genuinely assess these alternatives rather than confirm a prior decision.

The risk assessment section, intended to meet Article 35.7(c), identified risks at a general level but did not, in the AEPD's view, reflect the specific high-risk nature of biometric data processing as defined in Article 9 of the GDPR. The authority observed that the risks were assigned low or negligible ratings in ways that appeared to underweight the intrinsic sensitivity of biometric identifiers. The AEPD concluded that the analysis documents were "clearly defective and not in accordance with the GDPR and could not be considered as DPIAs by merely having the formal content provided for in the GDPR."

FCB advanced two principal legal arguments in its written submissions. The first was that, at the time it designed the census process in 2022 and early 2023, the AEPD's own published guidance treated biometric data used for authentication - rather than identification - as falling outside the scope of Article 9 special category protections. The club argued that a change in the authority's position should not apply retroactively to processing already underway, invoking the legal principles of legitimate expectation and legal certainty.

The AEPD rejected this argument in relation to the Article 35 violation. The obligation to conduct a DPIA before processing biometric data was already established in the GDPR's text before any guidance document existed. According to the resolution, FCB's own documentation showed awareness that it was processing biometric data: the Veridas contract and internal records referred explicitly to biometric vectors. The authority concluded that the question of whether the data required Article 9 consent - a separate legal question - did not affect the Article 35 obligation, which applies to processing "likely to result in a high risk" regardless of its legal basis.

FCB's second argument was that it should benefit from mitigating circumstances, including its voluntary suspension of processing upon receiving the first complaint, its absence of financial gain, and its cooperation with the AEPD throughout the investigation. The authority accepted these as relevant mitigating factors in calculating the fine, but found that they did not negate the infringement. On the Article 9 count, however, the AEPD decided to archive proceedings. This outcome reflects the legal complexity of the authentication-versus-identification distinction that FCB raised - a distinction that has been the subject of evolving regulatory interpretation across Europe.

How the €500,000 figure was reached

The maximum fine for a violation of Article 35 under Article 83.4(a) of the GDPR is €10 million or 2% of total annual worldwide turnover, whichever is higher. For FCB, a private non-profit sports association, the authority applied the fixed €10 million ceiling as the reference point. The €500,000 sanction represents 5% of that maximum.
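The relationship between the sanction and the statutory ceiling is straightforward to verify:

```python
# Checking the arithmetic in the resolution: Article 83.4(a) caps the
# fine at the higher of EUR 10 million or 2% of annual worldwide turnover.
fixed_ceiling = 10_000_000  # reference point the AEPD applied to FCB
fine = 500_000

share_of_maximum = fine / fixed_ceiling  # 0.05, i.e. 5% of the ceiling
```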

According to the resolution, factors that influenced the fine level included the number of individuals whose data was processed - 124,864 - which the authority characterised as a considerable number without reaching large-scale status as formally defined. The negligent rather than intentional nature of the infringement was treated as a mitigating factor. FCB's status as a non-profit entity with no commercial purpose behind the census update was also noted. The club's proactive suspension of the process and its cooperation throughout the investigation counted in its favour.

Why this matters for data and marketing professionals

The Barcelona case connects directly to a pattern of biometric enforcement that the AEPD has been building for several years. The same authority fined Spain's airport operator AENA €1.8 million in November 2025 for deploying facial recognition across multiple airports without conducting adequate impact assessments - another Article 35 violation where the authority found the DPIAs formally present but substantively inadequate. Before that, the AEPD issued a formal warning to Tools for Humanity over planned iris-scanning operations in Spain, again centred on DPIA quality.

The repeated focus on DPIA adequacy - not the absence of a document but the inadequacy of its content - signals a prosecutorial posture that has direct implications for any organisation deploying biometric or other high-risk technologies as part of marketing, loyalty, or identity programmes. A written risk assessment that reaches favourable conclusions through a questionnaire framework, without systematically describing the processing, without honestly evaluating alternative approaches, and without proportionate risk scoring, will not satisfy a regulator examining it closely.

For marketing professionals operating identity verification systems, audience measurement tools using device biometrics, or behavioural analytics that could be characterised as systematic monitoring, the Barcelona resolution offers a detailed specification of what regulators consider the minimum substantive standard. According to the AEPD, the DPIA must be completed before the processing begins - not as a compliance audit after the fact - and must document genuine consideration of privacy-preserving alternatives.

The case also touches the consent architecture question that has dominated European data protection enforcement for several years. The EDPB's 2024 opinion on facial recognition at airports emphasised that individuals must have maximum control over their biometric data and that organisations should explore alternatives wherever possible. The Barcelona census process - where the digital channel required biometric submission while an in-person alternative was available but not prominently presented - illustrates precisely the design tension that regulators are examining.

The resolution arrives as European institutions continue debating GDPR's scope. The European Commission's Digital Omnibus proposals circulated in November 2025 would narrow special category data definitions and introduce new AI training exemptions, while a Madrid court ordered Meta to pay €479 million in November 2025 for GDPR violations related to advertising data processing. Against that backdrop, the Barcelona fine demonstrates that national authorities are enforcing existing rules with increasing technical specificity, regardless of the legislative reform discussion in Brussels.

Separately, the AEPD in February 2026 published a 71-page guide on agentic AI and GDPR compliance, signalling that the authority's analytical focus has expanded beyond biometric deployments to the broader class of high-risk automated systems. Organisations that have treated DPIA as a formality - a document to file rather than an analytical process to complete - face a regulatory environment in which that approach carries measurable financial risk.

Summary

Who: Fútbol Club Barcelona (NIF G08266298), a private non-profit sports association with approximately 143,000 members, sanctioned by Spain's Agencia Española de Protección de Datos (AEPD). The biometric technology was supplied by Veridas Digital Authentication Solutions S.L., with data processed by a third-party processor called SIA.

What: A €500,000 administrative fine for violating Article 35 of the GDPR - the requirement to conduct an adequate data protection impact assessment before processing biometric data. The AEPD found that the DPIA FCB submitted was formally present but substantively deficient: it did not systematically describe the biometric data categories, did not genuinely assess proportionality against less intrusive alternatives, and did not produce a risk evaluation proportionate to the sensitivity of biometric identifiers. A separate count under Article 9 on special category data was archived.

When: The census campaign ran from 21 March 2023. Complaints were filed in April 2023. The proposed resolution was issued on 4 November 2025, FCB submitted objections on 21 November 2025, and the final resolution was published today, 4 March 2026.

Where: The processing occurred digitally via FCB's website and member app, with biometric data handled by servers within the European Economic Area. The AEPD is headquartered at Calle Jorge Juan 6, 28001 Madrid.

Why: FCB designed the digital census workflow around facial comparison technology - matching a live selfie against an identity document photo - to enable remote verification of roughly 143,000 members dispersed across geographic locations. The club conducted a risk analysis before deployment but the AEPD determined that analysis did not meet the substantive standard required by Article 35.7 of the GDPR for processing involving biometric data. The authority concluded the assessment was inadequate in its data description, proportionality analysis, and risk scoring methodology.
