A talent acquisition professional based in Europe has sparked a wide-ranging online discussion after using a GDPR Subject Access Request to obtain her interview scorecards, emails, and internal notes following a hiring process that ended without feedback. The LinkedIn post, published approximately one week ago and authored by Amanda Lamont, a talent acquisition and people and culture specialist, drew 568 reactions and 93 comments within days. It offers a rare, documented look at how Article 15 of the General Data Protection Regulation functions in practice during recruitment - and what happens when the results are read.

The case is straightforward in technical terms but striking in its implications. Lamont had completed six hours of interviews and a case study for a role. No feedback came. According to her own account, she waited a few months before submitting a Subject Access Request, citing a deliberate choice to let time pass before exercising the right. The company, which she identified as Zalando, took the full 30-day response period permitted under the GDPR before supplying the data.

What the documents revealed was a unanimous yes from the entire interview panel. The blocker, according to a director's side note visible in the returned records, was that Lamont had worked at a startup following her time at Zalando. That gap - the absence of a second recognizable brand name after the first - was enough for the director to override a panel consensus. The director had also voted yes, yet still permitted the concern to become determinative.

What Article 15 actually covers

Article 15 of the GDPR gives any individual the right to confirm whether a controller - meaning an organization that determines the purposes and means of processing personal data - holds personal data about them, and to receive a copy of that data alongside specific categories of information. According to the regulation text published by Intersoft Consulting, controllers must disclose the purposes of processing, the categories of personal data concerned, the recipients or categories of recipient to whom the data have been or will be disclosed, the envisaged storage period, the existence of rights to rectification or erasure, the right to lodge a complaint with a supervisory authority, and, where data was not collected directly from the data subject, any available information about its source.

The regulation also requires disclosure of any automated decision-making, including profiling under Article 22, along with "meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject." That last clause carries particular weight in recruitment contexts where AI screening tools are used.

The controller must provide a copy of the personal data undergoing processing. For any further copies beyond the first, a reasonable fee based on administrative costs may be charged. Where a request is made electronically, the information must be supplied in a commonly used electronic format unless the data subject requests otherwise. The right to obtain a copy must not adversely affect the rights and freedoms of others - a provision that becomes relevant when third-party personal data, such as interview notes written by panel members, is included in the returned documents.

The standard response window is one month from receipt under Article 12(3), commonly described as 30 days, extendable by up to two further months for complex or numerous requests provided the controller explains the delay. This is not a soft deadline. Controllers that fail to respond within the period face potential complaints to national supervisory authorities and, depending on jurisdiction, administrative fines. Lamont's case took the full month.
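The one-month clock runs from receipt of the request and, being a calendar-month period, falls due on the same day of the following month where that day exists. A minimal sketch of the deadline arithmetic - the day-clamping convention for month-end dates (31 January rolling to 28 or 29 February) is an assumption for illustration, not something the regulation spells out:

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Roll a date forward by whole calendar months, clamping the day
    to the target month's length (e.g. 31 Jan + 1 month -> 28/29 Feb)."""
    y, m = divmod(d.month - 1 + months, 12)
    y, m = d.year + y, m + 1
    last_day = calendar.monthrange(y, m)[1]
    return date(y, m, min(d.day, last_day))

def sar_deadline(received: date, extended: bool = False) -> date:
    """Response deadline for a Subject Access Request: one month from
    receipt under Article 12(3), extendable by two further months for
    complex or numerous requests (the controller must explain why)."""
    return add_months(received, 3 if extended else 1)
```

Under this convention, a request received on 31 January 2026 would fall due on 28 February 2026, and an extended deadline simply adds two further calendar months.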

The legal landscape around Article 15 is shifting

The timing of the LinkedIn discussion matters. It arrived as European courts and institutions are actively debating the scope and limits of GDPR access rights. On April 16, 2026, Advocate General Rimvydas Norkus delivered an opinion before the Court of Justice of the European Union in Case C-205/25, a dispute over whether data protection supervisory authorities are themselves subject to Article 15 obligations when they hold personal data from complainants. PPC Land reported on this opinion when it was published, noting that the Advocate General concluded that supervisory authorities do qualify as controllers and cannot create blanket exemptions to access rights through national legislation. The CJEU's final ruling is pending.

The Belgian Data Protection Authority published a 15-page citizen-focused brochure in April 2026 on AI and privacy, covering the full range of data subject rights including Article 15. That brochure, as covered by PPC Land, noted that the right of access enables individuals to confirm whether their data is being processed and to obtain a free copy "if necessary to understand the context of processing." The document explicitly flagged Article 15(1)(h), which covers the right to meaningful information about automated decision-making logic - a provision directly relevant to any hiring process using AI scoring tools.

Meanwhile, the European Commission's Digital Omnibus initiative, circulated in draft form in November 2025 and reported by PPC Land, proposed amendments that would narrow certain access rights. Privacy advocates warned at the time that the changes could undermine Subject Access Requests used specifically for employment disputes and consumer litigation. The European Data Protection Board's coordinated enforcement framework on access rights, published earlier in 2025, identified widespread controller non-compliance as the primary problem - not, the EDPB emphasized, abusive use by data subjects. The Commission's sweeping proposals remain under negotiation across member states.

Bias encoded into systems

The discussion in Lamont's comments quickly moved from her individual case to broader structural questions. Several commenters drew explicit links between the director's reliance on brand recognition as a proxy for competence and the way AI hiring systems are trained.

According to Alicja Copija, a career coach commenting on the post, "research has long shown that past companies are among the weakest predictors of hiring success yet for many hiring managers, such labels still get mistaken for actual evidence of competence." Christopher Case, who leads organizational development and AI solutions work, put it more directly: "the parameters we feed those systems will carry the same assumptions forward at scale." The implication is that when AI screening tools are calibrated against historical hiring data that privileged brand pedigree, the tool does not reduce the bias - it systematizes and accelerates it.

Lamont herself made this argument explicitly: "tools reflect the parameters we set. If the criteria feeding those systems still defaults to logo recognition as a proxy for quality, we're not reducing bias. We're just giving a computer a way to make the same exclusionary decisions more efficiently."

This connects directly to Article 22 of the GDPR, which restricts automated individual decision-making that produces legal effects or similarly significant impacts. Controllers must ensure meaningful human involvement in consequential decisions. But as several commenters pointed out, Lamont's case shows that human involvement is not a safeguard if the human simply inherits the same bias the algorithm was built to replicate. The Dutch Data Protection Authority published a consultation in June 2025 on social scoring and algorithmic bias, covered by PPC Land, which identified a specific cognitive risk it called "automation bias" - the tendency for human reviewers to follow algorithmic recommendations rather than exercise independent judgment.

Article 15(1)(h) adds another dimension. If a company uses AI scoring during a hiring process and that scoring influences who advances, the data subject has a right to meaningful information about how that system works and what consequences it produced for them. In practice, very few candidates know this right exists or how to invoke it.

What a SAR actually returns - and what it does not

Jamie Kohn, a labor market analyst commenting on the post, raised an important technical limitation: a Subject Access Request returns only what the controller has recorded. It does not tell a candidate what competing candidates' profiles looked like, whether a stronger candidate was offered the role, or whether the position was pulled late in the process. "The results can often be underwhelming," according to John Sanger, a charity leadership recruiter who also commented.

What a SAR does return - when controllers comply properly - is revealing in a different way. It surfaces the data actually held about the individual: interview notes, scoring rubrics, internal communications in which the candidate is mentioned. Lamont noted that one company she tested the process with, as part of a wider experiment across three organizations, included Slack messages in which she was referenced. Controllers are required to supply personal data held in any form, including communications on internal messaging platforms, provided those messages contain personal data concerning the data subject.

The 30-day response window, the obligation to provide data in a commonly used electronic format, and the prohibition on conditioning access on the requester's purpose - a point clarified by the Court of Justice in Case C-307/22 and at issue again in the pending C-526/24 Brillen Rottler proceedings - together make the Subject Access Request one of the more operationally consequential tools available to individuals engaging with EU-established organizations.

Mark Sheppard, a talent acquisition business partner, and Brett Williams, an executive search specialist, both noted in comments that North American hiring practices do not offer equivalent mechanisms. According to Williams, "there are many liberties in the EU that US-based employers are relieved haven't hit their shores yet."

Why this matters for the marketing and advertising community

The marketing industry intersects with this issue at two points. First, marketing technology companies that process candidate data during hiring are subject to the same GDPR obligations as any other EU-established controller. That includes adtech platforms, agencies, and publishers operating in Europe. A talent acquisition professional operating in Europe would be covered by the same framework as any candidate for a role at a DSP or publisher.

Second, the structural question about AI and bias has a direct parallel in how marketing systems are built. PPC Land has reported extensively on the regulatory debate over whether AI model development constitutes a legitimate interest under Article 6(1)(f) GDPR. The draft Digital Omnibus proposals would, if enacted, make it easier to train AI systems on personal data. Critics of that approach, including privacy organization noyb, have warned that most targeted advertising already relies on statistical inferences about user characteristics. The logic is similar to the one Lamont's case illustrates: systems trained on biased inputs produce biased outputs, and the speed and scale of automation can amplify rather than correct that problem.

The Austrian Supreme Court ruling issued November 26, 2025, covered by PPC Land, delivered a related principle in the context of Meta's advertising data. After an 11-year legal case initiated by privacy advocate Max Schrems, the court ordered Meta to supply full access to personal data including information about data sources, recipients, and processing purposes - and rejected Meta's trade secret arguments as a basis for withholding that information. The ruling established that commercial confidentiality interests do not override fundamental access rights when organizations lack valid legal grounds for the limitations they impose.

German courts, as PPC Land reported in July 2025, have awarded monetary damages for GDPR violations in data processing contexts. The Leipzig Regional Court ruled that Meta Pixel violated GDPR by collecting data from non-logged-in users without valid legal basis. That decision signaled that EU courts are willing to treat systematic data processing failures as compensable harms, not merely compliance issues to be managed administratively.

The accountability gap in hiring processes

Debbie Ben-Nun, a skilling and workforce expert commenting on the post, described the situation as one where "the wrong people are getting hired for the wrong reasons" - a claim she linked to how often the roles she observed would open and then, six months to a year later, re-open. It is a pattern without definitive causal evidence in the thread, but one that resonates with research on the gap between candidate evaluation criteria and job performance predictors.

Leandro D'Andrea, a product designer, made an observation about the limits of transparency even within a successful SAR: candidates can request data that is recorded, but "never their thoughts and feelings." Internal deliberations that never become written records remain inaccessible. Lamont acknowledged the same point. One company in her three-organization test, she noted, had excluded some Slack messages that could have been expected to contain personal data. Verification of completeness is practically difficult for the data subject.

Jessica Celi, a talent strategy specialist, offered a different frame: "even if no candidate ever requests the info, I think the fact they can helps us be better recruiters." The SAR mechanism, on this reading, functions as a structural accountability prompt. The question "what would it look like if this discussion were made public" operates as a check on decision-making quality, independent of whether anyone actually submits a request.

Toby Barnes, an executive director focused on talent, identified a governance failure that sits upstream of the access rights question: "what was the point of the interview panel if they can all be outvoted by their boss." A unanimous panel decision overridden by a single directorial instinct is not a data problem. It is a decision-making culture problem. The Subject Access Request exposed it. The GDPR did not create it.

Summary

Who: Amanda Lamont, a talent acquisition and people and culture specialist based in Europe, with comments from a broad range of hiring professionals, legal commentators, and organizational development experts.

What: Lamont submitted a GDPR Article 15 Subject Access Request after a silent rejection from a hiring process involving six hours of interviews and a case study. The returned documents showed a unanimous yes from the interview panel and a director's side note citing a post-Zalando startup period as the decisive factor. The case generated substantial online discussion about brand pedigree bias, AI in hiring, and the structural accountability implications of data access rights.

When: The LinkedIn post was published approximately one week before April 26, 2026. The hiring process and SAR submission preceded the post by several months, with the company taking the full 30-day statutory response period to supply the data.

Where: The hiring process involved Zalando, a German-headquartered e-commerce company subject to GDPR. The GDPR Article 15 right applies across all EU member states.

Why: The case matters because it illustrates, with documentary evidence, that GDPR access rights can function as a check on hiring bias - and that the mechanism is largely unknown to candidates who might use it. It also raises questions about AI in recruitment: if historical hiring decisions embedded brand pedigree as a quality signal, AI tools trained on that data will replicate the same assumptions at scale, with or without human involvement in the loop.
