German data protection authorities call for enhanced GDPR protections for children
German regulators propose ten amendments to strengthen children's data protection under GDPR, including consent restrictions for profiling and advertising uses.
Germany's independent data protection authorities issued a resolution on November 20, 2025, calling for significant amendments to the General Data Protection Regulation to strengthen protections for children across digital services. According to the Conference of Independent Data Protection Supervisory Authorities of the Federal Government and the Länder, the existing regulation provides insufficient safeguards for minors despite acknowledging their vulnerability.
The 38-page resolution identifies ten specific provisions within GDPR that require modification to adequately protect children's fundamental rights. The authorities emphasized that current data protection rules fail to account for children's structural disadvantages in understanding long-term consequences of data processing decisions.
Current GDPR provisions fall short
The regulation currently includes only six explicit provisions addressing children's data protection needs. Article 8 sets 16 as the age at which a child can validly consent to information society services offered directly to them, though member states can lower this threshold to 13. Nine European Union countries set the limit at 13, six at 14, four at 15, and nine retain the 16-year threshold.
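For controllers, the Article 8 rule reduces to a threshold check against the member state's chosen consent age. The following is a minimal sketch of that check, assuming the threshold is supplied through configuration rather than a per-country legal mapping.

```python
# Minimal sketch of an Article 8 consent-age check (illustrative only).
# The member-state threshold must be supplied by configuration; the constants
# below are the regulation's bounds, not a legal mapping of individual countries.

GDPR_DEFAULT_AGE = 16      # Article 8(1) default
MEMBER_STATE_MINIMUM = 13  # lowest age a member state may set

def parental_consent_required(user_age: int, member_state_age: int = GDPR_DEFAULT_AGE) -> bool:
    """Return True when the child's own consent is not valid under Article 8."""
    if not MEMBER_STATE_MINIMUM <= member_state_age <= GDPR_DEFAULT_AGE:
        raise ValueError("member_state_age must be between 13 and 16")
    return user_age < member_state_age
```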
According to the resolution, children face particular risks because they "often do not fully understand the long-term disadvantages of the processing of their personal data, but are very open to the mostly short-term positive effects of using data processing systems and services." The document notes that data collected from children shapes their worldview, influences social relationships, and enables behavioral predictions.
Article 6(1)(f) requires balancing tests to account for children's interests when processing relies on legitimate interests. Article 12(1) mandates that information be provided "in a concise, transparent, intelligible and easily accessible form, using clear and plain language," in particular for any information addressed to a child. Article 17(1)(f) grants a right to erasure of personal data collected in connection with the information society services referred to in Article 8(1).
The supervisory authorities determined these selective protections lack coherence. "Behind the few regulations, there is no overall concept that provides those responsible with clear rules on the situations in which children's rights must be taken into account and the legal consequences of doing so," according to the resolution.
Proposed restrictions on profiling and advertising
The authorities propose prohibiting children's consent to data processing for advertising purposes and personality or user profile creation. According to the resolution, Recital 38 of GDPR states children deserve special protection concerning "the use of children's personal data for advertising purposes or for the creation of personality or user profiles," but this principle never made it into the regulation's binding text.
The proposed Article 8 amendment would insert language stating "the processing of a child's personal data for advertising purposes and for the creation of personality and user profiles is not permitted." This complements Article 28(2) of the Digital Services Act, which prohibits presenting advertisements to minors based on profiling of their personal data.
The proposal would bar both uses: building personality or user profiles from children's data and processing it for advertising purposes. Contextual advertising for children's products such as games and toys could continue, but targeting based on behavioral profiles would be prohibited.
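In practice, the proposed Article 8 amendment would translate into a purpose gate in consent-management logic. The sketch below illustrates that reading; the purpose labels and the is_child flag are hypothetical, not terms from the resolution.

```python
# Illustrative purpose gate reflecting the proposed Article 8 amendment:
# a child's consent cannot authorize advertising or profile-building purposes.
# Purpose labels are hypothetical; contextual advertising sits outside the gate.

PURPOSES_BARRED_FOR_CHILDREN = {
    "behavioral_advertising",
    "personality_profiling",
    "user_profiling",
}

def consent_valid_for_purpose(is_child: bool, purpose: str, consent_given: bool) -> bool:
    if not consent_given:
        return False
    if is_child and purpose in PURPOSES_BARRED_FOR_CHILDREN:
        # Under the proposal, consent from a child cannot legitimize these uses.
        return False
    return True
```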
For Article 9 special category data, the authorities propose requiring that children possess sufficient maturity before consenting to processing of sensitive information. The amendment would establish that consent to processing special categories of personal data is permissible only when "clearly not contrary to the child's best interests, within the limits of the maturity required for the decision."
Prevention services and medical care access
The resolution addresses Recital 38's objective that parental consent "should not be required in connection with prevention or counseling services offered directly to a child." This principle has not been implemented in regulatory text, which can leave children unable to access psychological help, addiction counseling, or pregnancy counseling without parental knowledge.
The proposed Article 9(2)(a) addition would permit explicit consent from children over age 12 "to the processing of personal data in connection with prevention or counseling services offered directly to a child and in connection with medical examinations or therapeutic interventions" without parental consent when children possess necessary maturity.
The authorities emphasized preventing misuse requires limiting such processing to "recognized prevention or counseling services in the public interest, only for the purposes of those services, and only to the extent necessary for those services."
Enhanced objection rights and automated decision protections
The resolution proposes strengthening children's objection rights under Article 21. Current wording requires data subjects to demonstrate grounds relating to their "particular situation" when objecting to processing based on legitimate interests or public interest tasks.
The amendment would add language specifying "especially where the personal data concern a child" to Article 21(1). This modification aligns with Article 6(1)(f) requirements to give particular weight to children's interests when conducting balancing tests.
For automated decision-making under Article 22, the authorities propose explicitly excluding children's consent as a valid basis for processing. Recital 71 states "this measure should not apply to children," but the regulatory text permits automated decisions based on "explicit consent of the data subject" without age restrictions.
The proposed change would modify Article 22(2)(c) to require "explicit consent of the adult data subject," preventing children from validly consenting to automated decisions that produce legal effects or similarly significant impacts.
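Expressed as compliance logic, the change narrows the consent exception in Article 22(2)(c) to adult data subjects while leaving the other Article 22(2) exceptions untouched. A brief sketch under that reading, with the adult flag and the collapsed legal-basis parameter as illustrative assumptions:

```python
# Sketch of the proposed Article 22(2)(c) narrowing: explicit consent lifts the
# ban on solely automated decisions only when the data subject is an adult.
# Other Article 22(2) exceptions (contract, Union or member-state law) are
# modeled as a single flag here for brevity.

def automated_decision_permitted(explicit_consent: bool, is_adult: bool,
                                 other_exception_applies: bool = False) -> bool:
    if other_exception_applies:
        return True
    return explicit_consent and is_adult
```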
System design and default settings requirements
The resolution calls for Article 25 amendments emphasizing children's protection in data protection by design and by default obligations. Controllers must implement technical and organizational measures incorporating necessary safeguards to meet GDPR requirements.
The proposed Article 25(1) addition would require controllers to pay "particular attention to the protection of the rights of children" when designing processing systems. According to the authorities, fundamental child protection in system design is "particularly important in social networks and other offerings with data-driven business models."
For privacy-friendly default settings under Article 25(2), the amendment would add language requiring that "default settings take particular account of the vulnerability of children." The resolution notes that children accept default settings more readily than adults and tend to focus solely on using the device or service.
Children cannot be expected to recognize default settings or understand their significance for informational self-determination, according to the authorities. They therefore depend particularly on default settings that avoid data protection risks.
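One way to read the Article 25(2) proposal is that the configuration shipped to a child account must already be the most protective one. The settings below are hypothetical examples of such defaults, not requirements listed in the resolution.

```python
# Hypothetical privacy-by-default profile accounting for children's vulnerability.
# Setting names and values are illustrative, not taken from the resolution.

from dataclasses import dataclass

@dataclass
class PrivacyDefaults:
    profile_visibility: str = "private"
    personalized_ads: bool = False
    behavioral_profiling: bool = False
    location_sharing: bool = False

def defaults_for(is_child: bool) -> PrivacyDefaults:
    if is_child:
        # Children cannot be expected to find and adjust these settings,
        # so the most protective configuration ships as the default.
        return PrivacyDefaults()
    # Adult defaults may differ, but must still satisfy Article 25(2).
    return PrivacyDefaults(profile_visibility="contacts")
```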
Data breach notification and impact assessments
Article 33 data breach notification requirements currently exempt controllers when breaches are "unlikely to result in a risk to the rights and freedoms of natural persons." The proposed amendment would add "taking into particular account the risk to children" to the risk assessment criteria.
The authorities emphasized that even knowledge of children's names and locations can pose significant risks. Specific risks to children should be explicitly highlighted in breach assessments rather than treated as general factors.
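A breach triage routine following this reading would treat the presence of children's data as an explicit escalation factor rather than folding it into a general risk score. A simplified sketch, with the data categories and escalation rule as illustrative assumptions:

```python
# Simplified Article 33 triage sketch: children's data is an explicit risk factor.
# Category names and the escalation rule are illustrative assumptions.

HIGH_RISK_CATEGORIES = {"name", "location", "health", "credentials"}

def breach_requires_notification(data_categories: set[str], affects_children: bool) -> bool:
    risky = bool(data_categories & HIGH_RISK_CATEGORIES)
    if affects_children and data_categories:
        # Even name-plus-location data can pose significant risks to children,
        # so the "unlikely to result in a risk" exemption is applied cautiously.
        risky = True
    return risky
```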
For data protection impact assessments under Article 35, the resolution proposes multiple additions emphasizing children's rights. Article 35(1) would add language requiring explicit consideration of "risks and consequences that the processing may have for their specific rights" when children are affected.
Article 35(7) assessment requirements would include language directing controllers to take "particular account of the personal data of a child" when evaluating risks and to consider "rights and legitimate interests of data subjects and other persons concerned, in particular children" when determining protective measures.
Processing purpose compatibility and broader context
The authorities propose modifying Article 6(4) compatibility assessments for purpose changes. When personal data collected for one purpose will be processed for another purpose, controllers must evaluate whether the new purpose is compatible with the original.
The proposed amendment would add "in particular if it concerns the personal data of a child" to the Article 6(4)(d) factor on possible consequences of further processing. If children's data faces repurposing, compatibility determinations should be more restrictive than for adult data.
Article 40(2)(g) permits associations to regulate education and child protection in codes of conduct, while Article 57(1)(b) tasks supervisory authorities with raising awareness about processing risks with "specific measures for children." The resolution notes these provisions show GDPR's recognition of children's vulnerability but lack systematic implementation.
Marketing implications and enforcement gaps
The resolution carries significant implications for digital advertising and marketing technology. Platforms collecting data from users under 18 face potential restrictions on profiling, behavioral targeting, and automated decision systems even when parental consent is obtained.
Recent GDPR enforcement developments show regulators examining how personal data processing rules apply to artificial intelligence systems and advertising technologies. The European Commission proposed major GDPR amendments in November 2025 addressing AI development and individual privacy rights, though those proposals have not specifically focused on child protection enhancements.
Germany previously pushed for sweeping data protection simplification in an October 2025 policy document emphasizing AI training exemptions and reduced access rights obligations. The November child protection resolution represents a distinct approach, calling for strengthening rather than relaxing regulatory requirements in specific contexts.
Marketing professionals operating social networks, gaming platforms, educational technology services, and other offerings directed at children would face enhanced compliance obligations. Default privacy settings would require design specifically accounting for children's vulnerability, while consent mechanisms for users under 18 would become invalid for profiling and advertising purposes.
The authorities noted that "depending on their level of maturity, children are less able than adults to avoid the risks of their data being processed and to defend themselves against infringements of their fundamental rights." This structural vulnerability justifies stronger protective measures than those required for adult users.
European regulatory coordination and implementation challenges
The Conference of Independent Data Protection Supervisory Authorities of the Federal Government and the Länder represents Germany's federal structure for data protection oversight. The conference includes authorities from individual states and the federal government, coordinating enforcement approaches across jurisdictions.
The European Data Protection Board clarified Digital Services Act compliance on September 11, 2025, addressing how platforms must navigate overlapping DSA and GDPR requirements. The guidance emphasized that children receive enhanced protection under both regulatory frameworks, with Article 28 DSA mandating high privacy, safety, and security levels for minors.
Implementation of the proposed amendments would require action at the European Union level. Individual member state authorities cannot unilaterally modify GDPR, which is directly applicable across all EU member states.
The resolution coincides with broader discussions about GDPR reform. The European Commission's Digital Omnibus initiative includes amendments narrowing personal data definitions and establishing AI development as a legitimate interest for data processing, though child protection enhancements are not part of current Commission proposals.
Platform-specific child protection measures have emerged separately from regulatory changes. Google consolidated five advertising policies into a comprehensive framework in January 2025, restricting sensitive categories and disabling ad personalization for users under the digital age of consent.
The authorities acknowledged that current GDPR protections appear in scattered provisions without systematic implementation. Article 24 of the EU Charter of Fundamental Rights and the UN Convention on the Rights of the Child establish children's special protection needs, but translation into operational data protection rules remains incomplete.
Technical implementation requirements
The proposed amendments create specific technical obligations for controllers processing children's data. Privacy by design requirements under Article 25 would mandate systems architecture accounting for children's vulnerability from initial development stages.
Social networks face particular scrutiny. The resolution notes that platforms with data-driven business models must build fundamental child protection measures into system design. Default privacy settings cannot rely on children recognizing and modifying configurations to protect their interests.
Age verification mechanisms become critical for compliance. Controllers must determine whether data subjects are children to apply appropriate protections. Google began deploying machine learning age detection in the United States during 2025, using signals like search queries and content consumption patterns to identify users who may be minors.
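How a platform decides whether a user should receive child protections is left to controllers; a common pattern combines a declared age with probabilistic age signals and errs toward protection when uncertain. The sketch below illustrates that pattern only; the signal and threshold are assumptions, not a description of Google's system or a requirement in the resolution.

```python
# Illustrative age-assurance gate: apply child protections when the declared age
# or probabilistic age signals indicate a possible minor. The probability input
# and the 0.5 threshold are assumptions for illustration.

def treat_as_child(declared_age: int | None, estimated_minor_probability: float,
                   threshold: float = 0.5) -> bool:
    if declared_age is not None and declared_age < 18:
        return True
    # Err on the side of protection when estimation suggests a likely minor.
    return estimated_minor_probability >= threshold
```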
The UK modernized data protection with new automated decision frameworks in 2025 legislation, though that approach emphasized simplification rather than enhanced child protections. The German authorities' resolution takes the opposite direction, calling for stricter requirements when children's data faces processing.
Data minimization principles gain heightened importance for children's data. Controllers would need to demonstrate particular necessity and proportionality justifications for collecting and processing information from users under 18.
International child protection developments
The resolution emerges amid global regulatory attention to children's online safety. Australia passed the world's strictest social media ban for users under 16 on November 29, 2024, with the prohibition on account creation on major platforms taking effect December 10, 2025.
United States COPPA rules underwent major amendments taking effect June 23, 2025, requiring separate consent for third-party data sharing and enhanced transparency disclosures. Those changes fundamentally altered digital advertising and data collection practices for services directed at children under 13.
The European approach through GDPR amendments differs from jurisdiction-specific legislation. Modifications to the regulation would apply uniformly across all EU member states rather than creating patchwork requirements varying by country.
French authorities established comprehensive AI system development guidelines in July 2025, while German data protection authorities published technical requirements for AI systems across development lifecycles. These frameworks provide practical GDPR compliance guidance as machine learning applications expand.
The resolution authors emphasized practical enforcement challenges. Controllers often fail to recognize or fulfill obligations to consider children's special vulnerability when applying existing GDPR provisions. Explicit regulatory text would reduce ambiguity and improve compliance rates.
Timeline
- November 20, 2025: Conference of Independent Data Protection Supervisory Authorities issues resolution calling for enhanced GDPR child protections
- November 15, 2025: German regulator approves Microsoft 365 for GDPR compliance after three-year negotiation process
- November 2025: European Commission proposes major GDPR amendments through Digital Omnibus initiative addressing AI and data processing
- October 23, 2025: Germany submits comprehensive proposal calling for GDPR modifications including AI training exemptions
- September 11, 2025: European Data Protection Board clarifies DSA compliance for digital marketing and platform operations
- July 22, 2025: French CNIL finalizes AI system development recommendations under GDPR compliance requirements
- July 8, 2025: GDPR record-keeping exemption expands to companies with under 750 employees using risk-based assessment
- June 18, 2025: Research establishes large language models qualify as personal data under European Union privacy regulations
- January 15, 2025: Google tightens advertising rules to protect minors by consolidating five policies into comprehensive framework
Summary
Who: The Conference of Independent Data Protection Supervisory Authorities of the Federal Government and the Länder, representing Germany's coordinated federal and state-level data protection oversight structure, issued the resolution affecting controllers processing children's personal data.
What: A resolution proposing ten specific amendments to the General Data Protection Regulation to strengthen child protections, including prohibiting children's consent for profiling and advertising purposes, enhancing objection rights, requiring explicit consideration in data protection impact assessments, and mandating privacy-by-design measures accounting for children's vulnerability.
When: Published November 20, 2025, the resolution addresses gaps in the current GDPR framework that has been in effect since May 25, 2018, calling for amendments that would require European Union legislative action to implement.
Where: The resolution applies to the European Union's data protection framework, affecting all 27 member states and any controllers or processors handling personal data of children in EU jurisdictions regardless of the organization's physical location.
Why: Children face structural vulnerabilities in understanding long-term consequences of data processing, often cannot recognize or exercise their rights as data subjects, and are particularly susceptible to short-term incentives for using data processing services, requiring enhanced regulatory protections beyond those provided to adults under existing GDPR provisions.