The Bundesverband Digitale Wirtschaft (BVDW), Germany's digital economy association, this month published a position paper calling for a fundamental shift in how Europe approaches the protection of children and young people on the internet. Released on February 10, 2026, to coincide with Safer Internet Day, the nine-page document argues that outright bans on platform access are both ineffective and counterproductive. The organization's central claim is straightforward: digital participation for minors should be protected, not eliminated.
The paper, titled "Jugendschutz im digitalen Raum: Schutz durch Befähigung" - Youth Protection in Digital Spaces: Protection through Empowerment - rests on two pillars. The first is legal harmonization at the European level. The second is media literacy education. Together, the BVDW argues, these form the only realistic foundation for child safety online. What makes the paper notable is not just its policy content, but the public opinion data it draws on.
Survey results challenge the case for bans
A Civey survey, commissioned by the BVDW and conducted from February 3 to 4, 2026, polled 2,500 German adults aged 18 and over. The methodology applied quotas and weighting to ensure representative results, with a stated statistical margin of error of 3.5 percentage points. The question posed was direct: "Which measures do you consider most effective for protecting children and young people on the internet?" Multiple answers were permitted.
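For context, the stated 3.5-point margin is noticeably wider than the roughly 2 points that simple random sampling theory would imply for 2,500 respondents - a gap consistent with the design effect introduced by quotas and weighting. A minimal sketch of that baseline calculation, using the standard formula for a proportion at 95 percent confidence (illustrative only; this is not Civey's actual variance model):

```python
import math

def srs_margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion under simple random sampling.

    p = 0.5 is the worst case (maximum variance); z = 1.96 is the
    95 percent confidence multiplier.
    """
    return z * math.sqrt(p * (1 - p) / n)

n = 2500
moe = srs_margin_of_error(n)
print(f"SRS margin of error for n={n}: {moe * 100:.1f} points")  # ~2.0 points

# The survey's stated 3.5 points implies a design effect from quotas and
# weighting; the effective sample size is roughly n divided by that effect.
deff = (0.035 / moe) ** 2
print(f"Implied design effect: {deff:.2f}, effective n ~ {n / deff:.0f}")
```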
The results are striking. Age-appropriate function adjustments drew support from 58 percent of respondents, making them the most popular option. Promotion of digital competencies followed at 50 percent, and age-appropriate digital offerings at 46 percent. A general ban on use, by contrast, received backing from only 26 percent. Among households with children, that figure fell further, to 21 percent.
According to BVDW President Dirk Freytag, "In a digital world, children and young people need practical, sustainable and adaptable concepts that place their needs at the center and consistently strengthen digital competencies. Instead of unenforceable bans, children and their parents need real solutions. The digital economy is aware of this responsibility."
The survey data provides an empirical backdrop for a debate that has often been dominated by legislative instinct. Politicians across Europe have tended toward restriction; the German public, it appears, leans toward empowerment.
The legal architecture: DSA as a foundation
The BVDW positions the EU's Digital Services Act as the cornerstone of its preferred regulatory framework. Under Article 28 of the DSA, platforms accessible to minors are required to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors on their service. The association argues that consistent application of this article can serve as an effective tool - pointing to the EU Commission's coordinated action against pornography platforms, initiated in May 2025, as evidence that the DSA can deliver concrete results when properly enforced.
The European Commission's July 2025 guidelines on youth protection, developed under Article 28 of the DSA, are also cited approvingly. Those guidelines address risks including grooming, harmful content, problematic and addictive behavior, and harmful commercial practices. The Commission additionally launched an action plan against cyberbullying, which the BVDW describes as "of central importance" because such measures help companies implement the legal framework and ensure consistent interpretation.
The BVDW stops well short of endorsing a patchwork of national rules. As the paper notes, youth protection in the EU has historically been designed at the national level to accommodate differing cultural, linguistic, and institutional conditions. Yet the digital single market increasingly exposes the limits of that approach. Common guidelines and minimum standards would create a coherent level of protection, reduce risk, and improve legal certainty for businesses operating across borders. At the same time, member states should retain flexibility for context-sensitive implementation.
This position fits within the broader DSA enforcement landscape that PPC Land has been tracking since the regulation became fully operational, including the European Commission's February 6, 2026, preliminary findings against TikTok for addictive design features, and the Shein investigation opened on February 17, 2026, which included concerns about child safety.
Age verification: the limits of blunt instruments
Perhaps the most technically detailed section of the BVDW paper concerns age verification. Here, the association navigates carefully between the need for robust protection and the real-world failures of existing systems. The UK's Online Safety Act, which introduced mandatory age controls for numerous platforms, is treated as a cautionary example rather than a model.
According to the BVDW paper, controls introduced under the UK framework can be trivially bypassed through VPN services. VPN providers have reportedly seen usage increases of up to 1,800 percent since the law came into force - a figure the paper cites from a BBC report. Free VPN services, the paper notes, are particularly popular among teenagers and carry their own risks through opaque data collection practices. The French data protection authority CNIL and the US National Institute of Standards and Technology (NIST) are both cited as sharing skepticism about the effectiveness and data protection compliance of common verification techniques.
Methods identified as problematic include biometric approaches, which risk discrimination, and credit card or identity document checks, which can easily be circumvented using parental data. The paper is unambiguous: where age verification is genuinely necessary, it must not become "a technical pseudo-instrument that can easily be circumvented in practice and creates new dangers for data protection and data security."
The DSA's framework is deliberately technology-neutral. Rather than prescribing a specific method, it requires procedures that are "accurate, reliable, robust, and non-discriminatory." The BVDW endorses this approach. Looking ahead, the paper identifies the EU Digital Identity Wallet and national eID systems as promising infrastructure. However, it cautions that these European solutions are not yet ready for widespread deployment, and that technology neutrality must be preserved in the interim.
A dual model is proposed: one track allowing for innovative private-sector solutions developed to European minimum standards, and a second track building on public digital identity infrastructure as it matures. Crucially, the paper flags that teenagers aged 13 to 15, who do not yet hold official identity documents, must not be disadvantaged in digital participation as a result of verification requirements.
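To make the dual model concrete, here is a hedged sketch of how a service might route age checks by risk tier and accepted proof type. All type names, tiers, and thresholds are hypothetical illustrations of the paper's principle, not a prescribed implementation:

```python
from dataclasses import dataclass
from enum import Enum

class RiskTier(Enum):
    LOW = "low"            # general-audience content: no check required
    MODERATE = "moderate"  # social features: graduated, low-friction checks
    HIGH = "high"          # pornography, gambling: strict verification

@dataclass
class AgeProof:
    method: str        # e.g. "eid_wallet", "private_estimator", "self_declared"
    estimated_age: int
    certified: bool    # True for eID- or wallet-backed proofs

def check_passes(tier: RiskTier, proof: AgeProof) -> bool:
    """Decide whether a proof satisfies the tier's requirement.

    Low-risk services stay open; high-risk services demand certified
    adult proof; moderate-risk services accept either track of the
    dual model, so 13- to 15-year-olds without identity documents
    are not locked out of digital participation.
    """
    if tier is RiskTier.LOW:
        return True
    if tier is RiskTier.HIGH:
        return proof.certified and proof.estimated_age >= 18
    return (proof.method in ("eid_wallet", "private_estimator")
            and proof.estimated_age >= 13)
```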
The BVDW's position is consistent with the EDPB's Statement 1/2025, published on February 11, 2025, which laid out ten principles for age assurance under GDPR, including requirements for data minimization, least-intrusive methods, and no additional tracking or profiling. Both the BVDW and the EDPB converge on risk proportionality: mandatory age checks should be confined to contexts where genuine, high-level risks for minors exist - pornography and gambling foremost among them.
For advertising-funded publishers, the paper makes an important intervention. Platforms and editorial outlets that are not specifically directed at children but serve all age groups should not be inadvertently penalized by blanket verification requirements. The BVDW explicitly states that such publishers should retain the ability to finance content through data-protection-compliant and responsible advertising formats. This is a direct concern for the digital advertising industry: overly broad age gates could reduce ad-funded reach and reshape audience segmentation models in ways that go far beyond the stated protective objectives.
PPC Land has covered related developments, including the EU's blueprint for age verification released in July 2025, with pilot testing underway across Denmark, France, Greece, Italy, and Spain, and a mandatory EU-wide deadline at the end of 2026.
Content moderation and platform responsibility
The BVDW paper devotes significant attention to what happens once children and young people are online. Age verification is necessary but not sufficient. According to the position paper, "youth protection in the digital space encompasses not only protection from unauthorized access, but also protection from harmful and unwanted content and from content explicitly intended for adults."
The DSA, in the BVDW's analysis, establishes the first unified European legal framework for this broader challenge. Specifically, it obliges platforms to exclude personalized advertising directed at minors, assess and minimize risks such as manipulation or cybergrooming, and restrict access to illegal or unsuitable content. The paper highlights the DSA's Trust & Safety structures - including rapid "notice-and-action" procedures, low-threshold and transparent reporting mechanisms, and "Trusted Flaggers" capable of accelerating the removal of critical content - as important operational building blocks.
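As an illustration of how "notice-and-action" and Trusted Flagger mechanics might translate into platform tooling, here is a minimal sketch of a moderation queue in which trusted-flagger notices jump the line. The queue structure and field names are assumptions for illustration, not anything the DSA mandates:

```python
import heapq
import itertools
from dataclasses import dataclass, field

@dataclass(order=True)
class Notice:
    priority: int  # lower value = handled sooner
    seq: int       # tie-breaker preserving arrival order
    content_id: str = field(compare=False)
    reason: str = field(compare=False)
    trusted_flagger: bool = field(compare=False, default=False)

class NoticeQueue:
    """Notice-and-action queue: trusted-flagger reports are prioritized."""

    def __init__(self) -> None:
        self._heap: list[Notice] = []
        self._counter = itertools.count()

    def submit(self, content_id: str, reason: str,
               trusted_flagger: bool = False) -> None:
        priority = 0 if trusted_flagger else 1
        heapq.heappush(self._heap, Notice(priority, next(self._counter),
                                          content_id, reason, trusted_flagger))

    def next_notice(self) -> Notice:
        return heapq.heappop(self._heap)

queue = NoticeQueue()
queue.submit("post-123", "hate speech")
queue.submit("post-456", "grooming attempt", trusted_flagger=True)
assert queue.next_notice().content_id == "post-456"  # trusted flagger first
```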
The association also points to existing industry standards: the EU Code of Conduct on Countering Illegal Hate Speech Online, the Code of Practice on Disinformation, and the "Alliance to Better Protect Minors Online." German self-regulatory bodies - FSK (film), FSM (multimedia services), and USK (entertainment software) - are identified as reference points within the national system.
Artificial intelligence receives specific mention. The paper argues that AI-based content filtering is necessary to make moderation increasingly efficient, allowing platforms to detect and remove potentially problematic content before it is viewed. Continuous investment in automated technologies, it suggests, is how platforms can fulfill their protective obligations more effectively at scale.
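A minimal sketch of the pre-publication gate the paper envisions might look like the following. The classifier itself is stubbed out, and the threshold values are illustrative assumptions rather than anything the BVDW specifies:

```python
from typing import Callable

# A real system would call a trained model; this stub returns a
# probability that the content is harmful to minors.
Classifier = Callable[[str], float]

def moderate_before_publish(text: str, classify: Classifier,
                            block_at: float = 0.9,
                            review_at: float = 0.6) -> str:
    """Gate content before it becomes viewable: block it, route it to
    human review, or publish it, depending on the model's risk score."""
    score = classify(text)
    if score >= block_at:
        return "blocked"       # removed before any minor can see it
    if score >= review_at:
        return "human_review"  # ambiguous case: escalate to a moderator
    return "published"

# Usage with a trivial stand-in classifier:
print(moderate_before_publish("hello world", classify=lambda t: 0.05))  # published
```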
The role of AI in content moderation is increasingly consequential for marketers. Platforms that use AI to identify and remove content can inadvertently affect ad adjacency and campaign delivery - a dynamic that PPC Land has tracked in the context of DSA enforcement actions against X, which included concerns about AI-generated content involving minors.
Media literacy as the second pillar
Technology alone cannot protect young people. The BVDW is categorical on this point. According to the position paper, even the most effective technical protection systems and the strictest content moderation cannot guarantee complete protection. Children and young people will encounter problematic content. The question is whether they have the tools to recognize and respond to it.
Digital and media competency is described as the "indispensable second pillar" of sustainable youth protection. That competency should begin as early as possible: the paper calls for preschool and primary school children to be introduced, in an age-appropriate manner, to topics including data protection, cybersecurity, and critical information literacy. Schools are assigned a prominent responsibility, with digital and media competency expected to become a mandatory subject supported by targeted teacher training.
Existing programs receive acknowledgment: national German initiatives "Frag FINN," "Deutschland sicher im Netz," "klicksafe," and "SCHAU HIN!" are cited as providing valuable contributions. At the European level, the "Better Internet for Kids+ (BIK+)" strategy is presented as a coordination framework for age-appropriate design and common minimum standards.
Parents and guardians are also addressed. They face the challenge of explaining a digital world in which they themselves did not grow up. The paper calls for accessible guidance, a solid basic understanding, and practical everyday tools - along with a willingness to actively engage with and accompany their children's media use.
Social media and the participation question
One of the more nuanced passages of the BVDW paper concerns social media specifically. According to the document, approximately 92 percent of 6- to 18-year-olds in Germany use social networks regularly. These platforms are not merely entertainment; they serve functions of identity development, political education, and participation in social processes. For marginalized groups and minorities, digital platforms can be the only accessible space for forms of participation that are unavailable or restricted offline.
This context shapes the BVDW's position on age-differentiated access. Rather than endorsing blanket bans - as Australia did in December 2025 when it blocked children under 16 from major platforms - the paper argues for graduated protective functions within social media. Reduced features for younger users, such as limited chat and contact options, restricted profile visibility, and stricter interaction defaults, could reduce risks without cutting off access entirely. Older teenagers would progressively gain more autonomy within defined boundaries.
The principle is that social media should provide "safe development spaces" calibrated to users' needs and risk profiles - not exclusion, but structured access. The German industry association's position stands in contrast to the direction of Australian policy and the emerging legislative trend in several US states, where access restrictions have become politically popular. Whether European regulators will follow the German industry's preferred trajectory or tighten restrictions further remains an open question.
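A hedged sketch of what such graduated defaults could look like as configuration follows; the age bands and feature flags are invented for illustration, since the paper argues for graduated access without prescribing specific thresholds:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FeatureDefaults:
    chat_with_strangers: bool
    public_profile: bool
    personalized_recommendations: bool
    daily_use_reminder: bool

# Hypothetical age bands: protections relax as users get older,
# matching the paper's idea of progressively growing autonomy.
DEFAULTS_BY_BAND = {
    (13, 15): FeatureDefaults(chat_with_strangers=False, public_profile=False,
                              personalized_recommendations=False,
                              daily_use_reminder=True),
    (16, 17): FeatureDefaults(chat_with_strangers=False, public_profile=True,
                              personalized_recommendations=False,
                              daily_use_reminder=True),
    (18, 120): FeatureDefaults(chat_with_strangers=True, public_profile=True,
                               personalized_recommendations=True,
                               daily_use_reminder=False),
}

def defaults_for(age: int) -> FeatureDefaults:
    for (lo, hi), defaults in DEFAULTS_BY_BAND.items():
        if lo <= age <= hi:
            return defaults
    raise ValueError(f"No feature band defined for age {age}")
```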
The prohibition on personalized advertising directed at minors under the DSA is directly relevant to the advertising market. Google's January 2025 consolidation of advertising policies for children and teens and its July 2025 rollout of machine learning age detection in the US both reflect platforms adapting to these requirements. Any further tightening of the regulatory framework around minors' data and advertising will directly affect campaign targeting, publisher revenue, and audience segmentation across the European market.
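In ad-serving terms, the DSA obligation reduces to a hard gate in the request path: if a user is known or inferred to be a minor, profiling-based targeting signals must be dropped before the auction. A simplified sketch, with field names assumed for illustration:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class AdRequest:
    user_id: str
    inferred_minor: bool  # e.g. from a declared age or an ML estimate
    interest_segments: tuple[str, ...] = ()
    contextual_keywords: tuple[str, ...] = ()

def sanitize_for_dsa(req: AdRequest) -> AdRequest:
    """Strip profiling-based targeting for minors (DSA Article 28(2));
    contextual signals about the page, not the person, may remain."""
    if req.inferred_minor:
        return replace(req, interest_segments=())
    return req

req = AdRequest("u1", inferred_minor=True,
                interest_segments=("sports", "gaming"),
                contextual_keywords=("football",))
print(sanitize_for_dsa(req).interest_segments)  # () - no personalization
```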
Why this matters for marketers and digital publishers
The BVDW paper, though framed as a youth protection document, carries material consequences for the advertising and publishing industries. Several of its demands - for risk-proportionate age verification, for advertising-funded editorial platforms to be exempted from blanket verification mandates, for technology neutrality, and for continued room for private-sector innovation - amount to a defense of the ad-supported internet model against regulatory overreach.
Germany is the largest digital advertising market in the EU. When its primary digital trade body publishes detailed positions on the structure of age verification, content moderation obligations, and the scope of personalized advertising restrictions, those positions carry weight in Brussels. The paper was released precisely as the European Commission is demonstrating a willingness to act aggressively under the DSA - through the TikTok addictive design case, the Shein investigation, and the ongoing X proceedings. In that context, the BVDW's intervention is also a calibration exercise: trying to ensure that enforcement energy targets genuine harms rather than pulling down structures that support legitimate digital commerce.
Timeline
- 1989 - UN Convention on the Rights of the Child adopted, establishing international foundation for child protection including rights to participation (Art. 13 UN-CRC) and protection (Art. 17 UN-CRC)
- February 17, 2024 - Digital Services Act becomes fully operational for all EU platforms
- February 11, 2025 - European Data Protection Board adopts Statement 1/2025 on age assurance and GDPR-compliant age verification principles
- May 2025 - EU Commission initiates coordinated action against pornography platforms lacking effective age verification measures, as cited in the BVDW position paper
- July 2025 - European Commission publishes guidelines on youth protection under DSA Article 28, addressing grooming, harmful content, and addictive behavior
- July 14, 2025 - European Commission releases EU age verification blueprint; pilot testing begins in Denmark, France, Greece, Italy, and Spain
- July 30, 2025 - Google begins machine learning age detection rollout in the US, disabling ad personalization for users identified as likely under 18
- October 24, 2025 - European Commission preliminarily finds TikTok and Meta in breach of DSA transparency obligations
- February 3-4, 2026 - Civey conducts survey of 2,500 German adults for BVDW on child protection measures; results show only 26 percent support general bans
- February 6, 2026 - European Commission issues preliminary findings against TikTok for addictive design features under the DSA
- February 10, 2026 - BVDW publishes position paper "Jugendschutz im digitalen Raum: Schutz durch Befähigung" on Safer Internet Day
- February 17, 2026 - European Commission opens formal DSA proceedings against Shein over child safety failures and addictive design
- End of 2026 - EU Digital Identity Wallet mandatory implementation deadline across all member states
Summary
Who: The Bundesverband Digitale Wirtschaft (BVDW), Germany's digital economy association, with a survey conducted by Civey among 2,500 German adults, and public statements from BVDW President Dirk Freytag.
What: The BVDW published a nine-page position paper calling for risk-proportionate age verification, European legal harmonization under the DSA, and media literacy education as the two pillars of effective online youth protection - rejecting general platform bans as ineffective. Survey results showed that only 26 percent of the German public supports outright bans, while 58 percent favor age-appropriate platform functions and 50 percent back promotion of digital skills.
When: The position paper was released on February 10, 2026, on Safer Internet Day. The underlying Civey survey was conducted from February 3 to 4, 2026.
Where: The paper was published by the BVDW in Germany and directed at the European policy debate, particularly the EU's DSA enforcement framework and the development of European minimum standards for age verification.
Why: The BVDW acted to shape European regulatory direction at a moment of intensifying DSA enforcement, seeking to ensure that youth protection measures remain proportionate, technology-neutral, and compatible with the interests of advertising-funded digital publishers. The association argues that empowering children through media literacy and structured digital access is more effective than exclusion.