The Federal Trade Commission on February 25, 2026 published an enforcement policy statement that grants a conditional shield to operators of mixed-audience and general-audience websites that collect personal data from children for the sole purpose of determining their age. The statement does not amend the Children's Online Privacy Protection Act or its implementing rule, but it does describe a set of circumstances under which the Commission will decline to pursue enforcement - a significant practical distinction for companies building or deploying age-verification products in the United States.

COPPA's regulatory background

Congress enacted COPPA in 1998 to protect children's privacy and to empower parents to control the collection of personal information from their children under 13. The FTC issued its first implementing rule on November 3, 1999. The rule was substantively revised in 2013 and again in 2025, with the most recent amendments - published in the Federal Register on April 22, 2025 - taking effect on June 23, 2025 and carrying a compliance deadline of April 22, 2026. Those amendments introduced, among other things, stricter requirements on consent for third-party data sharing involving children's data and an expanded definition of child-directed services.

The core problem the February 2026 statement addresses is straightforward. COPPA requires operators to obtain verifiable parental consent before collecting personal information from a child. Age-verification mechanisms themselves require collecting information - often including biometric estimates, identity signals, or behavioral inference - to determine whether a given user is a child in the first place. That creates a legal paradox: an operator cannot know whether parental consent is needed without first gathering data that may itself be covered by the very law requiring consent.

According to the policy statement, the Commission acknowledges this tension directly. A growing number of age-verification mechanisms now allow operators to determine age "more reliably than a user-provided response to an age-gating function." Simple self-declaration - typing a birth year into a form - has attracted widespread criticism for being, as one source cited in the policy document put it, "insufficient in terms of accuracy, very easy to circumvent, and clearly inadequate and inappropriate for use in high-risk situations." The Commission flagged support for more reliable alternatives in its January 2024 Notice of Proposed Rulemaking, and the February 2026 statement formalizes that support as enforcement policy.

What the statement actually says

The statement applies to two categories of operators. The first is what the FTC calls "mixed audience" websites - those directed to children but not targeting children as their primary audience. The second is general-audience sites or services that may encounter child users without specifically seeking them. Operators whose services are primarily directed to children receive no relief; according to the document, those operators must continue to treat all users as children and apply COPPA's full protections accordingly.

For the two covered categories, the Commission will not bring an enforcement action under the COPPA Rule where an operator collects, uses, or discloses personal information for the purpose of determining a user's age - what the statement calls "Age Verification Purposes" - without first obtaining verifiable parental consent. But the relief is conditional, and the conditions are specific.

The operator must not use or disclose information collected for Age Verification Purposes for any purpose other than age verification. It must share that information only with third parties that it has vetted for their ability to maintain confidentiality, security, and integrity, and must obtain written assurances that those third parties will not repurpose the data and will delete it promptly after the verification is complete. The operator itself must delete the information as soon as the verification purpose is fulfilled - the statement specifies that retention must not extend beyond "the period necessary to fulfill the Age Verification Purposes."

Beyond data handling, the operator must provide clear notice to parents and children of what is being collected for age verification purposes, publish that notice in its privacy policy, and employ reasonable security safeguards. It must also take reasonable steps to ensure that whatever product, service, method, or third party it uses for age verification is likely to produce reasonably accurate results. Finally, and critically, the enforcement shield applies only if the operator is already complying with every other requirement of the COPPA Rule with respect to personal information collected from children.

The statement defines age-verification tools broadly. According to the document, "age verification" as used in the statement refers to three categories of tools: age estimation tools that estimate a user's age or age range; age-verification tools that verify a user's age; and age inference tools that infer a user's likely age or age range based on various signals.

The Commission is explicit that the statement "does not create any substantive rights or entitlements," and that it "retains the right to investigate and bring actions for violations of the COPPA Rule in individual cases." The shield is prosecutorial discretion, not legal immunity. The statement will remain in force until the Commission publishes final rule amendments addressing age-verification mechanisms in the Federal Register, or until it is otherwise withdrawn.

The regulatory process going forward

According to the policy statement, the Commission intends to initiate a formal review of the COPPA Rule in the coming months, specifically to address age-verification mechanisms. This suggests the February 2026 statement is a bridge measure - designed to reduce compliance uncertainty while a more permanent regulatory framework is developed. The FTC revised the COPPA Rule in 2025 after six years of rulemaking; a further focused rulemaking on age verification would represent the third substantive revision since the original 1999 rule.

The timing matters. The April 22, 2026 compliance deadline for the 2025 COPPA amendments is less than two months away. Operators who have spent the past year restructuring their data collection and consent practices to meet those amendments now face an additional policy layer. The February statement tries to remove a specific barrier to age-verification adoption, but it also adds a new compliance checklist - written third-party assurances, documented deletion timelines, updated privacy notices, and accuracy assessments for the verification tools themselves.

State-level pressure and the industry context

The FTC's move comes against a backdrop of state-level legislative activity that has been accelerating for several years. According to the policy statement, various states have enacted laws requiring the use of age-verification mechanisms to access social media platforms and other content. The document cites data from the Age Verification Providers Association cataloguing state-level age assurance laws for both social media and adult content. That patchwork of state laws has created inconsistent obligations for platforms operating nationally.

The IAB raised concerns as early as March 2024 that overly broad COPPA requirements could push services to either stop catering to children or collect more data than necessary to achieve compliance - an argument that pointed in the same direction as the FTC's new enforcement statement, though from a different angle. The Commission's February 2026 position essentially accepts that some data collection during age verification is a necessary cost of protecting children at scale, provided that data is firewalled from other uses.

Platform-level responses to age-verification requirements have varied considerably. X implemented an age-assurance system in July 2025 requiring users to prove they are over 18 to access sensitive content, using third-party processors including Au10tix, Persona, and Stripe - though the system attracted criticism for being placed behind a premium subscription paywall. Bluesky announced in August 2025 that it would block access from Mississippi IP addresses rather than comply with that state's Walker Montgomery Protecting Children Online Act, citing potential penalties of $10,000 per user for non-compliance. Google Search began rolling out age-verification prompts in August 2025, extending verification requirements from YouTube to its primary search interface.

Internationally, the regulatory picture has grown more complex still. The European Commission published technical specifications in July 2025 for an EU-wide age-verification system tied to the Digital Services Act, with full implementation targeted by end of 2026. The UK's Online Safety Act, which imposed mandatory age checks for adult content platforms, triggered a 1,400% surge in VPN registrations when its enforcement provisions activated. Germany's BVDW digital economy association published a position paper in February 2026 calling for risk-proportionate age verification confined to platforms presenting genuine risks for minors - arguing against blanket verification mandates that could disadvantage advertising-funded publishers.

FTC enforcement activity in context

The FTC has not been passive on children's privacy during this period. The Commission sued robot toy maker Apitor in September 2025 for alleged COPPA violations stemming from the app's integration of JPush, a software development kit that collected and shared location data from child users. Disney faced a $10 million penalty in September 2025 for improperly labeling child-directed videos on YouTube and enabling targeted advertising to users under 13. TikTok faced an FTC lawsuit in August 2024 for alleged collection of personal data from children under 13 without proper parental consent.

The February 2026 policy statement sits alongside that enforcement record. It is not a relaxation of COPPA enforcement broadly, but a targeted incentive for a specific class of activity - age verification - that the Commission wants to encourage. The logic is that more accurate age determinations enable operators to apply child-protection measures more precisely, protecting more children. Operators that cannot reliably distinguish children from adults end up either treating all users as children - restricting services unnecessarily - or treating none as children and potentially exposing young users to data practices they should be shielded from.

What this means for the marketing and advertising community

For digital marketers and advertising technology practitioners, the statement has several practical implications. Mixed-audience platforms - which include a significant portion of the ad-supported internet - now have a clearer path to implementing age-verification technology without triggering COPPA enforcement for the verification process itself. That could accelerate the adoption of tools that produce more reliable audience-age signals, which in turn affects how campaigns are segmented, targeted, and measured across platforms that serve both adult and child users.

The data-use restrictions in the statement are strict, however. Information collected for Age Verification Purposes cannot be used for any other purpose - including, by implication, advertising targeting, audience enrichment, or behavioral profiling. Operators and their ad-technology partners will need to build technical and contractual firewalls between age-verification data flows and data flows used for commercial purposes. The written-assurance requirement for third parties adds a contractual layer to standard data-processing agreements, and the deletion requirement means that age-verification data cannot be retained in data lakes or data clean rooms for subsequent analysis.
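As an illustration only - not a pattern endorsed by the FTC or drawn from the statement itself - the purpose-limitation and deletion conditions point toward keeping age-verification signals in a store that tags every read with a declared purpose, refuses any purpose other than verification, and deletes the underlying data once verification completes. The class and method names below are hypothetical:

```python
import time


class AgeVerificationStore:
    """Illustrative purpose-limited store: verification signals are
    readable only for the age-verification purpose and are deleted as
    soon as verification completes (or a short TTL expires), so they
    never flow into ad-targeting or enrichment pipelines."""

    PURPOSE = "age_verification"

    def __init__(self, ttl_seconds: float = 3600):
        # user_id -> (signals, expiry timestamp)
        self._records: dict[str, tuple[dict, float]] = {}
        self._ttl = ttl_seconds

    def put(self, user_id: str, signals: dict) -> None:
        # Store raw verification signals with a deletion deadline.
        self._records[user_id] = (signals, time.monotonic() + self._ttl)

    def read(self, user_id: str, purpose: str) -> dict:
        # Firewall: any non-verification purpose (targeting, audience
        # enrichment, behavioral profiling) is refused outright.
        if purpose != self.PURPOSE:
            raise PermissionError(f"purpose {purpose!r} not permitted")
        signals, expires = self._records[user_id]
        if time.monotonic() > expires:
            del self._records[user_id]
            raise KeyError(user_id)
        return signals

    def complete(self, user_id: str, is_child: bool) -> bool:
        # Keep only the boolean outcome; delete the underlying signals
        # once the verification purpose is fulfilled.
        self._records.pop(user_id, None)
        return is_child
```

In practice the same separation would be enforced at the infrastructure level - separate datasets, access controls, and retention policies - and mirrored contractually in the written assurances the statement requires from third-party verification providers; the sketch simply shows the shape of the control.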

The accuracy requirement is also notable. Operators must take "reasonable steps to determine that any product, service, method, or third party utilized for Age Verification Purposes is likely to provide reasonably accurate results." That standard invites scrutiny of the verification tools themselves - their false-positive rates, demographic biases, and technical methodologies. Vendors selling age-estimation or age-inference products into the US market will face pressure to document and demonstrate accuracy, and operators selecting those vendors will need due-diligence processes that go beyond standard procurement.

The ICO's fine of £247,590 against MediaLab for failing to implement any age-assurance measures on Imgur - announced on February 5, 2026, just weeks before the FTC statement - illustrates what regulators on both sides of the Atlantic consider the baseline expectation: that platforms accessible to children must have some mechanism in place. The FTC's statement sets a different kind of floor for the US market: not just that verification must happen, but that when it does, the data collected must be protected, limited, and deleted.

Timeline

  • 1998 - Congress enacts COPPA, directing the FTC to promulgate implementing regulations
  • November 3, 1999 - FTC issues the first Children's Online Privacy Protection Rule
  • January 17, 2013 - FTC substantively revises the COPPA Rule for the first time
  • January 11, 2024 - FTC publishes Notice of Proposed Rulemaking for COPPA Rule amendments, supporting development of age-verification mechanisms (PPC Land)
  • March 17, 2024 - IAB raises concerns about proposed COPPA changes, warning they could harm children's online access (PPC Land)
  • August 2024 - FTC sues TikTok for alleged COPPA violations involving data collection from children under 13
  • February 11, 2025 - European Data Protection Board adopts Statement 1/2025 establishing ten GDPR-compliant principles for age assurance systems
  • April 22, 2025 - FTC publishes comprehensive COPPA Rule amendments in the Federal Register; rule takes effect June 23, 2025 (PPC Land)
  • June 13, 2025 - Google's Global Director of Privacy Safety criticizes Meta's age-verification proposal as creating unnecessary risks for children (PPC Land)
  • July 10, 2025 - Bluesky announces age-verification implementation for UK users under the Online Safety Act (PPC Land)
  • July 14, 2025 - European Commission publishes technical specifications for an EU-wide age-verification system (PPC Land)
  • July 25-27, 2025 - UK Online Safety Act enforcement triggers a 1,400% surge in VPN registrations (PPC Land)
  • July 26, 2025 - X implements age-verification system behind premium paywall for UK and EU users (PPC Land)
  • August 15-18, 2025 - Google Search rolls out age-verification prompts to US users (PPC Land)
  • August 16, 2025 - NextDNS launches DNS-level bypass for age-verification systems (PPC Land)
  • August 22, 2025 - Bluesky blocks Mississippi users rather than comply with the Walker Montgomery Protecting Children Online Act (PPC Land)
  • September 2025 - FTC sues Apitor for COPPA violations involving third-party SDK data sharing from children's app (PPC Land)
  • February 5, 2026 - UK ICO fines MediaLab £247,590 for failing to implement age-assurance measures on Imgur (PPC Land)
  • February 10, 2026 - Germany's BVDW publishes position paper calling for risk-proportionate age verification and media literacy, arguing against blanket mandates (PPC Land)
  • February 25, 2026 - FTC publishes Enforcement Policy Statement Promoting the Adoption of Age-Verification Technology, pledging not to bring COPPA enforcement cases against operators collecting children's data solely for age-verification purposes, subject to strict conditions
  • April 22, 2026 - Compliance deadline for 2025 COPPA Rule amendments

Summary

Who: The Federal Trade Commission, acting as the primary US enforcement body for COPPA, issued the policy statement. It applies to operators of mixed-audience websites and general-audience sites or services that collect personal data from children as part of an age-verification process.

What: An enforcement policy statement declaring that the FTC will not bring COPPA enforcement actions against covered operators who collect children's personal data for age-verification purposes, provided those operators meet a set of strict conditions around data use, third-party disclosure, retention, notice, security, and tool accuracy.

When: The statement was published on February 25, 2026. It remains effective until the FTC publishes final rule amendments on age-verification in the Federal Register, or until the statement is otherwise withdrawn. A COPPA Rule review specifically addressing age verification is expected to begin in the coming months.

Where: The statement applies in the United States. It was issued by the Federal Trade Commission under the authority of COPPA, 15 U.S.C. 6501 et seq., and the FTC's COPPA Rule.

Why: Self-declaration age-gating has proven unreliable and easy to circumvent. The FTC wants to encourage adoption of more technically robust age-verification mechanisms that can determine a user's age more accurately. More accurate determinations allow operators to apply child-protection measures to the right users, protecting more children. At the same time, the verification process itself requires data collection that could technically trigger COPPA liability - a paradox the statement is designed to resolve, without creating a broad exemption that could weaken children's privacy protections elsewhere.
