YouTube's community team published a formal FAQ on April 22, 2026 - the same day the revised COPPA compliance deadline arrived - explaining precisely why every creator on the platform must classify their content as either designed for children or not. The post, authored by a YouTube community manager named Jean-Baptiste, appeared in the YouTube Help Community forum and covers seven questions ranging from the legal obligation itself to what happens when machine learning systems disagree with a creator's own classification.
The timing is not coincidental. April 22, 2026 was the compliance deadline set by the Federal Trade Commission for operators to implement the 2025 amendments to the Children's Online Privacy Protection Rule. Those amendments took effect on June 23, 2025, following publication in the Federal Register on April 22, 2025, and represented the most significant changes to US children's online privacy law in over a decade.
Why every creator is affected
The requirement is not limited to channels that produce cartoons or toy unboxings. According to the YouTube Help documentation included in the FAQ materials, creators who do not make videos for children must still confirm that their audience is adult. The platform's position is explicit: "Even creators who do not design videos for children must define their audience."
This universal obligation flows from YouTube's 2019 settlement with the FTC and the New York Attorney General, which required the platform to implement the audience classification tool inside YouTube Studio. The settlement addressed COPPA violations involving the collection of personal data from children without adequate parental consent. That enforcement moment reshaped how the platform handles children's content, and the April 2026 compliance deadline for the 2025 rule amendments marks the next formal checkpoint in that process.
According to the FTC's own COPPA compliance guidance, the rule applies to operators of commercial websites and online services directed to children under 13 that collect, use, or disclose personal information from children - and also applies to general-audience platforms with actual knowledge that they are collecting data from children under 13. Congress originally enacted COPPA in 1998. The Commission's original rule became effective on April 21, 2000, and it was substantially amended on January 17, 2013, with that amended version taking effect July 1, 2013.
Two ways to set the audience parameter
The YouTube Studio interface offers creators two distinct levels at which the audience setting can be applied. The first is at the channel level, which sets a uniform classification for all existing and future videos on that channel. According to the platform documentation, this option suits creators whose entire output follows a consistent line - either entirely for children or entirely not. The second option applies the setting video by video, which is appropriate when only some content on a channel targets younger viewers.
The audience-setting tool for third-party applications and the YouTube API Services was still listed as pending availability when the help page was last updated. In the interim, the platform advised creators to upload children's content directly through YouTube Studio.
What changes when content is classified as designed for children
The functional consequences of a "Made for Kids" (designed for children) classification are substantial - and directly relevant to anyone buying or selling advertising on the platform.
Personalized advertising is disabled on content marked as designed for children. According to the platform documentation, "in compliance with the Children's Online Privacy Protection Act (COPPA) and/or other applicable laws, we do not display personalized ads on children's content." The platform acknowledged this may result in reduced revenue for creators producing this type of content. For YouTube's monetization ecosystem, which has paid creators, media companies, and music partners $70 billion over three years, the advertising revenue loss from children's content classification is a meaningful constraint. The YouTube Shopping affiliate program likewise excludes channels classified as Made for Kids, as well as channels where a significant share of content carries that classification.
Interactive features are also affected. Comments, live chat, notifications, and the ability to add videos to playlists are all disabled or limited on children's content. The rationale is data minimization: the platform limits the collection and use of data from children's content to comply with COPPA and other applicable laws.
Visibility and recommendation patterns change as well. Videos classified as designed for children become more likely to be recommended alongside other youth-oriented content, which may or may not serve a creator's strategic objectives depending on their target audience.
Machine learning as a secondary enforcement layer
YouTube does not rely solely on creator declarations. According to the platform's documentation, machine learning tools are deployed to identify videos that clearly target a young audience. The platform states it counts on creators to classify their content correctly, but reserves the right to modify audience settings in cases of error or misuse.
This automated layer has direct legal significance. If a creator intentionally misclassifies content - labeling a children's video as not designed for children in order to preserve access to personalized advertising - they face potential sanctions from YouTube as well as legal exposure under COPPA and from the FTC. According to the FTC's COPPA FAQ document, civil penalties can reach $53,088 per violation. In some enforcement actions the FTC has sought no civil penalty; in others, penalties have reached millions of dollars, with the exact amount depending on factors including the number of children affected, the type and volume of personal information collected, and whether it was shared with third parties.
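The per-violation figure cited above makes rough exposure arithmetic straightforward. A minimal sketch, using the $53,088 statutory maximum from the FTC's COPPA FAQ and a hypothetical violation count (the FTC rarely seeks the maximum, and actual penalties depend on the factors listed above):

```python
# Rough ceiling on COPPA civil-penalty exposure: the statutory maximum
# per violation times the number of violations. The violation count
# below is hypothetical; real penalties depend on factors such as the
# number of children affected and the data collected and shared.
MAX_PENALTY_PER_VIOLATION = 53_088  # USD, figure cited in the FTC's COPPA FAQ

def max_exposure(violations: int) -> int:
    """Upper bound on civil-penalty exposure for a given violation count."""
    return violations * MAX_PENALTY_PER_VIOLATION

# A hypothetical channel with 250 misclassified videos, each counted
# as one violation, faces a theoretical ceiling in the millions:
print(max_exposure(250))  # → 13272000
```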
Once YouTube's systems themselves set a video's audience to Made for Kids, creators cannot modify the audience parameter. They retain the right to appeal through YouTube Studio if they believe the automated classification is in error.
How the FTC defines child-directed content
The FTC's own compliance guidance, published as a detailed FAQ document for businesses and small entities, sets out multiple factors for determining whether an online service is directed to children. These include subject matter, visual content, the use of animated characters or child-oriented activities, music or other audio content, the age of models, the presence of celebrities who appeal to children, language characteristics, whether advertising on the service targets children, and empirical evidence about audience composition.
A video does not become child-directed simply because some children watch it. However, according to the FTC guidance, if a creator's intended audience includes children under 13 - even as only a portion of a larger intended audience - COPPA requirements apply when personal information is being collected from those users. Traditionally adult-subject-matter categories such as employment, personal finance, or politics are generally outside COPPA's scope. Content involving toys, dress-up games, or child-oriented activities is more likely to trigger classification as child-directed.
The FTC also distinguishes between "mixed audience" services - those directed to children where children are not the primary audience - and services primarily directed to children. For the former, a neutral age screen may be deployed. For the latter, the platform must treat all users as children.
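The "neutral age screen" concept can be sketched in code. The logic below is purely illustrative (not taken from the FAQ or the FTC guidance): a neutral screen asks for age without hinting at a qualifying answer, and this sketch models only what happens after the answer, routing under-13 users into a restricted session rather than a personalized one.

```python
from dataclasses import dataclass

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13

@dataclass
class SessionPolicy:
    personalized_ads: bool
    data_collection: bool

def neutral_age_screen(declared_age: int) -> SessionPolicy:
    """Illustrative post-screen routing for a mixed-audience service.

    A neutral age screen asks for age without suggesting a 'correct'
    answer; this function only models the resulting session policy.
    """
    if declared_age < COPPA_AGE_THRESHOLD:
        # Under-13 users get a restricted session: no personalized ads
        # and no personal-data collection absent verifiable parental consent.
        return SessionPolicy(personalized_ads=False, data_collection=False)
    return SessionPolicy(personalized_ads=True, data_collection=True)
```

A service primarily directed to children would skip the screen entirely and apply the restricted policy to every user, per the distinction the FTC draws above.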
How to change the setting on Android
The FAQ from YouTube's community team includes step-by-step instructions for modifying the audience parameter via the YouTube Studio Android application. According to the documentation, a creator opens the YouTube Studio app, navigates to the Content tab, selects the relevant video, taps the Edit icon (represented by a pencil), taps Audience, then chooses between "Yes, it's made for kids" and "No, it's not made for kids." The final step is saving the selection. A parallel process exists through the main YouTube application for Android: from the profile photo, a creator navigates to "Your videos," selects the video using the More icon, taps Audience, makes the selection, and saves.
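Although the FAQ notes the API audience-setting tool was pending, the YouTube Data API v3 does document `status.madeForKids` (the effective classification) and `status.selfDeclaredMadeForKids` (the creator's declaration) on video resources - worth verifying against the current API reference. A minimal sketch of reading those fields; the response payload here is hand-built for illustration, not real API output:

```python
import json

# Hypothetical (hand-built) excerpt of a YouTube Data API v3
# videos.list response. The status.madeForKids and
# status.selfDeclaredMadeForKids fields are documented on the video
# resource; confirm field names against the current API reference.
response_body = json.dumps({
    "items": [
        {
            "id": "abc123",
            "status": {
                "madeForKids": True,               # effective classification
                "selfDeclaredMadeForKids": False,  # creator's own setting
            },
        }
    ]
})

def audience_overridden(video: dict) -> bool:
    """True when the effective classification differs from the creator's
    declaration, i.e. YouTube's systems likely overrode the setting."""
    status = video["status"]
    declared = status.get("selfDeclaredMadeForKids")
    return declared is not None and declared != status["madeForKids"]

video = json.loads(response_body)["items"][0]
print(audience_overridden(video))  # → True
```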
The enforcement context
The April 22, 2026 compliance deadline sits at the center of a broader enforcement environment that has been tightening for years. The FTC gave age verification technology a conditional COPPA enforcement shield in a February 25, 2026 policy statement, offering mixed-audience and general-audience operators conditional protection when collecting data solely for age-verification purposes - but that relief does not extend to services primarily directed to children.
The Disney settlement announced in September 2025 illustrates the commercial and legal stakes. Disney agreed to pay $10 million after the FTC determined the company had systematically failed to designate child-directed content as "Made for Kids" on YouTube. The violations resulted from a corporate policy that applied channel-level designations rather than reviewing individual videos, causing child-directed content to appear on channels labeled for general audiences - and therefore eligible for targeted advertising it should not have carried. The case demonstrated that channel-level classification alone is insufficient when individual videos may target children regardless of the broader channel's orientation.
Earlier, the IAB had warned against FTC rule changes that it argued could harm children's online access by making compliance so complex that services would withdraw from children's audiences entirely. The trade body also raised concerns about proposals to treat screen names and avatars as inherently identifying information. The FTC pressed ahead with the 2025 amendments regardless.
The FTC's 2026-2030 Strategic Plan, published on April 3, 2026, identified children's online privacy as a sustained institutional priority through the end of the decade. The plan was approved by a 2-0 Commission vote. Its publication less than three weeks before the COPPA compliance deadline signals that enforcement attention in this area is not declining.
The broader regulatory picture extends beyond the United States. The ICO fined MediaLab - owner of Imgur - £247,590 in February 2026 for failing to implement age checks on the platform between September 2021 and September 2025. UK law requires that services relying on consent as a lawful basis for processing children's data must obtain that consent from a parent or carer. MediaLab had no parental consent mechanism in place during the period examined.
Why this matters for the marketing community
The audience classification system sits at a fulcrum point between creator economics, advertiser access, and regulatory compliance. When a creator correctly labels content as designed for children, the content becomes ineligible for personalized advertising. That affects not only the creator's revenue but also the advertising inventory available to brands running campaigns on YouTube. Marketers planning buys across children-adjacent content categories - toys, family entertainment, educational services - must account for the fact that relevant inventory is either already classified Made for Kids, or is at risk of being reclassified by YouTube's machine learning systems if it appears to target children but has not been labeled as such.
The YouTube Kids platform, which marked its tenth anniversary in 2025, operates under stricter advertising rules than standard YouTube - all ads must receive pre-approval from YouTube's policy team, and multiple categories including food and beverages and beauty products are prohibited. The structural separation between YouTube Kids and the main platform is itself a product of the COPPA settlement, and the audience-classification requirement for creators on the main platform is the mechanism that prevents that separation from being circumvented.
For performance marketers and programmatic buyers, the audience classification data feeding into advertising systems determines which inventory is eligible for interest-based targeting and which is not. Misclassification - in either direction - corrupts those signals. A children's video incorrectly labeled as not for children exposes an advertiser to reputational and legal risk. A general-audience video incorrectly labeled as children's content removes it from the pool of targetable inventory, reducing its monetization value.
The thespend.net community covers the financial dimensions of digital advertising, and the children's content classification question is increasingly relevant there too. Revenue per thousand impressions on children's content - where only non-personalized ads run - is structurally lower than on general-audience content where full targeting is available. As COPPA enforcement raises the cost of misclassification, the incentive structure for some creators to mislabel content in order to preserve CPM rates becomes more visible, even as the legal risk of doing so increases.
Timeline
- 1998 - US Congress enacts the Children's Online Privacy Protection Act (COPPA)
- April 21, 2000 - FTC's original COPPA Rule becomes effective
- January 17, 2013 - FTC publishes amended COPPA Rule; amendments take effect July 1, 2013
- November 2019 - FTC publishes guidance to help YouTube creators determine whether content is designed for children
- 2019 - YouTube and Google reach settlement with FTC and New York Attorney General over COPPA violations; YouTube's audience classification tool introduced
- March 17, 2024 - IAB warns against proposed FTC COPPA rule changes
- January 2025 - Google consolidates advertising policies for children and teens, implementing comprehensive protections across YouTube, Google Display Ads, and Display & Video 360
- April 22, 2025 - FTC publishes comprehensive COPPA Rule amendments in the Federal Register
- June 23, 2025 - Revised COPPA rules take effect, introducing stricter consent requirements for third-party data sharing
- September 2, 2025 - Disney agrees to $10 million settlement for failing to properly classify children's content as Made for Kids on YouTube
- September 7, 2025 - FTC sues Apitor over COPPA violations involving geolocation data from children's robot toy app
- September 20, 2025 - YouTube Kids marks its 10-year milestone
- February 5, 2026 - ICO fines MediaLab £247,590 for failing to protect children on Imgur
- February 25, 2026 - FTC publishes enforcement policy statement granting conditional shield for age-verification technology operators
- April 3, 2026 - FTC publishes FY 2026-2030 Strategic Plan, naming children's online privacy a sustained institutional priority
- April 22, 2026 - COPPA 2025 amendment compliance deadline; YouTube community manager Jean-Baptiste publishes formal FAQ on audience classification requirements
Summary
Who: YouTube content creators worldwide, the platform's machine learning enforcement systems, the FTC, and advertisers running campaigns on YouTube inventory that may include or exclude children's content.
What: YouTube published a formal FAQ on April 22, 2026 clarifying the legal obligation for all creators to classify their content as either designed for children or not, explaining the technical consequences of that classification - including the loss of personalized advertising and the disabling of comments, live chat, and notifications - and describing how machine learning systems may override a creator's own setting when errors or misuse are detected.
When: The FAQ was published on April 22, 2026, the compliance deadline for the 2025 amendments to the FTC's COPPA Rule. The underlying legal framework dates to COPPA's enactment in 1998 and YouTube's 2019 FTC settlement, which first introduced the audience classification requirement.
Where: The FAQ was published in YouTube's Help Community forum and is tied to settings within YouTube Studio, accessible via desktop and mobile applications for Android and iOS.
Why: The audience classification requirement exists because YouTube operates under a 2019 settlement with the FTC and the New York Attorney General requiring compliance with COPPA. Under COPPA and applicable international laws, platforms must limit data collection from children's content, which means disabling personalized advertising and certain interactive features. Misclassification - intentional or accidental - exposes creators and platforms to FTC civil penalties that can reach $53,088 per violation, with total fines in major cases reaching millions of dollars.