House Republicans yesterday introduced legislation that would establish a single national framework for consumer data privacy in the United States, a move that would override all state-level privacy laws and affect how digital advertisers, data brokers, and programmatic platforms handle personal data.
The bill, titled the "Securing and Establishing Consumer Uniform Rights and Enforcement over Data Act" - or the SECURE Data Act - was introduced by Representative Joyce of Pennsylvania on April 21, 2026, during the 119th Congress, 2nd Session, and was referred to committee the same day. According to the bill text, the legislation creates a national framework for consumer privacy rights and the protection of personal data, with enforcement split between the Federal Trade Commission and state attorneys general.
The timing matters. For years, the advertising and marketing industries have operated across a patchwork of state-level privacy laws - California, Virginia, Colorado, and nearly two dozen others - each with distinct definitions of "sale," consent requirements, and opt-out mechanisms. Companies have already been adapting their ad infrastructure to comply with multiple state regimes, a complexity that the SECURE Data Act would, if enacted, resolve through federal preemption. Whether that simplification is welcomed or resisted depends heavily on which stakeholders are asked.
What the bill actually does
At its core, the SECURE Data Act grants consumers five categories of rights with respect to any entity defined as a "controller" - a term borrowed directly from European data protection vocabulary and defined in the bill as "a person that, alone or jointly with others, determines the purpose and means of processing personal data."
According to the bill text, those rights are: the right to confirm whether a controller is processing personal data and access a copy; the right to correct inaccuracies; the right to delete personal data provided by or obtained about the consumer; the right to obtain a portable copy of previously provided data in a usable format; and the right to opt out of targeted advertising, the sale of personal data, and automated profiling that produces decisions with legal or similarly significant effects.
That last category is worth dwelling on. The bill defines targeted advertising with notable specificity: it means displaying an advertisement selected based on personal data obtained from a consumer's activities "over time and across nonaffiliated websites or online applications." Critically, the definition excludes contextual advertising - ads based on a current search query or a visit to a specific website - as well as ads directed in response to a consumer's own request for information. It also excludes processing personal data "solely for measuring or reporting advertising or content performance, reach, or frequency, including independent measurement." For programmatic advertising professionals, that carve-out matters: audience measurement and reach/frequency reporting are explicitly not treated as targeted advertising under this bill.
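To make the definition and its carve-outs concrete, here is a minimal Python sketch of the classification logic as described above. The function and parameter names are our own invention for illustration - nothing here appears in the bill, and this is a reading aid, not a compliance tool.

```python
# Illustrative sketch (not legal advice): a rough decision helper mirroring the
# bill's targeted-advertising definition and its carve-outs. All names are
# hypothetical and chosen for this example.

def is_targeted_advertising(
    cross_site_data: bool,      # ad selected using data from activity over time across nonaffiliated sites/apps
    contextual_only: bool,      # ad based solely on the current search query or the site being visited
    consumer_requested: bool,   # ad served in response to the consumer's own request for information
    measurement_only: bool,     # processing solely for measuring/reporting performance, reach, or frequency
) -> bool:
    # Any applicable carve-out excludes the activity from the definition entirely.
    if contextual_only or consumer_requested or measurement_only:
        return False
    # Otherwise, ads selected using cross-site behavioural data fall within scope.
    return cross_site_data

# A reach/frequency measurement pipeline falls outside the definition:
assert is_targeted_advertising(cross_site_data=True, contextual_only=False,
                               consumer_requested=False, measurement_only=True) is False
# Cross-site behavioural targeting with no applicable carve-out falls inside it:
assert is_targeted_advertising(cross_site_data=True, contextual_only=False,
                               consumer_requested=False, measurement_only=False) is True
```

Note how the carve-outs operate as exclusions from the definition itself, not as exceptions to an opt-out obligation - which is why measurement-only processing never enters the targeted-advertising bucket in the first place.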
Consent requirements for sensitive data
The bill draws a sharp line around sensitive data, defined as personal data that discloses racial or ethnic origin, religious belief, mental or physical health diagnosis, sexual orientation, or citizenship and immigration status; genetic or biometric data processed for the purpose of uniquely identifying an individual; personal data collected from a child or teen; and precise geolocation data - defined as location identified within a radius of 1,750 feet using GPS or similar mechanisms.
For sensitive data, consent is required before processing. The bill defines consent as "a clear affirmative act that signifies the freely given, specific, informed, and unambiguous agreement by a consumer to process personal data." No pre-checked boxes. No inferred consent from continued use of a service. The standard mirrors the architecture of GDPR consent in several respects, though the bill does not adopt GDPR's full framework.
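The consent standard can be sketched as a simple gate, assuming a hypothetical consent record of our own design; the `ConsentRecord` shape and field names are illustrative assumptions, not anything defined in the bill.

```python
# Hedged sketch: a consent check reflecting the bill's "clear affirmative act"
# standard for sensitive data. The record shape is an assumption for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    affirmative_act: bool    # consumer took a clear affirmative action
    pre_checked: bool        # consent captured via a pre-ticked box
    inferred_from_use: bool  # consent inferred from continued use of the service
    specific_purpose: str    # the disclosed purpose the consent covers

def may_process_sensitive_data(consent: Optional[ConsentRecord], purpose: str) -> bool:
    # No consent on file: sensitive data may not be processed.
    if consent is None:
        return False
    # Pre-checked boxes and inferred consent do not satisfy the standard.
    if consent.pre_checked or consent.inferred_from_use:
        return False
    # Consent must be a freely given affirmative act tied to the specific purpose.
    return consent.affirmative_act and consent.specific_purpose == purpose

# No record, or a pre-checked box, fails the gate:
assert may_process_sensitive_data(None, "ad_personalisation") is False
# A clear, purpose-specific affirmative act passes:
assert may_process_sensitive_data(
    ConsentRecord(affirmative_act=True, pre_checked=False,
                  inferred_from_use=False, specific_purpose="ad_personalisation"),
    "ad_personalisation") is True
```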
Children and teenagers receive distinct treatment. A "child" under the bill means anyone under 13 - consistent with the existing definition in the Children's Online Privacy Protection Act. A "teen" covers ages 13 through 15. For teens, a controller may not process sensitive data without verifiable parental consent. Only a parent may exercise consumer privacy rights on behalf of a child or teen, and the bill explicitly preserves COPPA's existing requirements rather than replacing them.
Controller obligations and data minimisation
Beyond consumer rights, the bill imposes a set of obligations on controllers that will be familiar to anyone who has navigated GDPR compliance. Data minimisation is explicitly required: controllers must limit collection to what is "adequate, relevant, and reasonably necessary" in relation to each disclosed purpose. Secondary uses of data - uses not reasonably necessary or compatible with the disclosed purpose - require separate consumer consent.
Controllers must provide a "reasonably accessible, clear, and meaningful" privacy notice before processing begins. That notice must disclose each category of personal data processed, each purpose for processing, how consumers can exercise their rights, each category of personal data shared with other controllers or governmental entities, and - notably for the ad tech ecosystem - whether any personal data is transferred to, processed in, stored in, or sold to what the bill terms a "covered nation." According to the bill definitions, a covered nation carries the meaning given in title 10, United States Code section 4872(f) - a reference to adversarial foreign states including China and Russia, as outlined by Luis Alberto Montezuma in a LinkedIn post sharing the bill text.
If a controller sells personal data or processes it for targeted advertising, the disclosure obligations become even more specific. The controller must clearly and conspicuously disclose that activity before any collection, and must explain how consumers can opt out. For automated decision-making, controllers relying on profiling to make decisions with legal or similarly significant effects - defined as decisions to deny healthcare services, housing rentals, or employment opportunities - must disclose that the decision will be made using automated means and provide an opt-out mechanism before any such decision is made.
Data brokers and the FTC registry
One of the more operationally concrete elements of the bill concerns data brokers. The SECURE Data Act defines a data broker as a controller that collects and processes personal data about consumers who are not customers, clients, or subscribers of that controller, and that derives 50 percent or more of annual gross revenue from the sale of that personal data.
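The two-part definition reduces to a revenue-composition test, sketched below with parameter names of our own choosing; this is an illustration of the logic as described, not a determination anyone should rely on.

```python
# Illustrative sketch of the bill's data-broker test: non-customer data
# plus a 50% revenue threshold. Names are assumptions for this example.

def is_data_broker(
    collects_nonconsumer_data: bool,   # processes data about people who are not its customers, clients, or subscribers
    revenue_from_data_sales: float,    # annual gross revenue derived from selling that personal data
    total_annual_revenue: float,
) -> bool:
    if total_annual_revenue <= 0 or not collects_nonconsumer_data:
        return False
    # 50 percent or more of annual gross revenue must come from selling that data.
    return revenue_from_data_sales / total_annual_revenue >= 0.50

# A hypothetical identity-resolution vendor earning 60% of revenue from data sales qualifies:
assert is_data_broker(True, 6_000_000, 10_000_000) is True
# The same vendor at 30% does not:
assert is_data_broker(True, 3_000_000, 10_000_000) is False
```

The revenue-share prong is what makes the question fact-specific for ad tech: the same data practices can fall inside or outside the definition depending on how the company monetises them.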
Within 12 months of enactment, data brokers must register with the FTC and pay a registration fee. That registration must include the legal name of the data broker, contact and address information, a description of each category of personal data sold, a statement on whether the data broker implements a purchaser credentialing process, and details of any unauthorised access incidents reported to a federal or state authority during the prior year. Within 18 months of enactment, the FTC must establish a publicly searchable central registry of registered data brokers. Consumers could use that registry to learn how to exercise privacy rights against specific brokers.
The practical implication for the advertising industry is substantial. Many companies operating in the programmatic supply chain - matching platforms, audience extension networks, identity resolution vendors - potentially meet the bill's definition of a data broker depending on their revenue composition and data sourcing. The FTC's existing guidance on data practices has already signalled an expansive view of what constitutes privacy-implicating conduct; the SECURE Data Act would give that approach legislative grounding.
Processor obligations
The bill distinguishes between controllers and processors - entities that process data on behalf of a controller. Processors must adhere to controller instructions and assist in meeting the Act's requirements. The contract between a controller and processor must govern data processing procedures and clearly set out instructions, the nature and purpose of processing, the type of personal data involved, the duration of processing, and the rights and obligations of both parties.
At a minimum, processor contracts must require that each person handling personal data is bound by a duty of confidentiality, that data is deleted or returned to the controller at the end of service provision unless retention is required by law, and that the processor cooperates with compliance assessments. If a processor engages a subcontractor, those same obligations must flow down through the subcontract. The rule of construction is explicit: nothing in the processor section relieves a controller or processor from liability for a processing role.
Preemption and the state law question
Section 15 of the bill states plainly that no state may "prescribe, maintain, or enforce any law, rule, regulation, requirement, standard, or other provision having the force and effect of law" relating to the provisions of the Act. This is total preemption - not a floor that states may build on, but a ceiling that replaces state regimes entirely.
That provision is almost certainly the most contested element of the bill. California's privacy enforcement has been among the most aggressive in the US, with the state securing a $1.55 million settlement with Healthline Media in July 2025 and a $1.4 million settlement with Jam City in November 2025. Total federal preemption would extinguish those state-level enforcement regimes and shift all authority to the FTC and state attorneys general acting under federal law.
Applicability thresholds and exemptions
The bill does not apply universally. According to the bill text, the Act applies to entities subject to the FTC Act or to common carriers subject to the Communications Act of 1934, and only where one of two thresholds is met: the entity collects and processes personal data of more than 200,000 consumers annually and has annual gross revenue of $25 million or more; or the entity collects and processes personal data of 100,000 or more consumers annually and derives 25 percent or more of annual gross revenue from the sale of that personal data.
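The two alternative thresholds can be expressed as a short sketch; again, the function and parameter names are our own illustrative assumptions, and real coverage analysis would turn on the bill's full definitions.

```python
# Hedged sketch of the bill's two alternative applicability prongs,
# with hypothetical parameter names.

def act_applies(
    consumers_processed: int,       # consumers whose personal data is collected and processed annually
    gross_revenue: float,           # annual gross revenue in dollars
    revenue_from_data_sales: float, # portion of that revenue derived from selling personal data
) -> bool:
    # Prong 1: more than 200,000 consumers AND $25M+ annual gross revenue.
    prong_one = consumers_processed > 200_000 and gross_revenue >= 25_000_000
    # Prong 2: 100,000+ consumers AND 25%+ of revenue from the sale of personal data.
    prong_two = (consumers_processed >= 100_000
                 and gross_revenue > 0
                 and revenue_from_data_sales / gross_revenue >= 0.25)
    return prong_one or prong_two

# A mid-size platform: 150,000 consumers, $8M revenue, $3M from data sales - covered via prong 2:
assert act_applies(150_000, 8_000_000, 3_000_000) is True
# A small publisher: 50,000 consumers, $4M revenue, no data sales - not covered:
assert act_applies(50_000, 4_000_000, 0) is False
```

The second prong matters for smaller data-centric businesses: a company well below the $25 million revenue line can still be covered if a quarter of its revenue comes from selling personal data.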
Exemptions are numerous. Governmental entities are excluded. So are financial institutions subject to the Gramm-Leach-Bliley Act, HIPAA-covered entities, nonprofit organisations, institutions of higher education, and several other specified categories. Personal data in employment contexts is also excluded - a distinction that narrows the bill's scope considerably for HR technology and workforce analytics vendors.
Health data receives layered treatment. The bill excludes health records, HIPAA-protected data, substance use disorder information protected under the Public Health Service Act, and a detailed list of research-related data categories. Separately, consumer reporting agency data under the Fair Credit Reporting Act is excluded, as is data regulated under the Family Educational Rights and Privacy Act.
Enforcement mechanics
Violations of the SECURE Data Act are treated as violations of a regulation under section 18(a)(1)(B) of the FTC Act - specifically, the provision covering unfair or deceptive acts or practices. The FTC has full enforcement jurisdiction, including over common carriers that would normally fall outside FTC jurisdiction under the Communications Act.
State attorneys general may bring civil actions on behalf of state residents to enjoin violations, enforce compliance, or seek damages. Before filing, they must provide written notice to the FTC. If the FTC or the US Attorney General has already instituted a civil action against a defendant, no state attorney general may bring a parallel action against that same defendant during the pendency of the federal case.
A right to cure is built in. Before any enforcement action, the FTC or a state attorney general must provide written notice identifying the specific provision alleged to have been violated. The controller or processor then has 45 days to cure. If the violation is cured and a written statement provided that no further violation will occur, there is no violation under the Act - unless the same entity subsequently violates again.
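The cure mechanics amount to a date calculation plus two conditions, sketched below; the function names and the boolean modelling of the written statement are illustrative assumptions, not the bill's language.

```python
# Sketch of the 45-day cure window described above. Names are illustrative.
from datetime import date, timedelta

CURE_PERIOD = timedelta(days=45)

def cure_deadline(notice_date: date) -> date:
    # The controller or processor has 45 days from written notice to cure.
    return notice_date + CURE_PERIOD

def enforcement_barred(cured_on: date, notice_date: date,
                       written_assurance: bool, prior_cured_violation: bool) -> bool:
    # A timely cure plus a written statement that no further violation will
    # occur bars the action - unless the entity has violated again after a
    # previous cure.
    return (cured_on <= cure_deadline(notice_date)
            and written_assurance
            and not prior_cured_violation)

# Notice on 1 May 2026 gives a deadline of 15 June 2026:
assert cure_deadline(date(2026, 5, 1)) == date(2026, 6, 15)
# A cure on 10 June with written assurance, and no prior cured violation, bars the action:
assert enforcement_barred(date(2026, 6, 10), date(2026, 5, 1), True, False) is True
```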
Effective dates and implementation timeline
The general effective date is two years after enactment - giving companies substantial time to prepare systems, contracts, and notices. However, three sections take effect earlier: the consumer privacy rights provisions in Section 2, the data security requirements in Section 4, and the data broker provisions in Section 5 all take effect one year after enactment.
That accelerated timeline for data security requirements is notable. According to the bill, controllers must "establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data." A rebuttable presumption of compliance is available for controllers that adhere to an approved industry code of conduct or that demonstrate state-of-the-art security practices through third-party attestation.
What it means for digital advertising
The SECURE Data Act represents a significant potential simplification for digital advertisers and publishers. The current patchwork of state privacy laws has required major ad platforms to build increasingly complex consent management and restricted data processing infrastructure on a state-by-state basis, each with distinct opt-out signal requirements, different definitions of data sale, and varying enforcement regimes.
A single federal standard with uniform definitions - particularly the specific carve-outs for contextual advertising and independent measurement - would reduce compliance overhead. But it would also extend opt-out rights that, under current state law, only a subset of Americans enjoy. That national baseline includes the right to opt out of cross-site targeted advertising for the entire US population, not merely residents of states that have enacted comprehensive privacy legislation.
For companies currently operating across the state patchwork, the critical question is how the federal standard compares to their most stringent existing obligations. Where CCPA, CPRA, or Virginia's CDPA impose stricter requirements than the SECURE Data Act, federal preemption could reduce compliance burdens. Where the federal standard is more demanding - particularly for smaller companies currently below state thresholds - it could create new obligations.
The bill's treatment of automated decision-making is also worth noting for ad tech. The definition of profiling - "any form of processing that is solely automated and performed on personal data to evaluate, analyze, or predict personal aspects of the economic situation, health, personal preference, interest, reliability, behavior, location, or movement" - is broad. But the opt-out right only applies when such profiling drives decisions with a legal or similarly significant effect - denying healthcare, housing, or employment. Pure advertising profiling, which does not determine access to those services, does not appear to trigger the opt-out right on its own.
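That trigger condition can be summarised in a few lines; the category labels below are our own shorthand for the bill's healthcare, housing, and employment decision types, and the sketch reflects our reading rather than any settled interpretation.

```python
# Illustrative sketch: the profiling opt-out appears to attach only when
# solely automated processing drives a decision with legal or similarly
# significant effect. Category names are our own shorthand.
from typing import Optional

SIGNIFICANT_DECISIONS = {"healthcare_denial", "housing_denial", "employment_denial"}

def optout_right_triggered(solely_automated: bool,
                           decision_type: Optional[str]) -> bool:
    # Ad-selection profiling with no significant decision attached does not,
    # on its own, appear to trigger the opt-out right.
    return solely_automated and decision_type in SIGNIFICANT_DECISIONS

assert optout_right_triggered(True, "housing_denial") is True
assert optout_right_triggered(True, None) is False               # pure ad profiling
assert optout_right_triggered(False, "housing_denial") is False  # human in the loop
```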
The bill's introduction comes as the ad industry continues to adapt to privacy-driven structural changes. The FTC's scrutiny of data practices has been intensifying across multiple fronts, including guidance on hashed data, data clean rooms, and children's advertising. The SECURE Data Act would establish a statutory framework around many of those enforcement positions.
Timeline
- December 2023 - FTC proposes comprehensive updates to the COPPA Rule, signalling the direction of federal children's privacy policy
- March 2024 - IAB raises concerns about FTC COPPA proposals, foreshadowing industry response to federal privacy legislation
- July 2024 - FTC warns that hashed data is not anonymous, expanding regulatory pressure on ad tech data practices
- November 2024 - FTC warns that data clean rooms are not a privacy silver bullet, reinforcing that technology does not override legal obligations
- April 22, 2025 - FTC publishes final COPPA rule amendments in the Federal Register; rule takes effect June 23, 2025
- June 30, 2025 - Google expands privacy controls to eight additional US states for Universal Opt-Out Mechanism compliance
- November 2025 - Google adds Restricted Data Processing mode for Indiana, Kentucky, and Rhode Island effective January 1, 2026
- December 2025 - California privacy law updates take effect January 1, 2026, including new browser opt-out signal requirements from 2027
- March 2026 - FTC issues COPPA age-verification enforcement policy statement, granting conditional compliance shield for age-verification tools
- April 21, 2026 - Representative Joyce of Pennsylvania introduces the SECURE Data Act in the House of Representatives; bill referred to committee
Summary
Who: Representative Joyce of Pennsylvania, on behalf of House Republicans, introduced the SECURE Data Act during the 119th Congress, 2nd Session. The bill affects any controller or processor operating in the United States that meets the applicable revenue and data volume thresholds.
What: The SECURE Data Act proposes a national consumer data privacy framework granting Americans rights to access, correct, delete, and port their personal data, and to opt out of targeted advertising and data sales. It imposes data minimisation, consent, security, and transparency obligations on controllers, creates a registration and public registry system for data brokers administered by the FTC, and federally preempts all state privacy laws.
When: The bill was introduced and referred to committee on April 21, 2026. If enacted, most provisions would take effect two years after enactment; consumer rights, data security, and data broker requirements would take effect one year after enactment.
Where: The legislation was introduced in the US House of Representatives and, if enacted, would apply to covered entities conducting business in the United States or processing personal data of US residents.
Why: According to the bill's stated goals, the legislation seeks to replace a fragmented state-by-state privacy landscape with a single federal standard, establish enforceable rights for consumers across all US states, impose direct obligations on data brokers, and position the Secretary of Commerce as the principal US advisor on cross-border data flows and the protection of personal data in international commerce.