France's competition authority on February 18, 2026, published a landmark opinion exposing deep structural imbalances between the country's more than 150,000 professional video content creators and the handful of platforms that dominate their economic lives. The opinion, designated 26-A-02, follows an ex officio investigation opened by the Autorité de la concurrence on May 13, 2024, and represents the most comprehensive regulatory examination of the online video content creation sector in France to date.

The findings land at a moment when the creator economy is expanding rapidly across Europe. According to the Autorité's published press release, the sector "has grown rapidly over the past 15 years and is now an integral part of the French audiovisual industry." That growth has not, however, translated into negotiating strength for the creators who power it. Eight out of ten content creators who participated in the authority's survey described their bargaining power with platforms as "weak or very weak." Even creators with large, established audiences told the authority's Board that the imbalance runs strongly in favour of platforms.

The four platforms at the centre

The opinion focuses on four major platforms - YouTube, TikTok, Instagram and, to a lesser extent, Twitch. These are the key intermediaries through which creators in France reach audiences, access monetisation mechanisms and build their professional identities. The Autorité is explicit that the sector is concentrated around these players and that this concentration, combined with significant barriers to entry for new competitors, gives each platform substantial market power over the creators who depend on it.

That dependence is structural, not incidental. According to the opinion, platforms are essential to creators' activity, and a small number of them often account for a very large share of any individual creator's revenue. The reverse is not true. A given content creator represents only a marginal share of a platform's total revenue - particularly when that creator has a moderate-sized audience. This asymmetry shapes every commercial interaction between the two parties.

One might expect creators to respond by migrating to alternative platforms when conditions become unfavourable. The Autorité examined this possibility carefully and concluded that switching is far harder in practice than in theory. According to the opinion, "substitutability between platforms presents significant constraints, due in particular to their formats, preferred themes, specific cultures and expertise and, lastly, the imperfect overlap of their audiences." A creator who has built a long-form educational following on YouTube cannot simply transplant that community to TikTok, which is optimised for short vertical video and carries a different cultural register entirely.

These constraints, the Autorité argues, mean that the main platforms do not belong to the same product and service market. From a creator's perspective, they are more complementary than competing. The practical result is that many creators practise multihoming - distributing content across several platforms simultaneously to diversify both audience and revenue. But multihoming is a response to fragility, not evidence of bargaining strength.

Revenue sharing: platforms decide everything

The most technically consequential section of the opinion concerns how advertising revenue is shared. The Autorité identifies a fundamental legal vacuum: unlike related rights in the publishing sector, French law does not require platforms to propose any mechanism for sharing advertising revenue with creators. The decision to offer such a mechanism, and the terms on which it operates, rest entirely with each platform. Instagram, the opinion notes specifically, has not introduced an advertising revenue-sharing mechanism at all.

For platforms that do operate revenue-sharing programmes - YouTube, TikTok and Twitch among them - access is typically subject to minimum thresholds: a certain number of subscribers, a minimum volume of views, or a required frequency of uploads. According to the Autorité, these thresholds "de facto exclude a portion of video content creators in France, despite such creators also generating revenue for the platforms." A small creator whose videos attract advertising impressions and contribute to platform revenue may nonetheless receive nothing from that revenue if they fall below the access bar.

Those who do qualify face a further constraint. According to the opinion, "content creators have very limited, or even no, individual bargaining power over the monetisation of their content, preventing them from obtaining individual terms." Platforms set the rules universally, and the rules are not open to individual negotiation. This dynamic has parallels across the creator economy, where platforms from X to Spotify have repeatedly restructured payout formulas with no creator input, sometimes shifting the basis of calculation entirely without advance notice.

Compounding the problem is a lack of transparency in how the rules are actually applied. Platforms have sole control over the implementation of their revenue-sharing mechanisms, and creators have minimal access to the underlying data - including how views are counted and how the advertising revenue base is defined. According to the Autorité, "many content creators complain of a lack of visibility regarding future revenues." A creator who cannot understand how their payments are calculated cannot plan their business, negotiate sponsorships intelligently, or assess whether the platform's stated terms are actually being honoured.

YouTube's algorithm transparency has been a recurring concern, with creators in multiple markets documenting significant unexplained drops in views and earnings following undisclosed platform changes. The French authority's findings give formal regulatory weight to those complaints.

Algorithms and visibility: the hidden lever

Content visibility is, in many ways, more consequential than revenue-sharing rules because it determines whether creators can build the audience that makes monetisation possible in the first place. Visibility on platforms is primarily determined by recommendation algorithms and, to a lesser extent, by moderation measures - both of which are entirely under platform control. According to the Autorité, "creators have no real control over how their content is distributed or promoted."

The authority expresses particular concern about the competitive risks embedded in this arrangement. Platforms could, in principle, use their algorithmic control to maximise their own profitability in ways that harm creators. The opinion identifies two specific scenarios that it considers potentially problematic. First, a platform could reduce the prominence of content featuring a commercial partnership between a creator and an advertiser - pushing the advertiser toward buying direct advertising inventory instead of sponsoring creator content. Second, a platform could systematically promote content that is especially lucrative for the platform itself, including - in the longer term - content generated entirely in-house by generative AI.

Both scenarios would raise serious concerns under Article 102 of the Treaty on the Functioning of the European Union and Article L. 420-2 of the French Commercial Code. The Autorité stops short of finding that these practices are currently occurring, but it names them explicitly as conduct that would attract legal scrutiny. For advertisers and marketing professionals, the first scenario is particularly relevant: if platforms algorithmically suppress branded content to redirect advertiser spending toward paid placements, the value of creator partnerships in media planning becomes structurally distorted. French digital marketing generated €14.4 billion in revenue in 2024, with major platforms controlling approximately 70% of digital advertising distribution - a concentration that makes algorithmic fairness a significant commercial question.

The Autorité also draws attention to the practical difficulty creators face when something goes wrong. The authority calls on platforms to mobilise sufficient human and material resources so that creators, "whatever their profile, can reach platform representatives capable of explaining, for instance, a drop in content visibility, a moderation action sanctioning their content or a ban." This is a pointed observation. YouTube has faced repeated criticism for failing to communicate clearly when creators experience unexplained drops in viewership, and the French opinion effectively validates those concerns at a regulatory level.

Generative AI: a new competitive variable

The opinion also examines the emergence of generative AI as a factor in the sector's competitive dynamics. As AI tools enable the production of video content at scale and at low cost, the Autorité identifies a risk that the use of generative AI could become a parameter of competition between different types of content without adequate disclosure to audiences. According to the opinion, "operators of generative AI systems and online platforms must ensure that content created by generative AI can be clearly identified." This recommendation extends beyond labelling for audiences - it carries implications for the competitive integrity of recommendation systems, given that platforms might have financial incentives to promote cheaper, algorithmically generated material at the expense of human creator output.

The concern is not purely hypothetical. YouTube secretly modified content without creator consent in an undisclosed experiment on Shorts in 2025, using machine learning to alter visual elements of creator videos. The incident illustrated precisely the kind of opacity that the Autorité's recommendations are designed to address. Thomas Höppner, a competition lawyer and partner at Geradin Partners, noted in a LinkedIn post published on the day of the opinion that these concerns "are not limited to dominant video-sharing platforms. They apply equally to any dominant platform dealing with dependent business users or content providers."

Höppner also raised a question with direct relevance to the broader digital advertising market: "What might this imply, for example, for the favourable positioning of AI Overviews in Google Search?" The query points toward a systemic issue that the Autorité's opinion touches on but does not resolve - namely, whether the same logic that makes algorithmic self-preferencing in video problematic applies equally to search platforms that promote their own AI-generated responses ahead of publisher content.

The seven recommendations

The Autorité issued seven formal recommendations. In summary, the authority:

  1. encourages creators to assert their legal rights, and urges organisations such as the French Union of Influencers and Content Creators (UMICC) to continue training creators and providing resources, including contract templates;
  2. calls for AI-generated content to be clearly identifiable;
  3. calls on platforms to ensure that revenue-sharing terms, and the way they are implemented, are fair and transparent;
  4. recommends greater transparency in recommendation algorithms, including when those algorithms are updated;
  5. calls for transparency in content moderation;
  6. urges platforms to provide adequate human support so creators can obtain explanations for changes in their visibility or moderation actions; and
  7. warns that algorithmic strategies promoting content that is lucrative for platforms - including future AI-generated content - "could disrupt competition between content and be highly detrimental to the diversity of supply for consumers".

The opinion carries significant weight for the advertising industry because it directly addresses the commercial relationship between creators and platforms in the context of sponsored content and paid advertising inventory. If platforms algorithmically demote branded creator content to redirect advertiser spending toward platform-owned ad formats, that is a competition concern - not merely a creator welfare issue.

The Autorité's methodology was extensive. The Board heard representatives from Instagram, OpenAI, TikTok, Twitch and YouTube, as well as from UMICC and a range of prominent French creators including Dr Nozman, EnjoyPhoenix, Gaspard G, HugoDécrypte, Inoxtag, Maghla, McFly & Carlito, Squeezie and ZeratoR. The authority also conducted a survey of content creators and launched a public consultation of industry stakeholders, supplemented by questionnaires and hearings. Its findings on bargaining power - with 80 percent of surveyed creators reporting a weak or very weak negotiating position - emerge from this broad empirical base, not from anecdote.

The opinion sits within a broader pattern of French and European regulatory action targeting platform power. France's competition authority fined Google €150 million in December 2019 for adopting opaque and difficult-to-understand rules for the operation of Google Ads, finding that the company was not consistent in applying its policies. In 2024, the same authority fined Google a further €250 million for failing to comply with commitments made in a dispute over licensing agreements with French publishers. The creator economy opinion follows logically from this sequence: the authority is applying similar transparency and fairness principles to a sector where the same structural asymmetries - opaque rules, unilateral terms, no individual negotiation - have taken hold.

At the European level, the Digital Markets Act imposes self-preferencing prohibitions on designated gatekeepers, including Alphabet, ByteDance and Meta. The Autorité's opinion complements this framework by addressing, through national competition law, the dynamics that the DMA does not fully cover - particularly the position of content creators who are not themselves businesses seeking platform access, but individuals whose livelihoods depend on platform decisions.

For marketing professionals, the practical implications are immediate. Advertisers who invest in creator partnerships need to understand that the algorithmic environment in which that content competes is neither neutral nor transparent. Media experts have flagged influencer content as a brand safety concern, with 27 percent identifying it among their top digital media challenges - yet the Autorité's findings suggest the risks run in the other direction as well. Sponsored content that can be algorithmically suppressed poses a risk not just to brand safety but to the measured performance of influencer campaigns. If platforms can quietly reduce the reach of content featuring brand partnerships, the return on investment calculations that justify influencer marketing spending become unreliable.

Timeline

  • May 13, 2024 - Autorité de la concurrence opens ex officio inquiry into the competitive functioning of France's online video content creation sector
  • 2024 - More than 150,000 professional content creators active in France, according to the Autorité's findings
  • November 13, 2025 - BVDW publishes Germany's first influencer marketing landscape, noting the global creator economy is projected to grow from $191 billion in 2025 to $528 billion by 2030
  • December 4, 2025 - French digital marketing sector reports €14.4 billion in revenue and 310,000 jobs, with major platforms controlling approximately 70% of digital advertising distribution
  • January 17, 2026 - X doubles creator revenue pool and restructures payout mechanics, shifting calculations to Verified Home Timeline impressions only
  • February 18, 2026 - Autorité de la concurrence publishes opinion 26-A-02 on the online video content creation sector, finding 80% of creators have weak or very weak bargaining power and issuing seven recommendations on transparency and algorithm fairness

Summary

Who: The Autorité de la concurrence, France's national competition authority, conducted the investigation. The subjects are France's more than 150,000 professional video content creators and the platforms - primarily YouTube, TikTok, Instagram and Twitch - on which they depend. The authority's Board heard representatives from Instagram, OpenAI, TikTok, Twitch and YouTube, as well as from UMICC and prominent French creators including Squeezie, Inoxtag, HugoDécrypte and others.

What: The authority published opinion 26-A-02 finding that content creators in France are structurally dependent on a small number of platforms, that 80 percent have weak or very weak bargaining power, that platforms unilaterally set revenue-sharing terms and content visibility rules with minimal transparency, and that algorithmic control could be used in ways that raise serious competition concerns. The authority issued seven recommendations covering creator rights, AI content identification, revenue-sharing transparency, algorithm transparency, moderation transparency, creator support access, and competitive fairness in content promotion.

When: The inquiry was opened on May 13, 2024. The opinion was published on February 18, 2026.

Where: The opinion covers the online video content creation sector in France. The hearing process took place before the Board of the Autorité in Paris. The platforms implicated - YouTube, TikTok, Instagram and Twitch - are global companies with commercial policies made outside France.

Why: The Autorité identified competition challenges in a sector where a small number of platforms hold significant market power over a large and economically dependent workforce of creators. The investigation addressed concerns about fairness in revenue sharing, transparency of algorithms and moderation, and the competitive risks posed by platforms that could use algorithmic tools to redirect advertiser spending or promote their own content at the expense of independent creators. The findings carry implications for the broader digital advertising market at a time when creator spending reached $37 billion globally in 2025.
