The gap between websites gaining Google traffic and those losing it has widened sharply this year, and a new analysis published on April 9, 2026, puts concrete numbers to what separates them. Cyrus Shepard, founder of Zyppy SEO and a long-standing figure in the search optimization community, analyzed more than 400 winning and losing websites and identified five features that, taken together, predict Google traffic outcomes with striking consistency.
The study draws on many of the same sites covered by SEO analyst Lily Ray in her research on Google's December core update. Shepard examined traffic trends over a 12-month period, classified sites by business model, content types, and creator profiles, and then measured each feature's correlation with year-over-year traffic change using Spearman correlation coefficients. The results, published in his Substack newsletter Zyppy Signal, point to a clear structural shift in what Google's algorithm rewards.
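The correlation method described above can be sketched in a few lines. This is an illustration with made-up numbers, not Shepard's dataset: each site gets a binary flag for a feature and a year-over-year traffic change, and Spearman's rho (the Pearson correlation of the ranks) measures how strongly the flag tracks the *ordering* of outcomes, ignoring their magnitude.

```python
# Illustrative sketch of the study's correlation approach (hypothetical
# numbers, not Shepard's data). Spearman's rho = Pearson correlation of
# the ranked values, so it captures monotonic association only.
from statistics import mean

def ranks(values):
    """Return 1-based average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation of two equal-length sequences."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return cov / var

# 1 = site offers a product or service, 0 = it does not (illustrative)
has_product = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
# Year-over-year traffic change, where +0.30 means +30% (illustrative)
yoy_change = [0.30, 0.12, -0.40, 0.05, -0.22, -0.10, 0.45, -0.35, 0.18, 0.02]

print(f"Spearman rho = {spearman(has_product, yoy_change):.3f}")
```

With a binary feature, the rho values Shepard reports (0.2 to 0.4) indicate a real but far-from-deterministic association between each feature and traffic direction.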
The five features and their correlations
The five characteristics that most strongly predict whether a site wins or loses Google traffic are, in order: offering a product or service (Spearman correlation of 0.391); allowing task completion (0.381); owning proprietary assets (0.357); maintaining tight topical focus (0.250); and building a strong brand (0.206). None of these measures technical SEO prowess directly. All of them point toward what a website fundamentally does for its users and how difficult it is to replace.
1. Offering a product or service
According to the analysis, 70.2% of winning sites offer a product or service, compared to just 34.6% of losing sites. This single feature carries the highest correlation in the dataset. Sites selling their own products performed especially well, outpacing those that rely on third-party platforms. Service-based offerings - subscriptions and digital goods among them - also outperformed informational or affiliate content.
Among the winning examples Shepard cited were budgetbytes.com, a recipe site that supplements its editorial content with a subscription meal plan service, and mathnasium.com, which offers both online and in-person tutoring. On the losing side sat byrdie.com, which operates primarily as a fashion publisher with no proprietary product, and medicalnewstoday.com, a large informational publisher that provides no service of its own.
Shepard notes this finding aligns with Google's own guidelines, which suggest that content should have "an existing or intended audience" that does not depend on search traffic to exist.
2. Allowing task completion
The second feature measures whether a website allows users to actually complete the task they came to perform. Sites where readers gather information but must go elsewhere to act on it are increasingly at a disadvantage. According to the data, 83.7% of winning sites allow task completion, compared to 50.2% of losing sites.
WalletHub is used in the analysis as a losing-side example. It produces high-quality credit card comparison pages, but the application process for those cards happens entirely off-site. The user's journey is incomplete. In contrast, stockanalysis.com functions as a complete research platform for stock analysis, mathisfun.com offers interactive tools and quizzes so users can actually practice mathematics, and powerball.com lets users check their lottery tickets directly from an authoritative source.
Task completion does not require selling anything. Tools, searchable databases, booking systems, and calculators can all satisfy this criterion without a transaction. The key is whether the site owns the next step of the user journey.
3. Proprietary assets
This feature examines whether a website owns something that other sites cannot easily replicate. According to Shepard's research, 92.9% of winning sites have proprietary assets, while only 57.1% of losing sites do. The gap here is the widest of any of the five features.
Proprietary assets can take several forms: unique products, specialized databases, user-generated content, software, original data, or large collections of reviews. Letterboxd, a fast-rising movie community platform, uses data from its user base to graph film popularity over time. Todaytix maintains a live inventory of theater ticket availability. These are assets that cannot simply be reproduced by a competing publisher writing about the same topics.
Losing examples include lifewire.com, which offers mostly tutorials and explainer-style content with few first-party assets, and thespruce.com, a popular home and lifestyle blog that lacks meaningful proprietary content. Both sites cover their topics competently. The issue, according to this analysis, is that competence alone is no longer sufficient.
4. Tight topical focus
The distinction between "topical focus" and "tight topical focus" turns out to matter a great deal. According to Shepard, when he analyzed broad topical focus initially, he found no meaningful difference between winners and losers. Only when he shifted to measuring tightness of focus did the pattern become clear: 75.9% of winning sites have tight topical focus, compared to 61.3% of losing sites.
minecraft.wiki is used as a winner example - it functions like Wikipedia but covers only Minecraft, with no dilution across unrelated subject areas. happiestbaby.com is laser-focused on babies. These contrast with businessinsider.com, which covers business alongside entertainment, culture, and parenting, and newsweek.com, which Shepard describes as "a perfect example of a broad publisher covering many verticals."
The correlation here (0.250) is lower than the first three features, but the pattern reinforces a broader point: Google appears to be tightening its definition of expertise, and breadth increasingly works against sites competing for organic visibility.
5. Strong brand
The brand feature was measured using Ahrefs data. Shepard examined each site's top 20 keywords and identified branded navigational terms - terms people use when looking specifically for that destination. Sites were scored as "strong brands" based on both the volume of branded navigational searches and the percentage of overall traffic those searches represented. A site could be widely recognized and still fail this test if almost none of its traffic arrived via branded queries.
According to the data, 32.6% of winning sites qualify as strong brands, compared to 16.1% of losing sites. This is the lowest raw penetration of any feature in the study, and the lowest correlation (0.206). But the direction is consistent with the others. zoom.com and skims.com appear as winning examples - both are destinations that users seek out directly rather than arriving through generic keyword searches. lifewire.com and techtarget.com are listed as losing examples despite being widely recognized, because most of their traffic arrives through longer-tail informational queries rather than brand-driven navigation.
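The two-part brand test described above can be sketched as a simple rule. The thresholds below are assumptions for illustration only; Shepard's exact cutoffs are not published. The point the sketch captures is the conjunction: branded navigational volume must be large in absolute terms *and* a meaningful share of total traffic.

```python
# Hedged sketch of the "strong brand" classification described in the
# article. The min_volume and min_share thresholds are hypothetical
# values chosen for illustration, not Shepard's actual cutoffs.

def is_strong_brand(branded_volume: int, total_volume: int,
                    min_volume: int = 10_000, min_share: float = 0.20) -> bool:
    """True only if branded navigational searches are large in absolute
    volume AND a meaningful share of the site's overall search traffic."""
    if total_volume <= 0:
        return False
    share = branded_volume / total_volume
    return branded_volume >= min_volume and share >= min_share

# A destination brand: large branded volume and a large share.
print(is_strong_brand(branded_volume=50_000, total_volume=120_000))  # True

# A widely recognized site can still fail: plenty of total traffic,
# but almost none of it brand-driven.
print(is_strong_brand(branded_volume=8_000, total_volume=900_000))   # False
```

This is why recognizable publishers like lifewire.com can land on the losing side of the brand test: recognition alone does not make users navigate to the site by name.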
The additive effect
Perhaps the most practically significant finding in the study is that these features are additive. A site displaying only one feature had a win rate of 15.4%, barely above the 13.5% win rate for sites with no qualifying features at all. Sites with two features achieved a 22.0% win rate. At three features, the win rate rose to 30.7%. At four features, it jumped to 68.1%, and at five features, it reached 69.7%.
The jump from three to four features is the largest single step in the data - more than doubling the win rate. This suggests that crossing a threshold of four concurrent features moves a site from the zone where losing is more likely than winning into the zone where winning is more likely. No individual feature is sufficient on its own.
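The threshold effect is easiest to see by computing the step between each feature count, using the win rates reported in the study:

```python
# Win rates by number of concurrent features, as reported in the study.
win_rate = {0: 0.135, 1: 0.154, 2: 0.220, 3: 0.307, 4: 0.681, 5: 0.697}

# Print the incremental lift from adding each additional feature.
for n in range(1, 6):
    step = win_rate[n] - win_rate[n - 1]
    print(f"{n - 1} -> {n} features: +{step:.1%}")
```

Every step before the fourth feature adds single-digit percentage points; the third-to-fourth step adds roughly 37 points, after which the fifth feature adds almost nothing.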
Features that did not correlate
Shepard also documented several characteristics that he expected to matter but that showed no meaningful correlation with winning or losing. These included demonstrating first-hand experience or personal perspectives, hosting user-generated content or community platforms, and offering unique information.
He was explicit about interpreting these non-findings carefully. According to his analysis, the absence of correlation does not mean Google ignores these qualities; rather, they may already be so thoroughly embedded in the algorithm that their effect is not measurable as a differentiator among the sites in this dataset. A larger study might find different patterns.
Context: what this means amid AI search disruption
The Shepard analysis arrives at a moment when the structural economics of web publishing are under significant pressure from Google's AI features. Google's AI Overviews now correlate with a 58% reduction in click-through rates for top-ranking pages, well above the 34.5% decline Ahrefs documented in April 2025. News publishers have lost roughly half their Google search traffic over two years, with Google Web Search dropping from 51% of referrals in 2023 to just 27% by the fourth quarter of 2025. Meanwhile, small publishers have experienced 60% declines in search traffic as AI reshapes the open web.
That backdrop makes the five features Shepard identified particularly relevant for marketing professionals. If organic traffic is compressing at the category level, the sites that survive are likely to be those that are harder for AI to substitute - sites with proprietary data, transactional functions, tight expertise, and brand destinations that users actively seek out.
Google's head of search Liz Reid acknowledged in October 2025 that "there are winners and losers" in any ranking update, while attributing part of the traffic shift to changing user preferences, particularly among younger audiences migrating toward short-form video and social platforms. SEO professionals have documented five years of increasing Google volatility, with confirmed update frequency declining from 10 annual announcements in 2021-2022 to approximately 4 in 2025, while perceived volatility has risen. In that environment, the Shepard study provides a rare set of measurable, data-backed signals.
Separately, Lily Ray, senior director of SEO at AMS Digital, warned in December 2025 that excessive optimization now triggers penalties in Google's system - a dynamic that reinforces Shepard's finding that the sites winning are not primarily those with the most aggressive SEO, but those with the most differentiated and functionally useful products and data.
The Zyppy Signal newsletter, through which the study was published, was launched approximately a month before the April 9 publication date, according to the newsletter footer. Shepard described the analysis as covering more than 400 sites, examining traffic trends over a 12-month window and classifying them across multiple dimensions. The full list with updated traffic statistics is available via his Substack.
For marketers and publishers assessing their own positions in organic search, the five features offer a framework grounded in measurable outcomes rather than optimization tactics. The data does not guarantee traffic recovery for any individual site. But across the 400-site dataset, the five-feature profile that Shepard describes - a product or service, task completion, proprietary assets, tight topical focus, and strong brand - is the clearest empirical picture yet of what Google's algorithm is rewarding in 2026.
Timeline
- August 2025: Google Discover becomes dominant traffic source for news publishers, accounting for two-thirds of Google referrals, according to PPC Land coverage
- October 18, 2025: Google's head of search Liz Reid discusses AI transformation and publisher traffic losses in Wall Street Journal interview
- October 28, 2025: Google search ranking volatility detected; Cyrus Shepard raises concerns about traffic impacts from new ad layouts
- November 4, 2025: Seer Interactive publishes study documenting 61% organic CTR decline for AI Overview queries, as covered by PPC Land
- December 15, 2025: Nick Fox, Google's SVP of Knowledge and Information, tells publishers AI search optimization is identical to traditional SEO
- December 23, 2025: NewzDash analysis confirms news publishers lost half their Google search traffic in two years
- December 26, 2025: Lily Ray warns excessive SEO optimization now triggers Google penalties, recommends entity-building and content quality
- January 2, 2026: SEO consultant documents five years of increasing Google update volatility, with confirmed update frequency declining
- February 4, 2026: Ahrefs publishes research showing AI Overviews now correlate with 58% reduction in click-through rates for top-ranking pages
- April 2026: Small publishers documented losing 60% of search traffic as AI reshapes the web
- April 9, 2026: Cyrus Shepard publishes "5 Data-Backed Features of Websites Winning Google in 2026" in Zyppy Signal, based on analysis of 400+ websites
Summary
Who: Cyrus Shepard, founder of Zyppy SEO and author of the Zyppy Signal newsletter, conducted the analysis. The research also draws on Lily Ray's prior work on Google's December core update, and references Ahrefs data for the brand measurement component.
What: An analysis of more than 400 websites - winners and losers in Google search - identifies five features that predict year-over-year traffic outcomes using Spearman correlation coefficients. The five features are: offering a product or service (0.391 correlation), allowing task completion (0.381), owning proprietary assets (0.357), maintaining tight topical focus (0.250), and building a strong brand (0.206). Sites with four or five of these features achieve win rates of 68.1% and 69.7% respectively, versus 13.5% for sites with none.
When: The study was published on April 9, 2026, in the Zyppy Signal newsletter on Substack. It analyzed 12 months of traffic data for the sites in the dataset.
Where: The research was published on Substack via the Zyppy Signal newsletter. The websites analyzed span multiple content categories including recipe sites, tutoring platforms, fashion publishers, informational publishers, gaming wikis, financial data platforms, and e-commerce destinations.
Why: The study was conducted to quantify what separates Google traffic winners from losers at a time when AI features are compressing organic traffic across entire content categories. For the marketing community, the findings provide a data-grounded framework for assessing which website characteristics are most likely to sustain organic visibility as Google continues integrating AI into its search results.