In 2017, I published an article titled How to rock SEO in a machine learning world, in which I describe the concept of "holistic content," among others. The point I made evolved into my view of "topics," as I describe in A better approach to keyword research for content marketing and Semantic content optimization with entities: we shouldn't care as much about keywords and should instead focus on topics. Yet here we are, years later, still tracking keywords.
Topics vs. User Intent
An easy mistake is thinking the job is done when a page ranks in the top 3 positions for an important keyword. In reality, that's when the work begins! Pages can rank for many keywords not because they cover a certain topic "holistically" (think: fully) but because they address many user intents, or many variations of a query that express the same intent.
Looking back, I realize what I actually meant by "topics" was User Intent. Pages don't rank for thousands of keywords because they cover a topic in its entirety but because they solve many searcher problems.
A good example is the top 10 results for the query "dog breeds":
https://dogtime.com/dog-breeds/profiles ranks in position #1, https://www.petfinder.com/dog-breeds/ in #3. Dogtime ranks for 2,100 keywords, Petfinder for 6,400! Dogtime's URL gets more traffic (766,000 monthly visits, according to SEMrush), but Petfinder has the higher potential.
What sticks out is how many different keywords these two URLs rank for.
You certainly wouldn't factor in that much traffic potential when doing keyword research the old way, the "one keyword per URL" way. The query "dog breeds" has a search volume of 673,000 in the US. Assume a 30% click potential for position one (which you won't get, because Google shows a carousel at the top of the SERPs), and you come out at roughly 200,000 monthly organic sessions.
But the traffic potential is much higher, as the two examples show. The actual potential is the sum of all keyword variations that meet the user intents a URL can satisfy. SEMrush lists the total volume of all keyword variations: 3.3M. Take 30% of that volume, and you come out at 990,000 monthly organic sessions - pretty close to what Dogtime's URL seems to get.
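The arithmetic behind these estimates is simple enough to sketch. The 30% click potential and the search volumes come from the text above; real click-through rates vary a lot depending on SERP features, so treat this as a back-of-the-envelope model, not a forecast:

```python
def traffic_potential(search_volume: int, ctr: float = 0.30) -> int:
    """Estimate monthly organic sessions at a given click-through rate."""
    return round(search_volume * ctr)

# Old way: only the head keyword "dog breeds" (673K US volume)
print(traffic_potential(673_000))    # roughly 200K sessions

# Intent view: total volume of all keyword variations (3.3M, per SEMrush)
print(traffic_potential(3_300_000))  # 990K sessions
```

The gap between the two numbers is the traffic potential that single-keyword research leaves invisible.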
That's what I observed years ago, and I thought "topics" was the right way to think about it. A URL can rank for a topic that covers many queries. But it was User Intent all along.
As a second example, consider the keyword rankings in the screenshot below. Each keyword variation is a different way to express the intent. Many variations, one problem. That's how humans search.
One single article ranks for the keywords in the screenshot above and drives over 10,000 organic clicks per month. The search volume for the main keyword, "foods people hate," is 1,600 (US). Yet, Google understands that all queries relate to the same intent:
- common foods people don't like
- unpopular foods
- disliked foods
- hated foods
- top hated foods
Mind you, a URL doesn't (always) need text to rank for thousands of keywords. Neither of the two dog breed pages I showed above has a single line of text. They're merely lists of dog breeds, but Google understands that this is exactly what users want.
The classic way of doing keyword research doesn't factor in all these variations and, as a result, leaves a lot of traffic potential on the table. Of course, we can try to research every variation before creating the content, but that quickly leads to huge spreadsheets and chaos. At the same time, it's hard to make sure your content is optimized for all relevant variations, even with great tools like Surfer, Frase, or Clearscope.
There is an easier process I call "content tuning." The idea is simple: create, optimize, publish, fine-tune. Identify top keywords to go after and create "optimized" content for them. But instead of "set it and forget it," give Google some time to understand how the content fits, then optimize it for related variations.
It doesn't have to be complicated. It's enough to:
- Open Google Search Console
- Filter for the last 28 days
- Select a page
- Look at queries on positions 3-20
- Expand content on that page for these queries
You can scale this process with the GSC API or use SEMrush's new intent feature.
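The manual steps above can be scripted against the GSC API. Here's a minimal sketch: the site URL, page path, and helper name are placeholders, the row format follows the Search Analytics API's query response, and authentication via google-api-python-client is assumed (the API call itself is commented out so the sketch stays self-contained):

```python
def tuning_candidates(rows, min_pos=3.0, max_pos=20.0):
    """Filter Search Analytics rows to queries in the tuning band,
    most impressions first - these are the expansion opportunities."""
    return sorted(
        (r for r in rows if min_pos <= r["position"] <= max_pos),
        key=lambda r: r["impressions"],
        reverse=True,
    )

# Fetching the rows would look roughly like this (requires OAuth creds):
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# response = service.searchanalytics().query(
#     siteUrl="https://example.com/",
#     body={
#         "startDate": "2024-01-01",  # last 28 days in practice
#         "endDate": "2024-01-28",
#         "dimensions": ["query"],
#         "dimensionFilterGroups": [{"filters": [
#             {"dimension": "page",
#              "expression": "https://example.com/some-page/"}
#         ]}],
#         "rowLimit": 250,
#     },
# ).execute()
# rows = [{"query": r["keys"][0], **r} for r in response.get("rows", [])]

# Illustrative sample rows in the API's response shape:
sample = [
    {"query": "foods people hate", "position": 1.8, "impressions": 5000, "clicks": 1500},
    {"query": "unpopular foods", "position": 12.4, "impressions": 900, "clicks": 40},
    {"query": "disliked foods", "position": 18.2, "impressions": 300, "clicks": 5},
]
for row in tuning_candidates(sample):
    print(row["query"], row["position"])
```

Queries already in the top spots fall out of the filter; what remains is the list of variations worth a new section, definition, or title tweak.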
When tuning content, compare your content with the top results. Reverse engineer how they satisfy the user intent for the variation. Often, it's just a definition or explanation, but it makes a big difference.
I prefer positions 3-20, or even just 11-20, because when Google ranks you on page 2 or near the bottom of page 1, you have a legitimate chance to qualify for the top spots. Sometimes all it takes is a better title, an extra section in the content, or a stronger link profile.
Look at the query "power law vs exponential" in the screenshot above. Google ranks my essay at an average position of 18.6, but I don't explain the difference between power laws and exponential functions in it. I provide a lot of context around the topic but no specific answer to this exact question. A small tweak should bump the article up to page 1.
Queries are just an abstraction of user intent. Users have different ways to express the same problem they're trying to solve. As a result, pages can rank for thousands of related keywords and variations. By focusing too much on a single keyword per page, we risk missing big traffic opportunities. Keyword variations are bound not by topic but by intent.
Update: I wrote a manual for content tweaking with the GSC API