Internal links are the most powerful SEO lever you fully control, second only to content. That’s why I created a guide that explains every aspect of internal linking: best practices, strategies, and axioms. When you’re done reading, you’ll be able to optimize your internal linking structure and drive more organic traffic as a result.
This guide is split into two parts: a beginner part that explains internal linking basics [jump] and an advanced internal linking guide [jump]. The advanced part covers axioms, strategies, and best practices.
You’re probably wondering what axioms are and what they’re doing in an internal linking guide. Well, they’re the most important part of this guide. Axioms are commonly held and established beliefs. SEO changes constantly and fast. If we want to become better SEOs (and humans), we need to challenge our beliefs. We need to question axioms.
The image you see above is an art project by Chris Harrison and Christoph Römhild (link). It visualizes the cross-references between Bible verses. We can imagine internal links the same way.
Internal linking is the practice of adding hyperlinks between pages on the same site.
A link (whether external or internal) looks like this in HTML:
<a href="https://www.domain.com/">anchor text</a>
They’re called “internal links” because they point at a page on the same domain.
The purpose of internal links is to guide visitors through a site and help them discover what they’re looking for. But they’re also immensely helpful for search engines to crawl and discover the web.
Internal links point at pages on the same domain they’re linking from while external links point at a page on another domain.
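To make the distinction concrete, here’s a minimal sketch in Python of how a crawler might classify a link (the `is_internal` helper is made up for illustration; note it treats subdomains like `www.domain.com` and `blog.domain.com` as different hosts):

```python
from urllib.parse import urljoin, urlparse

def is_internal(page_url: str, href: str) -> bool:
    """Classify a link found on page_url as internal (same host) or external."""
    # Resolve relative hrefs like "/pricing" against the page URL first.
    target = urljoin(page_url, href)
    return urlparse(target).netloc == urlparse(page_url).netloc

print(is_internal("https://www.domain.com/blog/", "/pricing"))             # internal
print(is_internal("https://www.domain.com/blog/", "https://example.org"))  # external
```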
The difference between internal and external links is straightforward but we often forget that they’re deeply interconnected. As I wrote in Internal Link Optimization with TIPR:
We know PageRank is one of the biggest ranking factors in SEO, but we often forget that it flows between and within websites! The common practice is to optimize internal linking without taking backlinks into account. That, however, can lead to making the wrong decisions.
We distinguish between outgoing and incoming links.
| Link type | Description |
| --- | --- |
| Internal outgoing link | Points from the page you’re looking at to another page on the same domain |
| Internal incoming link | Points from another page on the same domain to the page you’re looking at |
| External outgoing link | Points from the page you’re looking at to a page on another domain |
| External incoming link | Points from another page on another domain to the page you’re looking at |
Internal links distribute PageRank just as much as external links.
What is PageRank?
PageRank is an algorithm named after Larry Page that assesses the importance of a web page based on the relevance and authority of the pages linking to it.
Consider backlinks (links from other sites) a “vote”. The more votes a page has, the higher it tends to appear in the search results. Mind you that Google takes many signals into account when it comes to ranking documents and backlinks are just one of them.
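The “voting” idea can be sketched as a simple power iteration. This is a toy version of the original formula (with the classic 0.85 damping factor), not what Google runs today:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: each page splits its score ("vote") evenly across its outgoing links."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for t in targets:
                new[t] += damping * pr[page] / len(targets)
        pr = new
    return pr

# Tiny three-page site: home links to both subpages, both link back to home.
graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home"]}
scores = pagerank(graph)
# "home" collects two incoming "votes", so it ends up with the highest score.
```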
Google no longer uses the original PageRank formula, even though PageRank still seems to play some role in ranking. There used to be a browser toolbar that showed the PR of every page you visited, but it stopped being updated in 2013. In other words: we no longer know the actual PageRank of a page. But we can rely on 3rd-party tools for metrics that resemble PageRank. Those proprietary metrics are called Domain Authority (from Moz), Page Strength (from SEMrush), or Domain Rating (from Ahrefs).
You could say the concept of PageRank made Google what it is today: the most successful startup in history. Over the years, Google registered many patents that tell the story of the PageRank evolution.
What is CheiRank?
CheiRank measures the importance of a page based on its outgoing links. It’s an inverse PageRank: a concept that describes how communicative a node is within a corpus. Let me translate that into plain English. If the idea of PageRank is to measure the “strength” of incoming links to a page, the idea of CheiRank is to measure the strength of outgoing links from a page.
Notice that the importance of CheiRank is relative: it’s most likely measured in comparison to all pages of a domain. You can use it to identify important hub pages, for example. In that sense, you could equate “communicativeness” with the number of outgoing links of a page.
Optimizing CheiRank can be tricky because there is no perfect value. There is just an optimal value for a certain context. I tried to outline that context as well as possible in The best internal linking structure depends on your business model.
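As a rough sketch (following the “inverse PageRank” definition above, with a toy 0.85 damping factor), CheiRank can be computed by running a PageRank-style iteration on the reversed link graph:

```python
def cheirank(links, damping=0.85, iterations=50):
    """Toy CheiRank: PageRank run on the reversed link graph, so pages with
    strong *outgoing* links score high."""
    pages = set(links) | {t for ts in links.values() for t in ts}
    # Flip every edge: an outgoing link becomes an incoming one.
    reversed_graph = {p: [] for p in pages}
    for page, ts in links.items():
        for t in ts:
            reversed_graph[t].append(page)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, ts in reversed_graph.items():
            for t in ts:
                new[t] += damping * score[page] / len(ts)
        score = new
    return score

# A hub page linking out to three subpages gets the top CheiRank.
graph = {"hub": ["a", "b", "c"]}
```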
To give you a taste of what this metric looks like, check out the screenshot above. This is a snippet of an analysis I did years ago, when I still worked at Atlassian, that looked at PageRank and CheiRank with Audisto’s crawler. I present more details in my article about TIPR (linked in the section about differences between internal and external links), but the lessons from assessing CheiRank helped us optimize the marketplace and significantly increase traffic.
There are two types of internal links: content and module links. A content link is simply a link within the body content of a page (see the yellow field in the graphic below).
Module links exist in a variety of navigation elements like top header navigation, footer, or related articles (orange fields in the graphic above). Linking out of modules allows for a much more systematic adjustment of your internal link graph. It’s much more scalable than linking out of body content.
Besides being helpful to human users, internal links distribute PageRank throughout a site and help search engine crawlers find all relevant pages of a domain. In theory, the more PageRank a web page receives the better it ranks in search.
PageRank is just one of many ranking signals but the goal of internal link optimization is to drive more organic traffic to your site.
It’s not live anymore but the Search Console documentation mentioned that internal links are a signal for the relative importance of a page to Google (this article still has a screenshot).
Matt Cutts confirmed this in an interview with Stone Temple (now Perficient):
“There is also not a hard limit on our crawl. The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”
Matt Cutts
To optimize an internal link graph, find broken links, and regularly audit your site, you can use either specialized crawlers or some rank trackers:
Rank trackers with crawling capabilities:
Enterprise SEO tools for internal link optimization:
SEO is a unique field of imperfect information and rapid change. If we don’t constantly challenge our beliefs, we fall behind. That’s why I built this guide around axioms.
I will list evidence (mostly from Google statements) for and against each axiom to either confirm the belief or bust the myth. Nothing is black or white and many things depend on context. I’ll also add my personal experience, which you should take with a huge grain of salt.
Axiom: “Too many outgoing links hurt rankings”
This axiom could also be understood as “too many outgoing links decrease the chance of a web page ranking high for a target keyword”. There is some residual confusion in the SEO space because Google changed its stance on this over the years.
True: The webmaster guidelines recommend to “Limit the number of links on a page to a reasonable number (a few thousand at most).“
False: An old Google blog post from 2008 says “Including too many links on one page confuses visitors (we usually encourage webmasters to not have much more than 100 links per page).” However, you’ll notice that the 100-links point has since been removed from the article it links to.
Matt Cutts confirmed in a video from 2013 that the old recommendation is outdated and that it’s common for some pages to have 200, 300, 400, or more links, as long as they’re valuable.
False: John Mueller mentioned that outgoing links diluting PageRank is not a thing.
False: John also mentioned that too many links are less of a problem than not enough high-quality content.
Personal experience: In 2013, I worked with a big public company on getting them out of a Panda filter. They had thousands of pages that each carried 2,000+ outgoing links and no content. Cutting those hub pages down and adding some content turned the site around and got it back to baseline (it had lost ~70% of its traffic). Hearing John’s statement, it makes sense why adding content and trimming links worked.
I think this axiom was born from the idea of bad/good neighborhood sites. The thought is that linking to and being linked from low-quality/spammy/penalized sites puts you in a “bad neighborhood”.
True: There is an older post from 2008 on Google’s blog stating that outbound links can hint at your credibility. Be careful, though: that’s the same article in which Google states that more than 100 outgoing links can be problematic. So, this is probably outdated.
The same article points out that “[…] from a search engine perspective, comment spam can connect your site with bad neighborhoods instead of legitimate resources.“
Personal experience: I’ve never seen anyone test this and I never noticed a ranking increase from linking out to more (authoritative) sources. That being said, just linking to an authoritative source for linking’s sake doesn’t make sense to me and I wouldn’t recommend it. I do very much promote linking to other sites when it’s useful, though.
Axiom: “You should use the internal anchor text of the keyword you want to rank for”
Anchor text is a strong link signal that helps search engines understand what a page is about. The idea behind this axiom is to signal higher relevance for a keyword to Google by using the same anchor text every time you link to a page.
True: Google clearly recommends to “Choose descriptive text” in its SEO starter guide and says “Think about anchor text for internal links too”.
Link text is the visible text inside a link. This text tells users and Google something about the page you’re linking to. Links on your page may be internal—pointing to other pages on your site—or external—leading to content on other sites. In either of these cases, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you’re linking to is about.
Google SEO starter guide
True: On Twitter, John Mueller confirms that anchor text provides context.
True: A patent called “topic-sensitive PageRank” (linked in the patents section) took PageRank from a simple numbers game to a focus on relevance. Even though a patent registered by Google doesn’t have to mean it’s actually in use, it’s highly likely that this one was embodied in the evaluation of links, as Google reps so often allude to the importance of relevance.
True: A series of patents about the concept of “phrase-based indexing” describes how relevance could be determined by whether anchor text appears on the linked-to page. Bill Slawski describes it best:
Anchor text in links pointing to a page that includes the phrase or a related phrase (one that tends to co-occur on pages that rank for that phrase) should be given more weight than anchor text that doesn’t.
SEO by the Sea
Personal experience: In my experience, the relevance of external links has become much more important over time than just the number: quality > quantity. When it comes to internal links, in-body links seem to move the needle a bit more than module links.
There are two schools of thought on internal anchor text: either vary it or keep it consistent. My personal stance is a barbell strategy: most internal anchor text should be the keyword you want to rank for, some should be varied. Don’t overthink it.
True: In an article from 2008, Google recommends using descriptive anchor text.
“Writing descriptive anchor text, the clickable words in a link, is a useful signal to help search engines and users alike to better understand your content. The more Google knows about your site—through your content, page titles, anchor text, etc.—the more relevant results we can return for users (and your potential search visitors).”
Google blog
Personal experience: I couldn’t find much evidence against this, and I think that’s because linking to the same target several times with different anchor text usually only happens by accident. My recommendation is to regularly audit your internal anchor text and see if you can make small changes to streamline internal linking, but don’t obsess over it.
True: Mueller confirmed in a video that Google “looks” more at the main content than, for example, the footer.
So with this specific situation – usually what happens here is we do focus on the primary content on the page. And that’s something that make sense from a user point of view.
John Mueller
True: There are many patents that show how Google might distinguish between main content, footer, and other content blocks (example).
True: In 2016, Mueller confirms that Google evaluates links in the main content versus footer differently.
So I think there are two aspects there, on the one hand, this is the area of the page where you have your primary content, the content that this page is actually about, not the menu, the sidebar, the footer, the header… Then that is something that we do take into account and we do try to use those links.
True: Another Google patent, granted in 2010, describes taking user behavior into account when assigning link weights. In other words, Google might evaluate how likely users are to click on a link and give more weight to links that are likely to be clicked.
Personal experience: While I agree that the value from a footer link is lower than from the main body, I think footer links still have value for crawling. They may help search engines find pages better and technically lower the click depth to important pages. To me, they’re not useless at all.
Axiom: “You shouldn’t use internal nofollow”
Internal nofollow tags were originally used for “PageRank sculpting”, an outdated method of preventing the flow of PageRank to unimportant pages. Since then, they’ve been used to keep Google from crawling certain pages when methods like robots.txt fail. However, some say that nofollow is a signal of distrust, and using it internally doesn’t make sense.
In 2019, Google announced it would treat nofollow as a hint instead of a directive and introduced two new attributes: rel=sponsored and rel=ugc. With that, Google started to count nofollow as a link signal.
True: Matt Cutts confirmed (in 2009) that PageRank sculpting doesn’t work:
“Nofollow is a method (introduced in 2005 and supported by multiple search engines) to annotate a link to tell search engines ‘I can’t or don’t want to vouch for this link.’ In Google, nofollow links don’t pass PageRank and don’t pass anchor text.”
Search Engine Land
True: Pages that have internal links marked “nofollow” can still be crawled if other links without nofollow point at them. What crawl budget means for Googlebot: “Any URL that is crawled affects crawl budget, so even if your page marks a URL as nofollow it can still be crawled if another page on your site, or any page on the web, doesn’t label the link as nofollow.“
Personal experience: I’ve never been a fan of internal nofollow links. That being said, I recently removed a lot of nofollow links from a large domain and didn’t notice a big impact outside of more crawled pages.
Axiom: “Click depth should not be higher than 3”
Click depth or “crawl depth” is the number of clicks it takes to get from a homepage to your most important pages.
False: In the video above, John Mueller confirmed that click depth is important but didn’t mention a specific number.
Personal experience: I’ve learned to aim for a site structure that keeps click depth as low as possible, but not at the cost of user experience (say, by stuffing your homepage full of links just to reduce depth). You want to link to your most important pages from the homepage, but it’s okay if some pages take 8 or 10 clicks to reach. Everything over 10 clicks from the homepage makes me suspicious.
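Click depth is just a shortest-path computation. A minimal sketch in Python (assuming a hypothetical adjacency-list format, e.g. built from a crawler export):

```python
from collections import deque

def click_depth(links, start="home"):
    """Breadth-first search from the homepage: depth = minimum clicks to reach a page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first time seen = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

site = {
    "home": ["category"],
    "category": ["product-a", "product-b"],
    "product-a": ["product-b"],
}
depths = click_depth(site)
# Pages missing from the result are orphans: unreachable via internal links alone.
```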
True: Google’s “how search works” documentation has a couple of crawl budget references.
True: Matt Cutts confirmed this in an interview with Stone Temple (now Perficient).
“There is also not a hard limit on our crawl. The best way to think about it is that the number of pages that we crawl is roughly proportional to your PageRank. So if you have a lot of incoming links on your root page, we’ll definitely crawl that. Then your root page may link to other pages, and those will get PageRank and we’ll crawl those as well. As you get deeper and deeper in your site, however, PageRank tends to decline.”
Personal experience: Internal links help Google a great deal when assessing which pages to crawl and how often. Log files show you how efficient your internal linking structure is. Adding more internal links to pages that are crawled less often (and rank worse) than comparable pages can have a big impact on traffic.
False: Google’s John Mueller denied this myth (see below).
True: In 2013, Matt Cutts made a video about PageRank being divided by the number of links.
False: In an interview from 2010 with Stone Temple (referenced above), Eric Enge asked Matt Cutts “Let’s say you have a page that has ten other pages it links to. If three of those pages are actually duplicates which get discarded, have you then wasted three of your votes?“. His response: “Well, not necessarily. That’s the sort of thing where people can run experiments. What we try to do is merge pages, rather than dropping them completely. If you link to three pages that are duplicates, a search engine might be able to realize that those three pages are duplicates and transfer the incoming link juice to those merged pages.“
Personal experience: I wouldn’t worry too much about splitting PageRank up. What often happens is that you link to a page in the top navigation and in the body content or footer. I don’t think it’s a problem as long as it doesn’t happen too much.
True: Google states in a blog post that crawl budget can suffer when linking to low-quality content. Low-quality content is usually defined as duplicate, thin, or spammy content.
Personal experience: While I think linking to low-quality pages is not a good idea, the problem is having low-quality content in the first place! I would try to fix that, first.
Axiom: “Use rel=next/prev for pagination”
Rel=next/prev was a staple in SEO until Google announced in 2019 that they no longer use it.
True: Ilya Grigorik from Google mentioned that Googlebot is smart enough to figure it out and that there might be other good reasons to implement rel=prev/next (it’s still a W3C standard).
Personal experience: I think Google leaves us hanging a bit here. I see paginated pages all over the index but no good solution. Google recommends “view-all pages”, infinite scroll, or just “letting Google figure it out”. All of them are wonky options.
Axiom: “Use HTML sitemaps to optimize internal linking structure”
True: Google stated in 2010 that “59% of our submissions did not have a user-viewable site map. By providing one, you display the structure of your site and give the user easy one-click navigation. If users are having trouble finding specific pages on your site, a site map can help them find their way. Don’t send your users into the wild without a map!“
True: In Google’s SEO guide, they recommend to “Include a simple navigational page for your entire site (or the most important pages, if you have hundreds or thousands) for users.“
Personal experience: I’m a big fan of HTML sitemaps. They won’t double organic traffic overnight, but they can really help search engines discover all pages on a domain and reduce click depth.
Axiom: “Don’t use anchor text like ‘click here’ for internal linking”
True: Google’s SEO starter guide says “You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and Google navigate your site better.“
Personal experience: I’m a fan of making anchor text very specific. In aggregate, picking good anchor text can move the needle.
Let’s dive into some best practices for internal link optimization.
Broken links are a bad experience for users and search engines. In a nutshell, avoid linking to any page with a status code other than 200.
Google recommends that directly in their SEO Guide: “Avoid letting your navigational page become out of date with broken links.”
My suggestion: It’s hard to prevent broken links for large sites but it’s worth cleaning up. You can find broken links with the tools I mention in the tool section.
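For smaller sites, a broken-link check is easy to script yourself. Here’s a small sketch using only Python’s standard library (the function names are made up, and a real crawl would need rate limiting and retries; the example below uses a stubbed status lookup so it runs without live HTTP requests):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url, timeout=10):
    """HEAD-request a URL and return its HTTP status code (None on network errors)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return None

def find_broken_links(edges, status_of=fetch_status):
    """edges: (source_page, linked_url) pairs from a crawl.
    Returns (source, target, status) for every link that doesn't resolve to 200."""
    statuses = {dst: status_of(dst) for _, dst in edges}  # fetch each URL once
    return [(src, dst, statuses[dst]) for src, dst in edges if statuses[dst] != 200]

# Stubbed status lookup instead of live requests:
fake = {"https://www.domain.com/ok": 200, "https://www.domain.com/gone": 404}.get
broken = find_broken_links(
    [("https://www.domain.com/", "https://www.domain.com/ok"),
     ("https://www.domain.com/", "https://www.domain.com/gone")],
    status_of=fake,
)
# broken → [("https://www.domain.com/", "https://www.domain.com/gone", 404)]
```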
A healthy site structure has as few non-200 status code links as possible (see example in Oncrawl below).
Optimize internal anchor text
I recommend two fixes:
- making sure you don’t use “click here”, “here”, or similar anchor text
- optimizing most anchor text to be the keyword the linked target wants to rank for
I personally try to go for an 80/20 distribution for hard keyword anchor text (80%) vs variety (20%) but wouldn’t make this your biggest SEO bet.
My suggestion: Regularly (every 3 months) audit your internal anchor text with Screaming Frog and look for too much variety and too little consistency. I go with an 80/20 barbell approach of focus vs. variety.
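An anchor-text audit can be sketched in a few lines of Python, assuming you’ve exported (target URL, anchor text) pairs from a crawler; the helper name and the list of “generic” anchors are my own illustration:

```python
from collections import Counter, defaultdict

GENERIC_ANCHORS = {"click here", "here", "read more", "this", "link"}

def audit_anchors(link_data):
    """link_data: (target_url, anchor_text) pairs from a crawl export.
    Returns the anchor distribution per target and a list of generic anchors to rewrite."""
    per_target = defaultdict(Counter)
    generic = []
    for target, anchor in link_data:
        text = anchor.strip().lower()
        per_target[target][text] += 1
        if text in GENERIC_ANCHORS:
            generic.append((target, anchor))
    return per_target, generic

crawl = [
    ("/internal-linking", "internal linking"),
    ("/internal-linking", "internal linking"),
    ("/internal-linking", "click here"),
]
per_target, generic = audit_anchors(crawl)
# 2 of 3 anchors carry the target keyword; the "click here" link gets flagged.
```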
If you want to take it to the next level, match the anchor text of your external backlinks with the one from your internal links. This can meaningfully increase your potential to rank for the target keyword.
Look at the screenshots of Ahrefs backlink and internal links section below: in both cases, I looked for “tech bound” (my newsletter you should subscribe to) so I can compare whether I’m using the same internal anchor text as I get from external links.
When you crawl your site and see that a large number of pages have only one incoming link, it’s usually a bad sign. Why? Because either the pages are poorly linked and thus get very little PageRank, or they’re low-quality and you deliberately link to them as little as possible.
My suggestion: In Google’s Search Console, you can navigate to “Links” > “Top linked pages – internally” and sort the list ascending to see which pages on your site get the fewest internal links. It might be time to link them a bit more!
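If you prefer working from a crawl export instead of Search Console, the same check is a simple count over (source, target) link pairs. A hypothetical sketch:

```python
from collections import Counter

def incoming_link_counts(edges, all_pages):
    """edges: (source, target) pairs from a crawl.
    all_pages makes sure orphaned pages show up with a count of 0."""
    counts = Counter({page: 0 for page in all_pages})
    counts.update(target for _, target in edges)
    return counts

edges = [("/", "/blog"), ("/", "/about"), ("/blog", "/about")]
counts = incoming_link_counts(edges, ["/", "/blog", "/about", "/orphan"])
# Sort ascending to surface the least-linked pages first.
least_linked = sorted(counts.items(), key=lambda kv: kv[1])
```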
On top of more internal links, those pages might also need a content update. That’s especially the case for large sites. Typical candidates are user profiles, old products, or outdated product feature pages. If they’re not needed anymore, give them a 410 status code and remove the internal links.
Use breadcrumb navigations
Breadcrumb navigations are very common these days, so most sites have them anyway. Not only do they create nice rich snippets in the search results, they can also be an important part of a site’s crawlability and internal linking structure.
Google’s SEO starter guide explicitly mentions breadcrumbs (see below) as best practice.
My suggestion: Make sure you have every step of your site taxonomy reflected in the breadcrumb and apply structured data for rich snippets.
Don’t use trails like
home > products > chair
Instead, go for something more descriptive like
home > furniture > chair
home > furniture > kitchen > chair
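To apply structured data for rich snippets, a breadcrumb trail can be serialized as schema.org BreadcrumbList JSON-LD. A minimal sketch in Python (the helper name is my own; the schema.org types are real, and the output would go inside a `<script type="application/ld+json">` tag):

```python
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) tuples from the homepage down to the current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

trail = [
    ("Home", "https://www.domain.com/"),
    ("Furniture", "https://www.domain.com/furniture/"),
    ("Chair", "https://www.domain.com/furniture/chair/"),
]
markup = breadcrumb_jsonld(trail)
```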
The goal of internal linking strategies is to improve the link graph of a site according to the type of site.
In The best internal linking structure depends on your business model, I explain that site structures can follow a centralized or decentralized model, depending on whether visitors can convert on every page or only on a few. Centralized sites are usually very content-heavy: think SaaS, agencies, or closed marketplaces. Decentralized sites (public marketplaces, consumer sites, social networks) scale with their inventory.
For inventory-driven or decentralized sites, the goal is to provide as many relevant connections between products and categories as possible to increase sales and discoverability.
Even though I highlight many examples from the e-commerce segment, most of them are very transferable to other business models as well. Steal like an artist.
More from the same vendor
Horizontal linking taxonomies
Most e-commerce sites have vertical taxonomies by default: product categories, sub-categories, and products. Horizontal taxonomies are brands, locations, occasions, personas, etc.
An example of a horizontal taxonomy is Best Buy’s Samsung category, which links to several subcategories within Samsung.
Another example of a horizontal linking taxonomy can be found on Target’s homepage (notice the “Halloween” and “Ready for school” categories).
Frequently bought together
Product bundles are some of the best (maybe THE best?) ways to semantically connect two products that have a strong relationship but live in different categories.
Topic siloing: The idea of topic siloing is to link articles that revolve around the same topic to each other, and less to articles in other topics. You can do the same with products within a category. This ensures that anchor text stays relevant and articles remain in close topical proximity.
An evolution of topic siloing is topic clusters. The concept is that a long-form article (pillar) covers a short-head keyword, while many shorter articles (clusters) cover long-tail keywords related to that short-head keyword and link back to the pillar. This creates close topical relationships within the link structure.
Related articles: Linking to related articles at the bottom of a page is table stakes and should be part of any site. Yet some still forget it. Don’t make that mistake! Link to related articles and make sure the sensitivity is set to high (in plugins with that option) so you only link to truly relevant articles.
And this is it, friends. If you feel like I missed something, please add it to the comments and I’ll make sure to add it.
If you’re interested in more content like this, subscribe to my newsletter Tech Bound.