The Definitive Handbook for Technical SEO

Discover Techniques to Enhance Your Website’s Rankings with a Technical SEO Audit Tutorial

Name three SEO tasks you accomplished this year.

Are your tasks focused on backlinks, meta descriptions, and keyword research?

It’s a common approach to SEO, and many marketers start with these techniques in their quest for higher search rankings.

While these methods can certainly enhance your site’s organic visibility, they are only part of the picture. SEO encompasses a broader range of tactics that you should explore.

Technical SEO pertains to the behind-the-scenes components that fuel your organic growth engine, encompassing aspects like page speed, mobile optimization, and site architecture. These facets of SEO may not be glamorous, but they are vital to your success.

To enhance your technical SEO, the initial step is to assess your current status by conducting a site audit. Following that, you must create a plan to address any weaknesses you discover. In this article, we will delve deeper into these two stages.

What Exactly is Technical SEO?

In essence, technical SEO includes all the actions you take to optimize your site for search engine crawlers and indexing. Technical SEO, content strategy, and link-building tactics collectively support your quest for high search rankings.

Technical SEO vs. On-Page SEO vs. Off-Page SEO

To simplify the concept of search engine optimization (SEO), it is often divided into three categories: on-page SEO, off-page SEO, and technical SEO. Here’s a brief overview of each.

On-Page SEO

On-page SEO comprises the content that conveys to search engines (and readers) what your page is all about. It encompasses image alt text, meta descriptions, keyword usage, H1 tags, URL naming, and internal linking. On-page SEO is under your control because everything is on your website.

Off-Page SEO

Off-page SEO demonstrates to search engines the popularity and utility of your page through votes of confidence, especially backlinks, or links from external sites to your site. The quantity and quality of backlinks enhance a page’s PageRank. All else being equal, a page with 100 relevant backlinks from trustworthy sites will outperform a page with 50 relevant backlinks from trustworthy sites (or 100 irrelevant backlinks from trustworthy sites).

Technical SEO

Technical SEO covers the behind-the-scenes elements introduced above, such as crawling, indexing, page speed, and site architecture. You have control over it as well, but mastering it can be more challenging since it is less intuitive.

Why is Technical SEO Vital?

While it may be tempting to disregard technical SEO altogether, it is a critical element of your organic traffic. Your content may be thorough, useful, and well-crafted, but unless a search engine can crawl it, very few people will ever see it.

It’s like the proverbial tree falling in the forest with no one around to hear it: does it make a sound? Without a robust technical SEO foundation, your content won’t make a sound to search engines.

Comprehending Technical SEO

Technical SEO can seem daunting, but it’s best to break it down into manageable pieces. If you’re like me, you prefer to tackle large tasks in sections and with checklists. Helpfully, everything we cover here can be grouped into one of five categories, each of which warrants its own list of actionable items.

These five categories sit in a hierarchy of technical SEO needs, often pictured as a pyramid reminiscent of Maslow’s Hierarchy of Needs but with a twist for search engine optimization: crawlability at the base, then indexability, accessibility (the term “rendering” is commonly used in its place), rankability, and finally clickability.

Basics of Technical SEO Audit

Before you embark on your technical SEO audit, there are a few fundamentals to put in place: auditing your preferred domain, implementing SSL, and optimizing page speed. Let’s go over these essential elements before moving on to the rest of the audit.

It is crucial to audit your preferred domain before proceeding with a technical SEO audit. Your domain, the URL people use to reach your site, is central to searchability and site identification.

Selecting a preferred domain tells search engines whether you want the www or non-www version of your site to appear in search results. Settling on one version ensures that all users land on the same URL and that SEO value isn’t split across duplicates. Previously, Google required webmasters to specify their preferred version; it now selects one automatically, but you can still signal your preference through canonical tags if desired.

Whichever method you choose, make sure every variant, including www, non-www, http, and index.html, permanently (301) redirects to the preferred version.
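If you want to spot-check those redirects yourself, the short script below is a minimal sketch using the Python requests library; the preferred URL and domain variants are placeholders to swap for your own.

```python
# Minimal sketch: verify that every domain variant 301-redirects to the preferred version.
# Assumes the `requests` library is installed; example.com and PREFERRED are placeholders.
import requests

PREFERRED = "https://www.example.com/"
VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/index.html",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]  # e.g. [301] for a single permanent redirect
    ok = resp.url == PREFERRED and all(code == 301 for code in hops)
    print(f"{url} -> {resp.url} via {hops or 'no redirect'} {'OK' if ok else 'CHECK'}")
```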

The term SSL comes up constantly, and it’s an essential aspect of website security. SSL, or Secure Sockets Layer (in practice, its modern successor TLS), encrypts the connection between the browser and the web server that handles online requests. With SSL in place, any information a user sends to your website, such as contact details or payment information, is far harder to intercept.

You can easily identify a website with an SSL certificate by the domain starting with “https://” instead of “http://” and a padlock icon in the URL bar.

Search engines give priority to secure sites, and Google announced back in 2014 that SSL would be a ranking factor. Therefore, it’s essential to set up SSL for your website and make sure that the SSL variant of your homepage is your preferred domain.

After implementing SSL, it’s crucial to migrate all non-SSL pages from http to https. Although it may seem like a daunting task, it’s worth the effort for better search engine ranking. Follow these steps to complete the migration process:

Redirect all http://yourwebsite.com pages to https://yourwebsite.com.
Update all canonical and hreflang tags accordingly.
Update the URLs on your sitemap (located at yourwebsite.com/sitemap.xml) and robots.txt (located at yourwebsite.com/robots.txt).
Add a property for your https site in Google Search Console and Bing Webmaster Tools, and track traffic to confirm that 100% of it migrates over.
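Several of these steps can be spot-checked with a short script. The sketch below is a rough illustration using the requests library and placeholder URLs: it confirms that each http URL permanently redirects to its https counterpart and that the canonical tag on the destination page points to an https address.

```python
# Rough sketch: check http -> https redirects and the canonical tag on the destination page.
# Assumes the `requests` library; PAGES is a placeholder list of your own URLs.
import requests
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of the <link rel="canonical"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

PAGES = ["http://example.com/", "http://example.com/blog/"]

for url in PAGES:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    permanent = bool(resp.history) and all(r.status_code == 301 for r in resp.history)
    parser = CanonicalParser()
    parser.feed(resp.text)
    canonical_https = bool(parser.canonical) and parser.canonical.startswith("https://")
    print(url, "->", resp.url,
          "| 301s only:", permanent,
          "| canonical:", parser.canonical,
          "| canonical is https:", canonical_https)
```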

Improving the speed of your website should be a top priority: visitors expect pages to load within six seconds, and Google’s research shows that the probability of a bounce increases by 90% as page load time grows from one second to five seconds. Page speed is also a ranking factor, so it’s essential to optimize it for both user experience and SEO.

To improve your page speed, try implementing the following tips:

Compress your files, including images, CSS, HTML, and JavaScript files, to reduce their size and load faster.
Regularly audit your redirects to minimize the number of 301 redirects, which take time to process and slow down your site.
Clean up your code and minify it to make it more efficient, so it loads faster. Avoid using too many plugins, and consider custom-made themes to minimize unnecessary code.
Use a content distribution network (CDN) to store copies of your website in various locations, making it faster for users to access your site based on their location.
Leverage cache plugins to store a static version of your site, reducing the time it takes to load your website for returning users.
Use asynchronous (async) loading for scripts so they download in parallel with HTML parsing instead of blocking it, resulting in faster perceived load times.
Use Google’s page speed tools to identify and diagnose speed issues on your website.
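If you prefer to script this kind of check, the sketch below queries Google’s PageSpeed Insights API (v5) for a single URL and prints a couple of headline metrics. Treat it as a hedged example: the target URL is a placeholder, the response fields reflect the v5 API as commonly documented, and regular use calls for an API key.

```python
# Sketch: fetch a Lighthouse performance score from the PageSpeed Insights API (v5).
# Assumes the `requests` library; the target URL is a placeholder, and heavy use needs an API key.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(API, params=params, timeout=60).json()
lighthouse = data.get("lighthouseResult", {})
score = lighthouse.get("categories", {}).get("performance", {}).get("score")
lcp = lighthouse.get("audits", {}).get("largest-contentful-paint", {}).get("displayValue")

print("Performance score (0-1):", score)
print("Largest Contentful Paint:", lcp)
```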

With these optimizations in place, you’ll be ready to focus on crawlability to further improve your website’s SEO.

Ensuring your website is crawlable is a critical part of your technical SEO strategy. Search engine bots crawl your website to gather information about your site, and if they are unable to crawl your pages, they cannot index or rank them. Therefore, the first step in implementing technical SEO is to guarantee that your essential pages are accessible and easy to navigate.

Below are some items to add to your crawlability checklist, as well as some website elements to audit to ensure that your pages are ready for crawling.

Checklist for Crawlability:

Generate an XML sitemap that lists all pages on your site.
Maximize your crawl budget by avoiding duplicate content and unnecessary pages.
Optimize your site architecture for easy navigation and efficient crawling.
Set a clear and consistent URL structure for your pages.
Utilize robots.txt to control which pages should be crawled and which shouldn’t.
Implement breadcrumb menus to help search engines understand the hierarchy of your site.
Use pagination for long pages or article lists to break them into smaller, more manageable sections.
Regularly check your server log files to identify and fix any crawling or indexing issues.

To improve your website’s crawlability, create an XML sitemap that includes your site structure. This sitemap acts as a map for search bots and helps them understand and crawl your web pages. Once you’ve completed your sitemap, submit it to Google Search Console and Bing Webmaster Tools. Make sure to keep your sitemap up-to-date as you add and remove web pages to ensure that search bots can easily find and crawl your content.
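If your CMS doesn’t build a sitemap for you, a minimal one is straightforward to generate. The sketch below uses only the Python standard library and a placeholder URL list; real sites would typically pull the list from a database or a crawl.

```python
# Minimal sketch: write a bare-bones XML sitemap using only the standard library.
# The URL list is a placeholder; swap in the pages you want search bots to find.
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/grooming-brush",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```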

Your website’s crawl budget refers to the number of pages and resources search bots can crawl within a given time frame. Since crawl budget is limited, it’s crucial to prioritize the most important pages for crawling.

To maximize your crawl budget, follow these tips:

Remove or canonicalize duplicate pages to avoid confusing search bots.
Fix or redirect any broken links to ensure search bots can access your pages.
Ensure your CSS and JavaScript files are crawlable.
Regularly check your crawl stats for any significant changes.
Confirm that any bot or page you’ve disallowed from crawling is intended to be blocked.
Keep your XML sitemap updated and submit it to relevant webmaster tools.
Eliminate any outdated or unnecessary content from your site.
Be aware of dynamically generated URLs that can increase the number of pages on your site.
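One way to hunt for the duplicate and dynamically generated pages mentioned in this list is to hash page bodies and group URLs that return identical content. The sketch below is a rough illustration using the requests library and placeholder URLs; real audits usually rely on a crawler, but the idea is the same.

```python
# Rough sketch: flag URLs that return byte-identical HTML, a common sign of duplicate content.
# Placeholder URLs; a real audit would feed in a full crawl or sitemap list.
import hashlib
from collections import defaultdict

import requests

urls = [
    "https://www.example.com/product?id=1",
    "https://www.example.com/product?id=1&ref=home",
    "https://www.example.com/product?id=2",
]

pages_by_hash = defaultdict(list)
for url in urls:
    body = requests.get(url, timeout=10).content
    pages_by_hash[hashlib.sha256(body).hexdigest()].append(url)

for digest, group in pages_by_hash.items():
    if len(group) > 1:
        print("Possible duplicates:", group)
```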

It is crucial to optimize your site architecture to ensure that search engines can easily find and crawl your website’s pages. Your information architecture, or site structure, organizes those pages in much the same way that an architectural plan organizes a building.

To ensure that search bots understand the relationship between your pages, it is vital to group related pages together. For instance, your blog homepage should link to individual blog posts that, in turn, link to their respective author pages.

Additionally, your site architecture should reflect the importance of individual pages and vice versa. The closer a page is to your homepage, the more link equity it has, and the more importance search engines assign to it.

For instance, a link from your homepage to a specific page carries more significance than a link from a blog post. The more links to a particular page, the more “significant” it becomes to search engines.

Site architecture can be pictured as a hierarchy of page importance, with the homepage at the top and key pages such as About, Products, and News sitting directly beneath it.

Ensure that the most critical pages for your business feature prominently at the top of the hierarchy with the maximum number of relevant internal links.

Establishing a URL structure entails deciding how to organize your URLs, which can be influenced by your site architecture. URLs can consist of subdomains, such as blog.hubspot.com, and/or subfolders, such as hubspot.com/blog, that indicate the URL’s destination.

For instance, a blog post titled “How to Groom Your Dog” would fall under the blog subfolder, giving a URL like www.bestdogcare.com/blog/how-to-groom-your-dog (or blog.bestdogcare.com/how-to-groom-your-dog if you opted for a subdomain). On the same site, a product page might live at www.bestdogcare.com/products/grooming-brush.

Whether to use subdomains or subdirectories or “products” versus “store” in your URLs is entirely up to you. The beauty of creating your website is that you can make the rules. However, it’s critical to follow a consistent structure across your URLs, which means you shouldn’t use blog.yourwebsite.com and yourwebsite.com/blogs on different pages. Develop a roadmap, apply it to your URL naming structure, and stick to it.

Here are a few additional tips for crafting your URLs:

Use lowercase characters.
Use dashes to separate words.
Keep them brief and descriptive.
Avoid using extraneous characters or words (including prepositions).
Incorporate your target keywords.
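If you generate URLs programmatically, these conventions are easy to enforce in code. The helper below is a simple illustration using only the Python standard library; extend the stop-word set if you also want to drop prepositions.

```python
# Simple illustration: turn a page title into a short, lowercase, dash-separated slug.
# Add prepositions such as "to" or "of" to STOP_WORDS if you want even shorter URLs.
import re

STOP_WORDS = {"a", "an", "the", "and"}

def slugify(title: str) -> str:
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    words = [w for w in words if w not in STOP_WORDS]
    return "-".join(words)

print(slugify("How to Groom Your Dog!"))  # -> how-to-groom-your-dog
```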

After finalizing your URL structure, submit a list of the URLs of your crucial pages to search engines in the form of an XML sitemap. This gives search bots extra context about your site so they don’t have to work out its structure as they crawl.

Use robots.txt effectively.

When a web crawler visits your website, it typically checks the Robots Exclusion Protocol file, found at /robots.txt. This file tells specific web crawlers which parts of your site, down to individual sections or pages, they may or may not access. If you want to keep bots from indexing particular pages, use a noindex robots meta tag instead. Both of these scenarios are discussed below.

It might be necessary to block some bots from crawling your website entirely. Regrettably, some bots have nefarious intentions, such as content scraping or spamming community forums. In such cases, you can use your robots.txt to stop them from entering your site. Think of the robots.txt file as your website’s force field against bad bots.

Regarding indexing, search crawlers scour your website for hints and keywords to match your pages with relevant search queries. However, as discussed in the crawl budget section above, bots have a limited budget that you don’t want to waste on unnecessary pages. As a result, you may want to exclude pages that don’t contribute to search crawlers’ understanding of your website, such as a Thank You page from an offer or a login page.

Regardless of what you want to accomplish, your robots.txt protocol will be unique.
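Python’s standard library ships with a robots.txt parser, which makes it easy to confirm that the pages you care about are crawlable and that the ones you’ve excluded really are blocked. A minimal sketch with a placeholder domain and paths:

```python
# Minimal sketch: check which URLs your robots.txt allows a crawler to fetch.
# Uses the standard-library parser; the domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for path in ["/", "/blog/how-to-groom-your-dog", "/thank-you", "/login"]:
    url = "https://www.example.com" + path
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'blocked'}")
```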


Incorporating breadcrumb menus on your website helps users and search bots navigate your site. Like the breadcrumbs in the Hansel and Gretel fairy tale, these menus leave a trail that guides users back to the start of their journey on your website and shows them how their current page relates to the rest of the site.

Breadcrumb menus not only benefit website visitors, but also search bots that crawl your site. It is important to ensure that the menus are visible to users and have structured markup language to give accurate context to search bots. This can improve the user experience on your website and help search engines understand the hierarchy and relationship between your web pages.
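The structured markup in question is usually schema.org BreadcrumbList data embedded as JSON-LD. The sketch below builds that markup in Python for a hypothetical page; in practice your templates or CMS would emit it directly in the page’s HTML.

```python
# Sketch: generate schema.org BreadcrumbList markup as JSON-LD for a hypothetical page.
# In production this is normally rendered by your templates rather than a standalone script.
import json

breadcrumbs = [
    ("Home", "https://www.bestdogcare.com/"),
    ("Blog", "https://www.bestdogcare.com/blog"),
    ("How to Groom Your Dog", "https://www.bestdogcare.com/blog/how-to-groom-your-dog"),
]

markup = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(breadcrumbs, start=1)
    ],
}

# Paste the output into a <script type="application/ld+json"> tag in the page's <head>.
print(json.dumps(markup, indent=2))
```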

Pagination is like numbering the pages on your research paper, but for technical SEO, it plays a different role. It helps organize content series that are split into chapters or multiple webpages. To ensure that search bots easily discover and crawl these pages, you should use pagination.

The process is simple. In the <head> of page one of the series, use rel="next" to point search bots to the second page. On page two, use rel="prev" to indicate the previous page and rel="next" to indicate the next one, and continue the pattern for the rest of the series. This helps crawlers understand how the pages relate to one another. (Note that Google has said it no longer uses rel="next" and rel="prev" as indexing signals, but the markup is harmless and other search engines and tools may still make use of it.)
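As a small illustration, the helper below builds those link tags for any page in a series; the URL pattern is hypothetical, and in practice your templates would emit these tags directly.

```python
# Small illustration: build rel="prev"/rel="next" link tags for page `page` of `total` pages.
# The URL pattern is hypothetical; adapt it to however your series pages are addressed.
def pagination_links(base_url: str, page: int, total: int) -> list[str]:
    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}/page/{page - 1}">')
    if page < total:
        tags.append(f'<link rel="next" href="{base_url}/page/{page + 1}">')
    return tags

for tag in pagination_links("https://www.example.com/guide", page=2, total=5):
    print(tag)
```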

Log files are like a journal entry where web servers record and store data about every action taken on your site. This includes the time and date of the request, content requested, and the requesting IP address. Additionally, log files can identify the user agent, which is a uniquely identifiable software like a search bot.

In the world of SEO, log files can provide valuable insights. Search bots leave a trail in log files, allowing you to determine what was crawled, when, and by whom. By checking the log files and filtering by the user agent and search engine, you can determine how your crawl budget is being spent and which barriers to indexing or access the bot is experiencing. You can access log files by asking a developer or using a log file analyzer like Screaming Frog.
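If you’d rather not reach for a dedicated analyzer, a few lines of Python can give you a first look. The sketch below assumes a combined-format access log at a placeholder path and simply counts which paths Googlebot requested; log formats vary between servers, so treat the regex as a starting point.

```python
# Rough sketch: count which paths Googlebot requested in a combined-format access log.
# Log formats vary between servers, so the regex may need adjusting for your setup.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*".*"(?P<agent>[^"]*)"$')

hits = Counter()
with open("access.log") as log:
    for line in log:
        match = LINE.search(line.strip())
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("path")] += 1

for path, count in hits.most_common(20):
    print(count, path)
```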

However, just because a search bot can crawl your site doesn’t mean it can index all of your pages. The next layer of your technical SEO audit should focus on indexability.

Checklist for Ensuring Indexability

When search bots crawl your website, they aim to index pages that are relevant and topical. Once indexed, your pages have the potential to rank on search engine results pages (SERPs). To increase the chances of your pages getting indexed, consider the following factors:

Site accessibility: Make sure that your site is accessible to search bots by checking your robots.txt file and fixing any crawl errors.
XML sitemap: Include an XML sitemap on your site to help search bots find and index your pages.
Mobile optimization: Optimize your site for mobile devices to ensure that it can be accessed and indexed by mobile search bots.
Page speed: Make sure that your pages load quickly to prevent search bots from timing out before indexing them.
Content quality: Ensure that your pages have high-quality content that is relevant to your target audience and includes relevant keywords.
Internal linking: Link your pages to each other in a logical manner to help search bots navigate and index your site more effectively.
Duplicate content: Avoid duplicating content on your site as this can confuse search bots and hurt your indexability.

It’s important to note that unblocking search bots from accessing pages is a crucial step in ensuring indexability. You may have already addressed this during your crawlability audit, but it’s worth reiterating. To make sure that bots are directed to your desired pages and can access them without issue, you have a few options. Google’s robots.txt tester can help you identify any pages that are currently disallowed, and the URL Inspection tool in Google Search Console can assist in determining why certain pages may be blocked.

Duplicate content can confuse search bots and hurt your site’s indexability. Make sure the content on your site is unique rather than repeated across multiple pages. Where similar content on different pages is unavoidable, use canonical URLs to tell search engines which version is the preferred one to index, rather than leaving them to choose (or filter out) pages on their own.

It’s essential to audit your redirects and ensure they are properly set up. Broken URLs, redirect loops, or incorrect redirects can adversely affect your website’s indexability during crawling. Thus, it’s crucial to conduct regular audits of all your redirects to prevent such issues.
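The sketch below is a rough way to start such an audit: using the requests library and placeholder URLs, it flags redirect chains longer than one hop and catches loops, which requests surfaces as a TooManyRedirects error.

```python
# Sketch: flag redirect chains longer than one hop and catch redirect loops.
# Placeholder URLs; feed in your own crawl or sitemap list.
import requests

urls = ["http://example.com/old-page", "https://www.example.com/blog"]

for url in urls:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.exceptions.TooManyRedirects:
        print(f"{url}: redirect loop")
        continue
    hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
    if len(hops) > 1:
        print(f"{url}: {len(hops)}-hop chain -> {resp.url}")
    elif hops:
        print(f"{url}: single redirect -> {resp.url}")
    else:
        print(f"{url}: no redirect ({resp.status_code})")
```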

If your website is not optimized for mobile devices yet, you’re falling behind. Google began rolling out mobile-first indexing as early as 2016, and it is now the default, meaning Google predominantly uses the mobile version of your pages for indexing and ranking. To keep up with this crucial trend, you can use Google’s mobile-friendly test to assess where your website needs improvement.

HTTP errors can block search bots from important content on your site and degrade the user experience, so it’s crucial to fix them quickly and effectively. The list below outlines the most common HTTP status codes you’ll encounter, along with a brief explanation of each.

301 Permanent Redirects: Used to permanently send traffic from one URL to another. Too many of these can slow down your site and hurt user experience. Avoid redirect chains, as too many can cause search engines to give up crawling the page.
302 Temporary Redirect: Temporarily redirects traffic from one URL to another. Cached title tags, URLs, and descriptions will remain consistent with the origin URL, but if the redirect stays in place long enough, it will eventually be treated as a permanent redirect and those elements will pass to the destination URL.
403 Forbidden Messages: Means that the content a user has requested is restricted based on access permissions or due to a server misconfiguration.
404 Error Pages: Tells users that the page they have requested doesn’t exist. It’s a good idea to create custom 404 pages to keep visitors on your site.
405 Method Not Allowed: Means the server recognized the request method but the target resource does not allow it, so the request is rejected with an error.
500 Internal Server Error: A general error message that means your web server is experiencing issues delivering your site to the requesting party.
502 Bad Gateway Error: Related to miscommunication or invalid response between website servers.
503 Service Unavailable: Indicates that your server is up but temporarily unable to handle the request, often because of maintenance or overload.
504 Gateway Timeout: Means a server did not receive a timely response from your web server to access the requested information.
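A quick way to surface these errors across many pages is to request each URL and group the responses by status code, as in the sketch below (requests library, placeholder URLs; in practice the list would come from your sitemap or a crawl).

```python
# Sketch: group a list of URLs by the HTTP status code they return.
# Placeholder URLs; redirects are not followed so 3xx codes show up as-is.
from collections import defaultdict

import requests

urls = ["https://www.example.com/", "https://www.example.com/missing-page"]

by_status = defaultdict(list)
for url in urls:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    by_status[status].append(url)

for status in sorted(by_status):
    print(status, by_status[status])
```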

Addressing HTTP errors is important to keep both users and search engines satisfied and to keep them coming back to your site. Even if your site has been crawled and indexed, accessibility issues that block users and bots can still impact your SEO. After resolving HTTP errors, the next stage of your technical SEO audit should focus on renderability.

Before delving into this subject, it is essential to understand the distinction between SEO accessibility and web accessibility. Web accessibility focuses on making your web pages usable for people with disabilities or impairments, such as blindness or dyslexia. Many web accessibility practices overlap with SEO best practices, but an SEO accessibility audit does not cover everything needed to make your site friendly to disabled visitors.

In this section, we will concentrate on SEO accessibility, also known as rendering, but keep in mind the importance of web accessibility while creating and managing your website.

To ensure that your website is easily accessible, it’s important to perform a renderability audit of various elements on your site. Here is a checklist of the elements to review:

Server Performance

It’s crucial to keep an eye on your server’s performance to avoid HTTP errors that hinder users and bots from accessing your site. If you notice any issues, use the resources provided above to troubleshoot and resolve them promptly. Failure to do so can result in search engines removing your web page from their index, as it’s a poor experience to show a broken page to a user.

HTTP Status

Similar to server performance, HTTP errors can prevent access to your webpages. You can use a web crawler like Screaming Frog, Botify, or DeepCrawl to perform a comprehensive error audit of your site.

Load Time and Page Size

A delay in page load time can cause a server error that blocks bots from your webpages or has them crawl partially loaded versions that lack important sections of content. To prevent this, ensure your page load time is minimal. You should also keep your page size in check.

JavaScript Rendering

Google can process JavaScript (JS), but rendering it is resource-intensive and often deferred, which can delay or prevent the indexing of JS-dependent content; that’s why pre-rendered or server-side rendered content is generally recommended. There are also plenty of resources available to help you understand how search bots access JS on your site and how to address search-related issues.
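A crude but useful first check is to compare the raw HTML your server returns with what you see in the browser: if a key phrase from the rendered page is missing from the raw response, that content probably depends on client-side JavaScript. A sketch with a placeholder URL and phrase:

```python
# Crude check: is a key phrase present in the raw HTML, or does it only appear after JavaScript runs?
# The URL and phrase are placeholders; a full audit would use a headless browser or a rendering crawler.
import requests

url = "https://www.example.com/blog/how-to-groom-your-dog"
phrase = "grooming checklist"  # something you can see on the rendered page

raw_html = requests.get(url, timeout=10).text
if phrase.lower() in raw_html.lower():
    print("Phrase found in raw HTML: likely server-rendered.")
else:
    print("Phrase missing from raw HTML: likely injected by JavaScript; bots may see it late or not at all.")
```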

Orphan Pages

Every page on your site should have at least one internal link, preferably more, depending on how important the page is. Pages without internal links are called orphan pages. These pages lack context, making it hard for bots to understand how they should be indexed.

Page Depth

Page depth refers to how many layers down a page exists in your site structure. It’s best to keep your site architecture as shallow as possible while maintaining an intuitive hierarchy. You should prioritize a well-organized site over shallowness.

Regardless of how many layers there are, keep important pages such as product and contact pages no more than three clicks away from the homepage. A structure that buries your product page too deep in your site makes it hard for users and bots to find them.

Redirect Chains

When you redirect traffic from one page to another, you pay a price in terms of crawl efficiency. Redirects can slow down crawling, reduce page load time, and render your site inaccessible if not set up correctly. So, try to keep redirects to a minimum.

Once you’ve addressed these accessibility issues, you can turn to the next question: how your pages rank in the SERPs.

SEO Ranking Checklist

Moving on to rankability, let’s look at the elements that improve your search engine ranking from a technical SEO standpoint. Making your pages rank higher involves both on-page and off-page factors, but here we examine them through a technical lens.

All these factors work together to create an SEO-friendly site, so neglecting any of them would be a mistake. Let’s explore the key elements.

Internal and External Linking

Links are crucial for search engines to understand how to rank a page, and they provide context for where a page belongs in a search query. Links also guide users and search bots to related content and transfer page importance, improving crawling, indexing, and ranking capabilities.

Backlink Quality

Backlinks from other sites are crucial as they show that external websites view your page as high-quality and worth crawling. However, the quality of the backlinks matters significantly. Low-quality sites can actually hurt your rankings. To get quality backlinks, you can use techniques like outreach to relevant publications, claiming unlinked mentions, and providing helpful content that other sites want to link to.

Content Clusters

Content clusters link related content, allowing search bots to easily find, crawl, and index all of the pages you own on a specific topic. They act as a self-promotion tool to demonstrate your expertise on a subject and help search engines recognize your site as an authority on the topic.

Your search engine ranking is crucial in driving organic traffic growth because research shows that searchers are more likely to click on the top three search results on SERPs. To ensure your site gets clicked on, you need to focus on the final piece of the organic traffic pyramid: clickability.

To enhance your clickability on the search engine results pages (SERPs), there are certain technical aspects you can focus on. Click-through rate (CTR) is largely driven by searcher behavior, and optimizing meta descriptions and page titles with relevant keywords certainly helps, but in this section we will concentrate on the technical elements that improve clickability. Here are a few tactics to consider:

Implement structured data.
Win SERP features.
Optimize for Featured Snippets.
Explore Google Discover.

Since ranking and CTR are closely linked, it’s important to make your search result stand out on the SERP. This will make searchers more likely to click on your link. Follow these tips to enhance your clickability.

Structured data uses a specific vocabulary known as schema to classify and label elements on your webpage for search bots. The schema provides clear information about each element, how it relates to your site, and how to interpret it. By using structured data, search bots can easily identify the type of content on your webpage, such as videos, products, or recipes.

It’s important to note that using structured data may not directly impact clickability, but it can assist in organizing your content in a way that helps search bots comprehend, index, and potentially rank your pages.
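As a concrete illustration, the sketch below assembles schema.org Article markup as JSON-LD for a hypothetical blog post; you would embed the output in a <script type="application/ld+json"> tag in the page’s HTML and can validate it with Google’s Rich Results Test.

```python
# Sketch: build schema.org Article markup as JSON-LD for a hypothetical blog post.
# All details here are placeholders; embed the output in a <script type="application/ld+json"> tag.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Groom Your Dog",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2023-01-15",
    "image": ["https://www.bestdogcare.com/images/grooming.jpg"],
}

print(json.dumps(article, indent=2))
```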

SERP features, also known as rich results, can either benefit or harm your website’s clickability. If your website wins a SERP feature, it can increase the likelihood of clicks. However, if your website does not win a SERP feature, it will be pushed down the page under other elements such as sponsored ads, text answer boxes, and video carousels.

Rich results are elements that do not follow the traditional page title, URL, and meta description format of search results. Winning a SERP feature involves having structured data and useful content that can be easily understood by search bots.

Using structured data can increase your chances of earning rich results and other search gallery elements near the top of the SERPs, which in turn increases the probability of a click-through. Examples of rich results that can be earned using structured data include articles, videos, reviews, events, how-tos, FAQs, images, local business listings, products, and sitelinks.

The third strategy to enhance clickability is optimizing for Featured Snippets, which are special boxes located above the search results that provide a brief answer to the search query. Unlike structured data, Featured Snippets do not require schema markup.

The goal of Featured Snippets is to provide quick answers to search queries. According to Google, the best way to obtain a Featured Snippet is to provide the most accurate response to the searcher’s inquiry. However, HubSpot’s research has identified several other techniques for optimizing your content to appear in Featured Snippets.

Google Discover is Google’s personalized content feed for mobile users, which categorizes and surfaces content based on each user’s interests rather than an explicit query. As mobile searches account for over 50% of all searches, Google has been placing greater emphasis on improving the mobile experience. With Google Discover, users can select categories of interest, such as gardening, music, or politics, to build a content library.

At HubSpot, we believe that topic clustering can increase the likelihood of inclusion in Google Discover. Therefore, we are actively monitoring our Google Discover traffic in Google Search Console to evaluate the validity of this hypothesis. We recommend that you also research this feature and invest time in it. The payoff is a highly engaged user base that has personally selected your content.

To drive organic traffic, technical SEO, on-page SEO, and off-page SEO need to work in harmony. While on-page and off-page strategies are commonly prioritized, technical SEO is just as important for elevating your site to the top of the search rankings and delivering your content to the right audience. Implementing these technical tactics will strengthen your SEO approach and yield impressive results.