A Comprehensive Technical SEO Audit: The Ultimate Technical SEO Checklist

Technical SEO affects everyone’s online presence, although not everyone is fully aware of it. Dig a little deeper into SEO and it becomes evident that technical aspects are intertwined with every facet of it.
Our technical checklist for today includes tips, recommendations, and solutions for common SEO issues. Our aim is to provide a comprehensive guide to ensure that your website is user-friendly, visible in search engine result pages, functional, and easy to comprehend. We encourage you to gather all available information about your website and let us assist you in enhancing it.
I. Website Loading Speed
Time is a critical factor on the internet. Websites across the globe tend to load slowly, taking an average of 19 seconds on a 3G mobile connection, and research indicates that roughly 50% of users abandon a page that takes longer than 3 seconds to load.
Disclaimer & Warning: It is important to exercise caution when working with PHP, servers, databases, compression, minification, and other similar tools. If you are not proficient in these areas, making adjustments can severely damage your website. Always ensure that you have an adequate backup of your files and database before implementing any changes.
When it comes to website speed, several factors contribute to making your site efficient and user-friendly. A quicker loading time can lead to higher conversion rates and reduced bounce rates. To achieve this, we have compiled a list of essential speed optimization recommendations. You can quickly assess your website’s loading speed using Google’s PageSpeed Insights tool.
Over the years, the tool has undergone significant enhancements, including the addition of informative charts to help users comprehend the performance of large websites. An instance of such a chart is the Page Load Distribution.
The Page Load Distribution is based on two user-centric performance metrics: First Contentful Paint (FCP) and DOMContentLoaded (DCL). FCP marks the moment when the first piece of content appears on the screen during the browser’s rendering process. DCL, on the other hand, marks when the DOM is ready and no stylesheets are blocking JavaScript execution. Together, these two metrics let you see what share of page loads are fast, average, or slow, and which pages need improvement.
Speed and optimization indicators are another example, displaying the website’s status. In the image below, we can see the FCP and DCL scores. These metrics rely on data from the Chrome User Experience Report, showing that the page’s median FCP (1.8s) and DCL (1.6s) place it in the middle third of all pages. Even so, the page is considered poorly optimized because most of its resources are render-blocking.
Enhance Server Response Time
Server response time is the time it takes for the server to return the HTML needed to begin rendering a page. When a user accesses a page, the browser sends a request to the server, and the time that passes before the server starts delivering the information is the server response time.
There are various factors that can cause a website to have slow response time. Google identifies some of these factors: slow application logic, slow database queries, slow routing, frameworks, libraries, CPU or memory starvation, etc.
The server response time plays a critical role in the amount of time it takes for Googlebot to access data, which in turn can determine whether or not a visitor is converted. According to Google, server response time should be kept under 200ms.
To test and enhance the server response time, there are three steps you should follow:
First, collect data and examine why the server response time is sluggish (a quick command-line check is sketched after this list).
Second, identify and fix your top performance bottlenecks.
Finally, keep an eye out for any regressions.
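If you just want a quick, rough measurement of your server response time (time to first byte), curl’s built-in timing variables are enough; a minimal sketch, assuming curl is installed and https://www.example.com/ stands in for your own URL:

    # Print DNS, connect, time-to-first-byte and total times for a single request
    curl -o /dev/null -s -w "DNS: %{time_namelookup}s  Connect: %{time_connect}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" https://www.example.com/

Run it a few times and at different hours of the day, since a single request can easily be skewed by caching or momentary server load.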
The server itself is often the reason why a website loads slowly. Therefore, selecting a high-quality server from the start is critical. In theory, moving a site from one server to another appears simple, but it may come with a host of potential issues, such as file size restrictions and incorrect PHP versions.
Choosing the appropriate server can be difficult due to pricing. If you’re a multinational corporation, you’ll most likely need dedicated servers, which are costly. If you’re just starting a blog, shared hosting services will most likely suffice, which are usually less expensive.
That said, quality varies: some shared hosting services are excellent and some dedicated servers are poor. Don’t simply opt for the cheapest or most well-known option. For example, HostGator has excellent shared hosting services in the United States, but its VPS services are not as good.
Enhance Image Loading Time without Affecting Visual Quality
One of the key contributors to slow website loading is the presence of images, which can be quite large in terms of file size. This affects the speed of the server and results in longer load times for users. However, optimizing and reducing the size of images can help to improve server performance and reduce load times.
Images contain a lot of data that may not be necessary for displaying the image on a webpage. By removing this excess data, the size of the image can be significantly reduced without affecting its visual appearance. This results in fewer bytes being downloaded by the browser, which in turn speeds up content rendering.
The most commonly used image extensions are GIF, PNG, and JPEG, and there are many tools available for compressing images.
Below are some tips and recommendations to optimize website images:
Utilize PageSpeed Insights.
Use dedicated tools and plugins to compress images in bulk, such as tinypng.com, compressor.io, optimizilla.com, WP Smush, CW Image Optimizer, and SEO Friendly Images.
Opt for GIF and PNG formats as they are lossless, with PNG being the preferred format. PNG formats offer the best compression ratio and better visual quality.
Convert GIF to PNG format if the image isn’t an animation.
For GIF and PNG, remove transparency if all pixels are opaque.
Reduce JPEG quality to 85% to decrease file size without sacrificing visual quality.
Choose the progressive JPEG format for images larger than 10KB.
Use vector formats as they are resolution and scale independent.
Remove irrelevant image metadata such as camera information and settings.
Use the “Save for Web” option from dedicated editing programs.
In case you use WordPress, you may opt for an uncomplicated solution like the Smush Image Compression Plugin.
Update: Since 2019, Google PageSpeed Insights has been recommending next-gen image formats such as JPEG 2000 or WebP. Nevertheless, as not all browsers and devices support these formats adequately, it is still recommended to compress images conventionally, despite Google’s push for the newer formats.
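If you prefer the command line to a plugin, the JPEG recommendations above (85% quality, progressive encoding, stripped metadata) can be applied with ImageMagick; a minimal sketch, assuming ImageMagick is installed and photo.jpg / photo-optimized.jpg are placeholder file names:

    # Strip metadata, re-encode at 85% quality and save as a progressive JPEG
    convert photo.jpg -strip -quality 85 -interlace Plane photo-optimized.jpg

Always keep the original file, so you can re-export it if the compressed version ever looks worse than expected.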
You can determine the images occupying the most space on your website with CognitiveSEO’s Site Audit. Navigate to the Content section and select Images to view a list of images exceeding 500kb (note that for a photographer’s website, these images may be relatively small in size, yet it’s advisable to display the full HD version separately through a download link).
The only drawback of PageSpeed Insights is that it can assess only one page at a time.
At CognitiveSEO, we understand that many of you want to check PageSpeed Insights scores in bulk. That’s why our tool is designed to evaluate PageSpeed Insights scores across numerous pages at once.
Please take note that for larger websites, this procedure may consume a significant amount of time. It is recommended that you skip this process initially and commence the first analysis to obtain all data and begin addressing some of the issues, and later initiate the PageSpeed process. The evaluation may require up to 10 seconds per page, implying that if your website has 60,000 pages, it may take up to a week to complete.
To optimize your website’s speed, it is crucial to minimize render-blocking JavaScript and CSS and structure your HTML accordingly.
Upon conducting a speed test using Google’s PageSpeed Insights, you may encounter the message “Eliminate render-blocking JavaScript and CSS in above-the-fold content” if some blocked resources delay page rendering. The tool not only identifies these resources but also provides excellent technical SEO recommendations, including:
Removing render-blocking JavaScript;
Optimizing CSS delivery.
To remove render-blocking JavaScript, you may refer to Google’s guidelines and use any of the following three methods to reduce or eliminate blocking JavaScript (a brief markup sketch follows the list):
Inline JavaScript;
Asynchronous loading of JavaScript;
Deferred loading of JavaScript.
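In markup, the difference between the three comes down to how the script tag is written; a minimal sketch, where /js/app.js and /js/analytics.js are placeholder paths for your own files:

    <!-- Render-blocking: parsing pauses while this script is fetched and executed -->
    <script src="/js/app.js"></script>

    <!-- Asynchronous: downloads in parallel and runs as soon as it arrives -->
    <script async src="/js/analytics.js"></script>

    <!-- Deferred: downloads in parallel and runs only after the HTML has been parsed -->
    <script defer src="/js/app.js"></script>

Small, critical snippets can also be inlined directly into the HTML so that no extra request is needed at all.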
In case Google detects a page where blocking external stylesheets delay the time to first render, you must optimize CSS delivery. You have two alternatives (a markup sketch follows the list):
For small external CSS resources, you should inline a small CSS file and facilitate page rendering;
For larger CSS files, use Prioritize Visible Content to minimize the size of the above-the-fold content, inline the CSS needed for the first render, and defer loading the remaining styles.
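A common way to express this in markup is to inline the critical rules and load the full stylesheet without blocking rendering; a minimal sketch of that pattern, with /css/full.css as a placeholder path:

    <style>
      /* Critical above-the-fold rules inlined here */
      body { margin: 0; font-family: sans-serif; }
    </style>
    <!-- Load the full stylesheet without blocking the first render -->
    <link rel="preload" href="/css/full.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
    <noscript><link rel="stylesheet" href="/css/full.css"></noscript>

The preload/onload trick is a widely used pattern, but test it across browsers before relying on it, and keep the inlined portion small.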
PageSpeed also identifies files that can be slimmed down through minification. Depending on the situation, the tool will display a list of HTML, CSS, and JavaScript resources that require attention.
The minification process involves three steps, as explained by Ilya Grigorik, a web performance engineer at Google:
Resource optimization: Depending on the type of information you want to provide on your site, take an inventory of your files and keep only the ones that are relevant, so you avoid carrying irrelevant data. Once you have identified the relevant information, you can decide which content-specific optimizations to apply.
For instance, a photography website might need pictures that carry a lot of embedded information, such as camera settings, camera type, date, location, and author. That data is crucial for such a site, while it might be irrelevant for another one.
Data compression: After removing unnecessary resources, compress the ones the browser still needs to download. Reducing the size of that data helps the website render content faster.
Use Gzip compression for text-based data: Gzip works particularly well for HTML and CSS files, which contain a lot of repeated text and whitespace. It temporarily replaces similar strings within a text file to shrink the overall file size.
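If your site runs on Apache, Gzip can usually be switched on from the .htaccess file via mod_deflate; a minimal sketch, assuming the module is enabled on your server (nginx and other servers have their own equivalents):

    # Compress common text-based responses before sending them to the browser
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/plain text/css
        AddOutputFilterByType DEFLATE application/javascript application/json
    </IfModule>

If you are unsure whether your host supports this, ask their support team first, and remember the backup warning from the beginning of this guide.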
WordPress users have access to simpler solutions, such as the Autoptimize plugin which can fix render blocking scripts and CSS. To use this plugin, simply install it and access it through the Settings » Autoptimize tab to configure the settings. To fix render blocking scripts and CSS, select the JavaScript and CSS options and click on Save Changes.
W3 Total Cache is another tool available for WordPress users to fix render-blocking JavaScript, although it requires a bit more effort to set up. After installing the plugin, go to Performance » General Settings and locate the Minify section. Check the enable box for the Minify option, select Manual mode, save all settings, and then add the scripts and CSS files you want to minify.
However, keep in mind that PageSpeed Insights is only a guideline. For instance, Analytics and Tag Manager are flagged as JavaScript that blocks the loading of important content, yet they must be placed in the <head> section. This guide can assist you in optimizing the W3 Total Cache plugin.
Reduce the Number of Resources & HTTP Requests
When it comes to website speed, one of the initial steps you can take is to limit the number of resources used. Whenever a user accesses your website, a request is made to the server to fetch the necessary files. The larger the files, the longer it takes for the server to respond, causing a delay in the requested action.
Multiple rapid requests always slow down a server, for various reasons. It can be compared to copying a single large file on a hard disk versus copying a large number of small files of the same total size: the many small files typically take longer because the read/write head has to keep moving. SSDs have no moving heads, yet handling many files still requires more work than handling a single large one, and the same holds for HTTP requests.
You can assess your HTTP requests by opening an Incognito tab in Chrome (to ensure there are no cached requests), then right-clicking the page and selecting “Inspect” (at the bottom of the menu). Next, open the “Network” tab and press F5 to refresh the page. This starts recording the requests, and the total number of requests is displayed at the bottom.
To improve website speed, it’s important to limit the number of resources and HTTP requests. However, there’s no specific number to follow as it varies depending on the page size. For large pages, it’s acceptable to have more requests, but it’s advisable to paginate them.
To reduce the overall download size, it’s essential to delete unnecessary resources and compress the remaining resources. Combining CSS and JS files into a single file can also help in minimizing the number of requests made to the server. Autoptimize and W3 Total Cache plugins mentioned above can be used to combine files.
But be cautious, as this option may cause your website to display incorrectly. It’s recommended to have a proper backup of your files and database before making any changes.
Implement a Browser Cache Policy
The browser cache feature automatically saves website resources on a user’s computer during their initial visit. When they return to the site, those resources can help speed up their browsing experience. This caching strategy is especially useful for returning visitors.
In case a page is unavailable at a certain moment, users can view the cached version directly from the search engine results page.
To significantly enhance the page speed load, the most effective approach is to optimize the browser cache by configuring it to suit your requirements.
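On Apache, browser caching is usually configured from .htaccess via mod_expires; a minimal sketch, assuming the module is enabled (the lifetimes are illustrative, so tune them to how often your assets change):

    # Tell browsers how long they may reuse cached copies of static assets
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

If you are on WordPress, the caching plugins below handle this for you, so edit .htaccess by hand only if you are comfortable doing so.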
Many of the plugins that handle Minification, Compression, and Combination also act as cache plugins. Therefore, you can select any caching plugin that suits your needs, such as W3 Total Cache. However, it is preferable to combine W3 Total Cache’s caching with Autoptimize’s compression and combination features.
Moreover, utilizing a cache can make it more difficult to detect changes. To observe any modifications made to your website, you can open an Incognito tab, and periodically reset the cache from the plugin settings.
To improve the speed of your website, it’s important to reduce the number of redirects and eliminate redirect loops. Redirects can be useful for preserving link equity and fixing broken pages, but having too many can significantly slow down your site. Each redirect adds extra time for the user to reach the landing page, so it’s best to limit their use to necessary instances.
It’s also important to note that chaining multiple redirects for a single page slows things down further and, at worst, can create a redirect loop, which confuses the browser and results in a poor user experience. To avoid this, make sure each redirect points directly to the final destination URL.
To prevent losing users when they encounter a 404 error page, it is important to customize the page and provide guidance to the user. You can design a user-friendly page and redirect the user back to your homepage or to other relevant and related content.
To identify broken pages on your website, you can utilize the Google Search Console. Simply navigate to Crawl » Crawl Errors and click on “Not found” to view any errors.
A comparable function is provided by Site Explorer, which identifies the link equity that you are forfeiting (counting the number of referring domains and links for every broken page).
Another option is to utilize the Technical SEO Site Audit Tool to evaluate all of your website’s redirects. Once you have configured the campaign and the tool has completed crawling and examining your site, go to Architecture > Redirects to view the results.
To improve page speed, it’s important to avoid cluttering your site with unnecessary content such as images, plugins, and functions that aren’t being used. This is a common problem that arises over time, especially for WordPress users who may experiment with various plugins before realizing they don’t actually need them. While it’s possible to disable and uninstall these plugins, the uninstallation process in WordPress can often leave behind traces in the database, which can cause the site to slow down.
Another type of plugin commonly used by webmasters is the slider. While sliders used to be popular, recent testing has shown that they can negatively impact conversions.
In addition, sliders often load unnecessary elements onto your site. For example, the Javascript file associated with the slider may load on all pages, even though the slider is only used on the homepage.
Furthermore, if you have multiple slides with large images on your homepage, your site may become much slower due to the size of these images. Unfortunately, most visitors may not even view all of the slides if they auto-slide.
To avoid this, it’s recommended to have a development environment where you can test out different plugins until you find the ones that are essential for your site. Once you have determined which plugins to use, create a plan for implementation on the live site.
Finally, after implementation, reset the development environment by deleting it and copying the updated live version over it. This keeps the two in sync and helps ensure that your live site is never bogged down with leftover, unnecessary elements.
II. Website Functionality and Usability
Once you have optimized your website for speed, the next step is to enhance your visibility in search engines. While there are numerous factors that contribute to this, the following are some of the most important ones and common mistakes made by webmasters.
Ensuring that your website is optimized for mobile devices is crucial, given that more than 50% of all users globally browse the internet on mobile. Google has moved to mobile-first indexing, so it’s important to make sure your website is mobile-friendly in terms of design, speed, and functionality. It’s generally better to have a responsive design rather than a separate mobile version, which requires extra steps to implement correctly using the rel=alternate tag.
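For reference, the two setups look roughly like this in the page’s <head> (the URLs are placeholders): a responsive site only needs a viewport declaration, while a separate mobile version needs rel=alternate on the desktop page and rel=canonical on the mobile page.

    <!-- Responsive design: one URL for all devices -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- Separate mobile version: on the desktop page https://www.example.com/page -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

    <!-- ...and on the mobile page https://m.example.com/page -->
    <link rel="canonical" href="https://www.example.com/page">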
You can test your website’s mobile-friendliness by using Google’s Mobile-Friendly Test Page.
Optimize Your URLs for Search Engines
Having descriptive and keyword-rich URLs is important for both users and search engines. It is recommended to get them right the first time, as changing them later could negatively impact your site’s performance, user experience, and search engine visibility.
Unfortunately, many webmasters still create dynamic URLs that are not optimized. While search engines can still crawl and rank them, it’s best to avoid dynamic URLs as they can cause issues in the long run. To ensure search engine friendly URLs, it’s recommended to use static URLs that accurately describe the content of the page.
The importance of having easy-to-follow URLs has been emphasized several times before. Query parameters in URLs should be avoided where possible, as they are harder to track in Analytics and Search Console. In addition, URLs full of query parameters can make link building difficult and can cause you to miss out on linking opportunities because of their unattractive appearance.
For WordPress users, customizing the permalink structure is possible. Refer to the image below to explore the available options for your URL structure.
Creating user-friendly URLs is a simple process that involves following these three tips (an example follows the list):
Use dashes (-) instead of underscores (_)
Keep it short
Include the focus keyword
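To illustrate (the domain and paths are placeholders), the difference between a dynamic URL and a friendly one looks like this:

    Dynamic:  https://www.example.com/index.php?id=123&cat=7&session=A1B2C3
    Friendly: https://www.example.com/technical-seo-checklist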
When you prioritize user experience by creating URLs that are easy to read and include relevant keywords, you can establish sustainable links and build trust with your audience. As David Farkas notes, it’s important to consider the user’s perspective when building links.
To check whether your URLs are user-friendly, you can use the CognitiveSEO Site Audit. Once you’ve set up your campaign, navigate to Architecture > URLs.
Afterward, a comprehensive list of URLs that do not include any keywords can be viewed. In addition to this, other issues can be identified using this feature. For instance, the screenshot below (client identity has been protected by blurring the URLs) shows that there is an issue with hreflang. The titles and content for some of the secondary languages were created in the primary language instead of providing accurate content in the secondary language.
This indicates that the URLs themselves were acceptable, but the check flagged them because the content behind them was produced in the wrong language.
Having the HTTPS (Hypertext Transfer Protocol Secure) protocol is crucial for data security, as it encrypts the data and keeps it safe from man-in-the-middle attacks. In fact, Google has included HTTPS in its list of ranking factors since August 6, 2014, and strongly recommends that sites switch from HTTP to HTTPS. By implementing HTTPS, not only may you see a boost in rankings, but you’ll also enjoy other benefits, such as referrer details being preserved in Google Analytics instead of being lumped under the “Direct” traffic source, and increased user trust in the website’s safety. You can easily tell whether a website uses HTTPS by looking for a lock symbol before the URL in the address bar.
If your website is not using the HTTPS protocol, an information icon will be displayed instead of a lock icon in the navigation bar. If you click on the information icon, a message will appear indicating that the connection is not secure and that the website may not be safe to use.
Although moving from HTTP to HTTPS is highly recommended, it is important to take precautions when doing so to avoid losing important data. Some users have reported losing all of their shares after migrating to HTTPS, and we experienced the same issue. Therefore, we have created a guideline to help you recover Facebook (and Google+) shares after an HTTPS migration:
Determine the number of Facebook shares you have for each URL.
Set the shares for both HTTP and HTTPS to zero.
Update rel=”canonical.”
Identify Facebook’s crawler.
It is important to note that URL issues can occur when performing mass redirects, so it is advisable to have your URLs well set up from the beginning. If you need to migrate your site from HTTP to HTTPS, you can refer to this HTTP to HTTPS migration guide for assistance.
Make sure that all versions of your website are pointing to the correct, preferred version, and that people are automatically redirected to it when accessing any other version. These include HTTP and HTTPS, with and without the “www” subdomain. Identify your preferred version, such as https://www.site.com, and set up 301 redirects for all other versions to it. To check if everything is set up correctly, use the SEO Audit Tool and go to Indexability > Preferred Domain. If you don’t see the “Everything is OK” message, there may be an issue with your redirects.
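On Apache, this kind of host-wide redirect is typically handled in .htaccess with mod_rewrite; a minimal sketch, assuming mod_rewrite is available and https://www.example.com is a placeholder for your preferred version (behind some proxies or CDNs the HTTPS detection differs, so test carefully):

    RewriteEngine On
    # Send any non-HTTPS or non-www request to the preferred https://www version
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]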
Properly setting up 301 redirects is critical when migrating a website or switching from HTTP to HTTPS to maintain link equity and avoid broken pages. Here are some recommendations to ensure correct redirection:
Use 301 redirect code to redirect old URLs to new URLs
Avoid redirection loops and remove invalid characters in URLs
Verify the preferred version of your new domain (www vs. non-www)
Submit a change of address in Search Console
These recommendations overlap with the previous steps we’ve covered, but it’s essential to ensure that all necessary redirects are set up correctly.
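For individual URLs whose paths changed during the migration, a per-URL 301 can also be declared in .htaccess; a minimal sketch, with placeholder paths:

    # Permanently redirect an old path to its new location
    Redirect 301 /old-page/ https://www.example.com/new-page/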
Submit the updated sitemap to Google Search Console to ensure all pages are crawled and indexed correctly. Additionally, check for any broken links and resources on the website so the user experience is not negatively impacted.
Ensure Your Resources Are Accessible for Crawling
Having resources that cannot be crawled is a major technical issue in search engine optimization. Crawling is the initial step that happens before indexing, which ultimately delivers your content to the users. In essence, Googlebot crawls the data and sends it to the indexer, which then renders the page. If you’re fortunate, you’ll see the page rank in SERP.
Ensuring that users see the same content that Googlebot does is crucial for successful search engine optimization. If CSS files are blocked from indexing, the pages may not be visible to Googlebot in the same way as they appear to users. Similarly, if JavaScript is not crawlable, the situation becomes more complex, especially if the website heavily relies on AJAX. To address this, codes need to be written for the server to provide an accurate version of the site to Google.
It is important to note that, as long as Googlebot is not blocked from crawling JavaScript or CSS files, Google can render and understand web pages much like modern browsers do. Google recommends using Fetch as Google (replaced by the URL Inspection tool in the new Search Console) to check how Googlebot renders your JavaScript.
As of 2019, a new version of Google Search Console was launched with fewer features than the old version. While the old version is still accessible, it may be phased out in the future.
Ensuring proper crawlability of your website is closely tied to the robots.txt file. By testing the robots.txt file, you can guide Googlebot on which pages to crawl and which ones to avoid, ultimately granting Google access to your content.
To view your robots.txt file online, simply open http://domainname.com/robots.txt in your browser. It is essential to verify that the directives in the file appear in the correct order.
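For reference, a minimal robots.txt sketch for a WordPress-style site (the paths and sitemap URL are placeholders, so adjust them to your own setup):

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://www.example.com/sitemap.xml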
To create or modify your site’s robots.txt file, you can utilize the Search Console’s robots.txt Tester tool. This tool is simple to navigate and will inform you if your robots.txt file prohibits Google web crawlers from accessing certain URLs. It’s recommended to ensure there are no errors in the file for optimal results.
Errors can occur when Googlebot is unable to crawl a specific URL due to restrictions in the robots.txt file. While there can be multiple reasons for this, Google only identifies a few of them.
For example, when Googlebot is prevented from accessing your website, it could be due to:
DNS issues preventing communication with the DNS server;
Misconfigured firewall or DoS protection system;
Intentional blocking of Googlebot from reaching the website.
Once you have identified the blocked resources using the Tester tool, you can test them again to ensure that your website is functioning properly. To verify the site’s crawlability on a larger scale, you can use the CognitiveSEO Audit Tool. Navigate to Indexability > Indexable Pages and search for Disallowed in Robots.txt links. By clicking on the red line, you can see a list of URLs that have been disallowed.
Ensuring that your website’s content is indexed is a crucial aspect of SEO, as it can impact your website’s search engine ranking and visibility. In an article on AudienceBloom, James Parsons, a content marketing and SEO expert, highlights the importance of the indexing phase for websites.
You can check the status of your website’s indexed pages using the Search Console, which provides valuable information. Simply navigate to the Google Index section and click on Index Status to view a chart that displays the status of your website’s indexed pages.
The ideal scenario is that the total number of indexed pages matches the total number of pages on your website, except for the pages that are not meant to be indexed. Make sure to verify if you have set up the proper noindex tags. If there is a significant difference, review them and check for any blocked resources. If everything appears to be fine, check whether some of the pages were not crawled, hence not indexed.
If you didn’t notice anything unusual, test your robots.txt file, and verify your sitemap by following steps 9 and 10.
Additionally, you can utilize the Site Audit tool to identify the URLs that have been labeled with the No-Index tag. These URLs are listed in the same section as the URLs blocked by Robots.txt (Indexability > Indexable Pages).
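As a reminder, a noindex directive is nothing more than a meta tag placed in the <head> of the page you want kept out of the index; this is standard markup rather than anything tool-specific:

    <meta name="robots" content="noindex, follow">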
To avoid having an outdated XML Sitemap, it’s important to review and update it regularly.
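For reference, an XML sitemap follows a simple, standardized structure; a minimal sketch with a placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/technical-seo-checklist</loc>
        <lastmod>2019-06-01</lastmod>
      </url>
    </urlset>

Most CMS plugins (Yoast SEO, for example) can generate and update this file for you automatically.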