The Ultimate Guide to Technical SEO in 2024

Have you ever performed a technical SEO audit for your website?

As an SEO consultant, I always begin with a technical SEO audit for my clients.

You want to get your website ready for On-Page Optimization and Off-Page Optimization.

In this article, you will learn:

  • Every element in a Technical SEO audit
  • Explanation of the importance of these items and how to check them

I will show you how to check for issues with technical SEO elements only; fully fixing the errors is beyond the scope of this article.

We will cover the following important audits in this blog post:

  • Accessibility Audit
  • Site Architecture Audit
  • URL Audit
  • Mobile Compatibility Audit

So, let’s get started.

The Tools Required to Perform Technical SEO Audit

You can perform the technical audit with free tools, but it will take more time. With paid tools, the process is relatively faster.

Best Free Tools 

Premium Paid Tools

Accessibility Audit

Is your website accessible to both search engines and visitors? The accessibility audit will help you check whether your site is accessible to search engine spiders and users.

Indexation Check

Introduction to Indexing Check

You need to check the number of pages of your website indexed by search engines. Make sure to check on the top search engines: Google, Bing, and Yandex.

Importance of Indexing Check

Search engine algorithms love sites rich with content. Yet the majority of business websites consist of only 3-5 pages. Do you think that's sufficient to rank on search engines?

When you publish content consistently, search engines will be aware of your site's existence. Search engine bots will crawl your site regularly to find fresh content.

There are a few reasons for your site to have a low indexing rate:

  • Complicated site structure
  • Render blocking code
  • and many more

How To Check Your Site’s Indexed Pages?

Go to Google, Bing, or Yandex and type the following in the search box:
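The standard way to do this is the site: search operator, which restricts results to a single domain (yourdomain.com is a placeholder for your own domain):

```text
site:yourdomain.com
```

The number of results returned is a rough estimate of how many pages that search engine has indexed for the domain.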

You will see something like the following screenshots:

Google Search Engine Indexed Pages

Yandex Search Engine Indexed Pages

Bing Search Engine Indexed Pages

You can also check the number of pages indexed by Google with Google Search Console. If you haven’t set up Google Search Console, do it now with this easy guide.

If you have already submitted your sitemap, go to Google Search Console's dashboard and click on Sitemaps. Under the Submitted Sitemaps section, you can see Discovered URLs. Click on the graph symbol.

You can view all the valid pages, 404 pages, URLs marked with noindex, URLs blocked by robots.txt, and more.

Hosting Server Uptime

Introduction to Hosting Server Uptime

Uptime is the ability of your hosting to keep your website online without interruption. When downtime happens, your site goes completely offline.

You don't want that to happen, right?

When you purchase hosting services to host your website, you can always check the provider's advertised uptime. For example, SiteGround offers a 99.99% uptime guarantee.

Importance of Hosting Server Uptime

The stability of your website is extremely important for search engine indexing. Inconsistent server uptime will negatively affect your indexing and search rankings. Visitors will also be turned off when your site experiences downtime.

How to check your hosting server’s uptime?

If your website receives hundreds of thousands or millions of visits per month, monitoring your hosting server's uptime is essential. A high volume of traffic at a specific moment can bring your hosting down.

Pingdom is one of the best uptime monitoring tools in the market. Their monitoring service starts at $14.95 per month.

If you're frequently encountering downtime with your hosting server, it's better to move to VPS or dedicated hosting.


Robots.txt

Introduction to Robots.txt

Robots.txt is a text file, automatically generated in most cases, that provides instructions to web bots, especially search engine bots, on how to crawl a website.

Importance of Robots.txt

With robots.txt you can easily set instructions for bots accessing your website. Its crawl directives tell search engine spiders which parts of your site they may and may not crawl. Note that robots.txt controls crawling; to tell Google which pages should or should not be indexed, use meta robots tags instead.
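As an illustration, a minimal robots.txt might look like this (the paths are hypothetical examples):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/sitemap.xml
```

The Disallow and Allow lines control what crawlers may fetch, and the Sitemap line points bots to your XML sitemap.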

How To Check Your Site’s Robots.txt?

The text file is available and accessible for every website. Just type your domain followed by /robots.txt in your browser's address bar:

If you're already using Google Search Console, you can access your site's robots.txt from the search console's dashboard. At the time of writing it is not available in the new interface yet, but you can access it in the older version.

Click on "older version" and then navigate to Crawl. You will see a robots.txt Tester; click on it.

Meta Robots Tag

Introduction to Meta Robots Tag

The meta robots tag is an HTML tag that tells search engines what to index or not index, and which links to follow or not follow. It's a simple piece of code placed in the head section of your web pages.

These are the main meta robots directives:

Index – a command code for the search engine crawlers to index the specific page.
NoIndex – a command code for the search engine crawler NOT to index that specific page.
Follow – a command code for the search engine crawler to follow all the links available on a specific page.
NoFollow – a command code for the search engine crawler NOT to follow any links on that specific page.

There are more Meta Robots tag commands but these are the most important ones for SEO purposes.

This is the most important tag that you should remember:

<meta name="robots" content="nofollow">

*Note – When you apply the "nofollow" directive in the meta tag, all the links on that page become nofollow. This is unlike the rel="nofollow" attribute, which you apply to a single link to stop search engine crawlers from following it.
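To make the difference concrete, here is a page-level meta tag next to a per-link attribute (the URL is a placeholder):

```html
<!-- Page level: crawlers should follow none of the links on this page -->
<meta name="robots" content="nofollow">

<!-- Link level: crawlers should not follow this one link only -->
<a href="https://example.com/some-page" rel="nofollow">Example link</a>
```

Use the meta tag when the whole page's links should be ignored, and the rel attribute when you only want to withhold endorsement from a single outbound link.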

The Importance of Meta Robots Tag

Your site might have confidential information that you don't want every visitor to see. The meta robots tag helps reduce the chances of people finding that information through search. It's difficult to make a page completely invisible, though, as anyone with the link to that page can still view it.

When you have duplicate pages or pages with similar content, you definitely don't want all of them indexed and sending a duplicate content signal to search engines. You would want to use the "noindex" meta robots directive here.

Say you've developed powerful evergreen content with external links to other resources. If those links are followed (the default), you're passing authority to the sites you link to.

As search engine spiders crawl sites through links, you don't want your top pages to be affected. Using "nofollow" attributes will minimize the link equity passed to other sites.

How To Check Your Site’s Meta Robots Tag?

Download the free version of Screaming Frog if your site has fewer than 500 pages. If you have the paid version, it is the best tool for the job.

Insert your URL into the crawling tool and let it crawl your site to 100%. Once the crawl is finished, click on the Directives tab at the top.

You will see many columns, but the important one for checking the meta robots tag is "Meta Robots 1". It shows the directives for each crawled page of your website.

XML Sitemap

Introduction to XML Sitemap

The XML sitemap contains a list of the posts, pages, images, and even videos on your website for easy crawling by search engine bots. Search engines use the sitemap to crawl and understand every element of your site.

The Importance of XML Sitemap

If you've used Google Search Console, you might have come across the Sitemaps tab, which allows you to submit your website's sitemap. Although sitemaps submitted in the search console do not guarantee that all pages will be indexed, they still help speed up the indexing process.

While there is no evidence that a sitemap provides direct ranking benefits, it is a good way to inform search engines about the pages, posts, and visual elements published on your site.

How To Check Your Site’s XML Sitemap?

You can type your domain followed by /sitemap.xml to check whether your site has an XML sitemap.

If there is no sitemap for your site, you can generate one using the following two methods:

  • If you’re using WordPress CMS, you can install plugins like Yoast SEO to generate an XML sitemap for your site.
  • For other platforms, you can use this XML sitemap generator.

Once you’ve generated the sitemaps, make sure to submit them to Google Search Console and Bing Webmaster Tools.
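For reference, a minimal XML sitemap has this shape (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/sample-page/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Each url entry lists one page; generators like Yoast SEO produce this file for you automatically.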

XML Sitemap for Images and Videos

Introduction to XML for Images and Videos

Image and video XML sitemaps help search engines receive information about the images and videos published on a website. Submitting separate sitemaps for images and videos helps search engine crawlers identify those elements and index them in image and video search results.

Importance of XML for Images and Videos

Providing search engine bots with specific information about your images and videos increases their index rate. Having visually rich elements indexed in search results increases organic traffic to the website.

How to Check Your Site’s XML Sitemap for Images and Videos

Image and video sitemaps are most probably not generated automatically for your website. You can check for them by entering the following:

It's very hard to find any free tool to generate image and video sitemaps. The following is a paid tool that costs $39.99; you pay only once and receive free updates for life.

Unlimited Sitemap Generator

Once you've generated the sitemaps, submit them to Google Search Console and Bing Webmaster Tools.

HTTP Status Code

Introduction to HTTP Status Code

HTTP status codes are essential for technical SEO. They indicate how healthy your website is when search engine bots crawl it, and they represent the communication between the browser and the server.

To understand the codes, you need to understand how the browser retrieves your website from the server. When a visitor types your URL or clicks through from search results, the browser sends a request to the site's IP address to obtain the web page.

The server where the website is stored will respond with a status code by providing the result of the request to the browser. When there are no issues or errors, HTTP 200 code will be sent to the browser along with the requested content of the website.

These are the main HTTP status codes you should be aware of:

  • HTTP 200 – This is the code used by the server when responding to the browser if there are no problems with the requested website or website content.
  • HTTP 301 – This code is used when you're reworking your site's architecture or making changes to permalinks. A 301 redirect points the old URL to the new one. SEOs use this method to avoid a drop in the link value of the old URL. If a 301 redirect is not used, a 404 (page not found) error will appear, and you don't want search engine bots and visitors to see error pages.
  • HTTP 302 – Unlike 301, this status code provides a temporary redirect to a different URL. By doing this the link value will not be transferred to the new destination.
  • HTTP 403 – This code is usually used for sensitive content which can only be viewed by people with the right credentials. When the user does not have the credentials to view the content, it will show Forbidden.
  • HTTP 404 – One of the most important status codes for SEOs. When content is deleted from your website, the server responds to the browser with a 404 error: page not found. Visitors will be turned off if they frequently hit 404 errors on your website.
  • HTTP 451 – It’s a new addition. This status code will notify the search engines that the requested content has been deleted due to legal reasons. When you receive legal orders to remove a particular page, using this code will provide the right explanation to the search engines
  • HTTP 500 – One of the error codes you never want to see often. It indicates the server failed for an unspecified reason when the browser requested your website. Common causes include faulty or pirated plugins and themes, server misconfiguration, and resource exhaustion. You can check your hosting logs to find the reason for the error.
  • HTTP 503 – A 503 error occurs when the server is unable to handle the browser's request, typically due to hosting downtime. Sometimes the website owner will ask the hosting provider to serve a 503 during site maintenance. It tells search engine bots that the site is down temporarily and that they should crawl again once it is back up.
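The groupings above can be sketched in code. This is a small illustrative Python helper (not from any SEO tool) that buckets status codes the way an audit report would; the crawl results are hard-coded placeholders:

```python
# Classify HTTP status codes into the buckets an SEO audit cares about.
def classify_status(code: int) -> str:
    if 200 <= code < 300:
        return "success"
    if code in (301, 308):
        return "permanent redirect"
    if code in (302, 303, 307):
        return "temporary redirect"
    if 400 <= code < 500:
        return "client error (check for broken links)"
    if 500 <= code < 600:
        return "server error (check hosting/logs)"
    return "informational/other"

# Example: codes you might collect from a crawl (placeholder data)
crawl_results = {"/": 200, "/old-page": 301, "/missing": 404, "/api": 503}
for path, code in crawl_results.items():
    print(path, code, classify_status(code))
```

In a real audit, the codes would come from a crawler such as Screaming Frog rather than a hard-coded dictionary.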

Importance of HTTP Status Code

HTTP status codes are an important aspect of technical SEO that is often overlooked. Understanding them allows you to handle crawl errors and rectify issues, avoiding ranking problems and low index rates.

When search engine bots come across a lot of 404 pages, the links pointing to those pages (broken links) lose their value. Frequent broken links also signal to search engines that the site is poorly maintained.

How To Check Your Site For Errors in HTTP Status Code

You can use two tools to check status code errors on your website:

  • Screaming Frog
  • Google Search Console

If you're using the Screaming Frog tool, just insert your website URL and wait for it to completely crawl your site. Once the crawl is done, click on the "Response Codes" tab. You will see the status code returned for each of your pages.

If you want to check the errors with Google Search Console (New Version), click on the Coverage link below the Index section. The page will show you all the errors your web pages are having.


Pagination

Introduction to Pagination

According to Moz, pagination is the practice of segmenting links to content on multiple pages that affects two critical elements of search engine accessibility.

Those two elements are crawl depth and duplicate content.

Crawl Depth – This is how deeply search engines index pages within a website. A site with a high crawl depth will get more pages indexed in the search engine result pages than one with a low crawl depth.

Duplicate pages – Duplicate pages are created when pagination is not implemented systematically. When multiple versions of a particular piece of content are published across your site, search engines get confused about the right URL to show in the search results.

Importance of Pagination

When pagination is done right, you can avoid duplicate content issues and increase indexing rates.

Most site owners and SEOs rely on canonical tags and "rel=prev / rel=next" tags. Little do they know that these tags are hints rather than directives; Google has even confirmed that it no longer uses rel=prev/next as an indexing signal.
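As an illustration, current best practice is to give each paginated page a self-referencing canonical (the URL below is a placeholder):

```html
<!-- On page 2 of a paginated category, the canonical points at page 2 itself,
     not at page 1, so each paginated URL can be indexed in its own right -->
<link rel="canonical" href="https://yourdomain.com/category/laptops/page/2/">
```

Pointing every paginated page's canonical at page 1 is a common mistake that can hide deeper pages from the index.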

How to Check Your Site for Pagination Error

The pagination attributes should be used when you have a large website with a lot of pages or eCommerce sites with hundreds or thousands of products.

You can use the Screaming Frog tool to identify errors and find unlinked pagination URLs.

The following tutorial will help you in auditing your site’s pagination attribute:

How to Audit Pagination


Subdomains

Introduction to Subdomains

Subdomains are used by businesses to present a distinct part of their website without combining everything under the main domain. You can create as many subdomains as you want.

Some examples of subdomains: blog.yourdomain.com, shop.yourdomain.com, and support.yourdomain.com.

Subdomains are separate assets that work on their own without relying on the main domain.

Importance of Subdomains

Site owners sometimes ignore subdomains when implementing SEO strategies. As mentioned earlier, subdomains are separate entities and should be treated as websites in their own right.

If a subdomain is used as a testing platform, it is important to block it in robots.txt or add noindex meta tags to its pages, so unwanted content stays out of the index. Note that noindex is a meta tag, not a robots.txt directive; Google no longer honours noindex rules placed inside robots.txt.

If your blog or eCommerce store lives on a subdomain, make sure robots.txt allows crawling and that the pages do not carry noindex tags, so they get indexed.
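For a staging subdomain you never want crawled, the robots.txt served on that subdomain could simply block everything (shown here as an illustration):

```text
# robots.txt on staging.yourdomain.com - block all crawlers
User-agent: *
Disallow: /
```

Each subdomain serves its own robots.txt, so this file does not affect crawling of the main domain.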

How To Check Your Site’s Subdomains 

You can use Pentest Tools to enumerate your site's subdomains. As the site owner, you should normally be aware of all your subdomains anyway, unless the site is fully managed by a team of developers.

Site Architecture Audit

Your website's structure is extremely important for search engines to crawl it properly and rank it on search result pages. In this section, I will cover the important site architecture factors you should keep an eye on.


Breadcrumbs

Introduction to Breadcrumbs

Breadcrumbs help both users and search engines to understand your site structure.

According to Yoast, the term comes from the story of Hansel and Gretel: when they went into the woods, Hansel dropped breadcrumbs along the path so they could find their way back home if they got lost. This breadcrumb model is what we see as mini navigation on websites, especially eCommerce sites.

Breadcrumbs make it easy for visitors to understand where they are on your site.

Importance of Using Breadcrumbs

There are a few reasons why breadcrumbs are important for your site.

  • Search engines like Google love breadcrumbs. They help search engine bots understand your site structure, and Google can show that structure directly in the search results.
  • They improve UX (user experience). When there are a lot of pages on your website, visitors easily lose track. Breadcrumbs let visitors see exactly where they are on the site and provide better navigation, which is a hallmark of good user experience.
  • They reduce bounce rates. With different pages ranking on search engines, visitors can enter your site from any page, not just your home page. When visitors land on a specific page, you need to guide them to the rest of your site. Breadcrumbs help these visitors navigate to other pages without exiting your site.

How To Check Your Site’s Breadcrumb

Keep in mind that not every site needs breadcrumbs. Only sites with a large amount of content or many products should have them in place.

If you’re using WordPress CMS, Yoast SEO Plugin can help you implement breadcrumbs to your site.

If you’re using any other platform, you can check out the following guide on coding breadcrumb navigation menu for your website.

Coding Graceful Breadcrumb Navigation Menu in CSS3
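Breadcrumbs can also be exposed to search engines as structured data. A minimal schema.org BreadcrumbList in JSON-LD looks like this (the names and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://yourdomain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Laptops",
      "item": "https://yourdomain.com/laptops/" }
  ]
}
</script>
```

This markup is what allows Google to display the breadcrumb trail in place of a raw URL in search results.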

This is what breadcrumbs look like on eCommerce sites.

Primary Navigation Menu

Introduction to Primary Navigation Menu

The main menu, also known as Top Level Navigation (TLN), is the primary navigation on a website. It helps both users and search engine bots understand the site better.

Importance of Primary Navigation Menu

The primary navigation menu helps users easily reach the important pages on your website, and using it well produces a positive user experience. For search engines, your important pages should be included in the primary navigation menu; links in drop-down menus count as well.

How To Check Your Site’s Primary Navigation Menu

Link all your important pages in the primary navigation menu, and give each menu item an SEO-optimized title. The menu should also minimize the number of clicks users need to reach their target pages or content.

At the same time, make sure the menu is not overloaded with links to every page, as that looks cluttered and turns visitors away.

Footer Menu

Introduction to Footer Menu

The footer menu is the navigational menu that appears at the bottom of a website. It allows website owners to include links to pages that don't fit in the main navigation menu at the top.

Importance of Footer Menu

The footer menu helps human visitors and search engines discover other important pages of your website.

I like to use the footer menu for important pages like resources, privacy policy, terms and conditions, FAQs, and other pages that show the quality of the site.

Some websites are one-page sites with parallax scrolling. The problem with one-page sites is that visitors may not be comfortable scrolling back up to find other content. A footer menu with links to important pages lets them navigate easily and stay longer on the site.

What Type of Pages You Should Add To Your Site’s Footer

The most common elements in the footer are the copyright symbol and the year. You can add other important pages and elements as well:

  • HTML Sitemap
  • Terms of Use
  • Privacy Policy
  • Social Icons
  • Awards and Certification
  • Latest articles
  • Upcoming events

URL Audit

URLs play an important role in creating a first impression on visitors and search engine bots. They also help search engine bots identify what a particular page is about.

A good URL structure gives your site a systematic hierarchy, passes equity throughout the site, and directs visitors to the right web pages.

URL Structure

Introduction to URL Structure

As you all know, URL stands for Uniform Resource Locator; it directs you to a specific web page on the Internet.

Your website's URLs always start with the root domain (for example, https://yourdomain.com). The individual web page URLs will be defined depending on how you want them to appear.

Importance of URL Structure

Search Engines focus on delivering the best results for the searchers. The URL structure is important for search engine bots to decide what a particular web page is about.

From the user's perspective, it is much easier to understand which page of the site they are on by looking at the URL.

Say someone is browsing an eCommerce site and looking at laptops. The web page URL might look something like yourdomain.com/laptops/dell (a hypothetical example).

The user knows they are on a laptop page for a specific brand.

How To Check Your Site’s URL Structure?

You can use the Screaming Frog tool to review your URL structure, or check it manually if you don't have a huge number of pages on your site.

Once you crawl all the pages on your site, click on the URL tab. You can now see all the URLs for the pages in your site. If you see URLs with funny symbols and numbers, it’s better to restructure them.

Your URL structure needs to be SEO-friendly. Add your keyword to the URL to show its relevance to your content, and make sure the structure is simple, memorable, organized, and contains the right keywords.

If you're using WordPress, head over to Settings and then click on Permalinks. There are many URL structures to choose from, but I suggest you choose "Post name".

You can change each post's or page's URL with the Yoast SEO plugin if you're using WordPress. Each post and page has a Yoast box at the bottom where you can edit the slug, the editable last part of the URL.

**Reminder: Make sure to use hyphens ("-") between words, as search engine bots treat them as spaces between words. Avoid underscores, which are not treated as word separators.
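The idea of an SEO-friendly slug can be sketched in a few lines of Python. This is a simple illustration, not what WordPress or Yoast uses internally:

```python
import re

def slugify(title: str) -> str:
    """Turn a post title into a lowercase, hyphen-separated URL slug."""
    slug = title.lower()
    # Any run of characters that isn't a-z or 0-9 becomes a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("The Ultimate Guide to Technical SEO in 2024"))
# → the-ultimate-guide-to-technical-seo-in-2024
```

The result is simple, readable, and keyword-bearing, which is exactly what the reminder above asks for.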

If you’re using other CMS or customized programming languages for your site, you can always make changes to the URL structure.

Mobile Compatibility Audit

Google announced that from July 2018, mobile page speed would be a ranking factor for mobile search results. Google first announced page loading speed as a ranking factor back in 2010, but that was focused on desktop search results. To make sure your site loads fast on mobile devices, you also need to look into its responsiveness.

We will focus on mobile page loading speed because Google has rolled out mobile-first indexing, meaning Google predominantly uses the mobile version of your pages for indexing and ranking.

In this section, we will look into mobile page loading speed. I won't go deep into website responsiveness and mobile-friendliness here, because having a responsive website is a must in 2024. If you do not have one, please re-design your site. If you're serious about your business, adapt to these important changes.

Mobile Page Loading Speed

Introduction to mobile page loading speed

Mobile page loading speed is the time taken for your web page to load for mobile users.

Importance of mobile page loading speed

With more than half of all web traffic now coming from mobile devices, improving your mobile page loading speed has become the norm. According to Google's research, 53% of mobile users abandon sites that take more than 3 seconds to load. On top of that, page loading speed has become one of the most crucial ranking factors with Google rolling out mobile-first indexing.

If your site lags on mobile page loading speed, you will be left behind.

How To Check Your Site’s Mobile Page loading speed

You can use Test My Site from Think with Google to check your loading speed. Other tools like GTmetrix and Pingdom can also help. And if you're not sure whether your site is mobile-friendly, use Google's Mobile-Friendly Test tool.

These are the most important technical SEO elements you need to analyze and rectify before implementing SEO strategies in 2024.

If you like this post, share it with others. If there is something I’ve missed out please do share by commenting below.
