Technical SEO Checklist for a Website Migration

In normal circumstances, i.e. outside of the specific situation of a site migration or redesign, the starting point for any successful SEO project is a technical SEO checkup. This checkup will ensure that your website can be discovered and crawled by Google and other search engines.

What Is Website Discovery & Crawling?

“Most of our Search index is built through the work of software known as crawlers. These automatically visit publicly accessible webpages and follow links on those pages, much like you would if you were browsing content on the web. They go from page to page and store information about what they find on these pages and other publicly-accessible content in Google's Search index.” - Google, How Search Works

In other words, crawling is the process by which search engines discover updated content on the web, such as new sites or pages and changes to existing pages.

You can encourage Google to discover your URLs by linking to them from other key pages on your site (e.g. the homepage) and by including them in your XML sitemap (and submitting it to Google).

Once Google has discovered your URL, it needs to decide whether it wants to expend the resources required to crawl the URL. Google takes into account the number of internal and inbound links pointing to a URL, where it discovered the URL and the newsworthiness of the URL when deciding whether to crawl it.

Why Does Technical SEO Matter?

To ensure that your site and all of your content can be easily discovered and crawled by Google and other search engines, the technical underpinnings of your website should be optimised.

This includes simple things like:

  • Making sure that you’ve got an XML sitemap set up and submitted to search engines
  • Making sure your robots.txt file allows crawlers to access pages that you want ranked and includes an XML sitemap directive

It also includes trickier things like:

  • Page performance - making sure that pages load quickly so that Google’s crawler can easily and efficiently crawl them all
  • Site structure and navigation - making sure that Google can find all of your content easily through logical site structure and optimal navigation and pagination

Technical SEO For A New Website

Ordinarily, in the early stages of an SEO project, prior to outlining an SEO roadmap, you would carry out a technical SEO audit to identify issues and opportunities to allow your existing site to be more easily crawled by Google.

However, in the context of a website migration, to save time and effort, you can very often forego this step and instead simply ensure that your new website meets technical SEO best practices through the use of a technical SEO checklist.

Provide the checklist below to your website team (designers and developers) at the earliest opportunity to ensure that your site is designed and developed in a manner that is “SEO ready”.

Technical SEO Checklist for a New Website

Each item below explains why it is important and what to do about it.

1. Password Protect Dev & Staging Sites
Why is it important? A staging environment that is indexed by search engines can cause duplicate content issues, because the staging and production environments are usually highly similar.
What to do? The best way to prevent both users and search engines from accessing your development and staging environments is HTTP Authentication. You can whitelist your office IP addresses and give relevant third parties access via username/password combinations.
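
If you want to verify the protection is in place, a quick check along the following lines (using the third-party requests library; the staging URL and credentials are placeholders) confirms that unauthenticated visitors are turned away:

```python
# Quick sanity check that a staging site is protected by HTTP Basic Auth.
# staging.example.com and the credentials are placeholders - swap in your own.
import requests

STAGING_URL = "https://staging.example.com/"

# Without credentials the server should answer 401 Unauthorized.
anonymous = requests.get(STAGING_URL, timeout=10)
print("No credentials:", anonymous.status_code)  # expect 401

# With valid credentials the page should load normally.
authenticated = requests.get(STAGING_URL, auth=("seo-agency", "s3cret"), timeout=10)
print("With credentials:", authenticated.status_code)  # expect 200
```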

2. XML Sitemap
Why is it important? XML sitemaps feed search engines data about the pages on your site, giving them a helping hand to find and index your content.
What to do? Create and optimise XML sitemaps (/sitemap.xml containing all the new URLs and /sitemap2.xml containing all the legacy URLs) and submit them to search engines.
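
As a rough sketch of what the file should contain, the snippet below builds a minimal sitemap.xml with Python's standard library; the URLs are placeholders for your own new pages:

```python
# Minimal sketch: build a sitemap.xml from a list of URLs using the standard library.
# The URLs here are placeholders for the new site's pages.
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/site-migration-checklist/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```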

3. Robots.txt
Why is it important? This file instructs robots (typically search engine crawlers) how to crawl pages on your website and restricts access to certain pages.
What to do? Ensure the robots.txt file includes sitemap directives and does not contain any unnecessary disallow directives.
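
A simple way to sanity-check the live file is Python's built-in robots.txt parser; the domain below is a placeholder:

```python
# Check that robots.txt does not block a page you want ranked, and that it
# declares a sitemap. example.com is a placeholder for your own domain.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

print("Googlebot may crawl homepage:", parser.can_fetch("Googlebot", "https://www.example.com/"))
print("Sitemaps declared:", parser.site_maps())  # Python 3.8+; None if there is no Sitemap: line
```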

4. Crawl Depth
Why is it important? Crawl depth is the number of clicks needed to reach a page from the homepage via the shortest path: the homepage itself is depth 0, and a page linked directly from the homepage is depth 1. Deep pages tend to accumulate less PageRank, are harder for search engines to discover, and are crawled less often, which lowers their chances of ranking.
What to do? Minimise the number of clicks needed to reach each page on your website through optimised navigation, internal linking, or a combination of both. Aim for a maximum crawl depth of 2; 3 is acceptable; 4 is not ideal but can be unavoidable for very large e-commerce sites or blogs.
If key pages sit more than 4 clicks from the homepage, add relevant internal links from more prominent pages to reduce the number of clicks required to reach them. For priority pages, also review where else on the site relevant internal links can be added so that these pages are more prominent and clear to both users and search engines.
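
To measure crawl depth on a small site, a rough breadth-first crawl like the sketch below (placeholder homepage, third-party requests library, naive href extraction) reports how many clicks each internal page is from the homepage:

```python
# Rough breadth-first crawl that reports how many clicks each internal page is
# from the homepage. A sketch only: it ignores robots.txt, JavaScript-rendered
# links and politeness delays, and uses a naive regex to pull out hrefs.
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests

START = "https://www.example.com/"   # placeholder homepage
MAX_DEPTH = 4

depths = {START: 0}
queue = deque([START])

while queue:
    page = queue.popleft()
    if depths[page] >= MAX_DEPTH:
        continue
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for href in re.findall(r'href="([^"#]+)"', html):
        link = urljoin(page, href)
        # stay on the same host and skip anything already seen
        if urlparse(link).netloc == urlparse(START).netloc and link not in depths:
            depths[link] = depths[page] + 1
            queue.append(link)

for url, depth in sorted(depths.items(), key=lambda item: item[1]):
    print(depth, url)
```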

5. Site Structure
Why is it important? A logical site structure has two key benefits for SEO:
1. It helps to spread link equity (page authority) throughout your site.
2. It ensures that crawlers can efficiently find all of the pages on your site without working too hard (avoiding crawl budget issues).
What to do? Make sure that your site has a logical structure with evenly spread categories and sub-categories for key products, services and content. Typically, you should allow users and crawlers to easily navigate within sections and sub-sections by following these rules:
1. All category pages should be accessible via the primary navigation.
2. Secondary navigation within categories should link to related content.
3. Related content should be clustered together in topical clusters (more on that later).

6. Website URL Structure
Why is it important? A well-crafted URL gives both humans and search engines an easy-to-understand indication of what the destination page is about. URLs are a minor ranking factor search engines use when determining a page's relevance to a search query, and well-written URLs can serve as their own anchor text when copied and pasted as links.
What to do? Keep URLs as simple, relevant, compelling and accurate as possible. Use words that people can understand, separate words with hyphens (not underscores, spaces or any other characters), always use lower case, and avoid URL parameters as far as possible.
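
As an illustration of those rules, a minimal slug helper might look like this (a sketch, not a production-grade slugifier):

```python
# Simple slug helper: lower case, words separated by hyphens, nothing else.
import re

def slugify(title: str) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # anything that isn't a letter or digit becomes a hyphen
    return slug.strip("-")

print(slugify("Technical SEO Checklist for a Website Migration"))
# -> technical-seo-checklist-for-a-website-migration
```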

7. HTTPS
Why is it important? Aside from being best practice for site security, HTTPS is a ranking factor in Google's search algorithm, so websites that use it are more likely to rank higher in search results. Google wants to provide its users with the most secure experience possible, and HTTPS is a key part of that.
What to do? For security, SEO and UX purposes, your site should be secured with an SSL certificate.
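
Once the certificate is in place, you can confirm that the HTTP version of the site permanently redirects to HTTPS; the domain below is a placeholder and the check uses the third-party requests library:

```python
# Confirm the HTTP version of the site 301-redirects to HTTPS.
import requests

response = requests.get("http://www.example.com/", allow_redirects=False, timeout=10)
print(response.status_code)              # expect 301
print(response.headers.get("Location"))  # expect https://www.example.com/
```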

8. Usability (Core Web Vitals)
Why is it important? Core Web Vitals are a set of metrics covering speed, responsiveness and visual stability that help site owners measure user experience on the web. As of June 2021, these metrics are incorporated into Google's ranking algorithm.
What to do? Ensure that your web pages score "Good" in Google's Core Web Vitals tests for LCP (Largest Contentful Paint), FID (First Input Delay) and CLS (Cumulative Layout Shift), all of which feed into your overall Google PageSpeed Insights score.
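
One way to pull these scores programmatically is the PageSpeed Insights API (v5). The sketch below assumes the third-party requests library and a placeholder URL; field data is only returned for pages with enough real-user traffic, so the metrics block may be empty:

```python
# Pull Core Web Vitals field data from the PageSpeed Insights API (v5).
# example.com is a placeholder; the metrics block may be missing for
# low-traffic URLs, hence the defensive .get() calls.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(API, params=params, timeout=60).json()
metrics = data.get("loadingExperience", {}).get("metrics", {})

for name in ("LARGEST_CONTENTFUL_PAINT_MS", "FIRST_INPUT_DELAY_MS", "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
    metric = metrics.get(name, {})
    print(name, metric.get("percentile"), metric.get("category"))
```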

9. Page Speed
Why is it important? Google has indicated that site speed (and, as a result, page speed) is one of the signals used by its algorithm to rank pages. In addition, slow pages mean that search engines can crawl fewer URLs within their allocated crawl budget, which can negatively affect your indexation.
What to do? Optimise pages by minifying CSS, JavaScript and HTML, and by addressing redirects, render-blocking JavaScript, server response time, images and browser caching. Again, aim for a "Good" score in Google's PageSpeed Insights test.

10. Canonicals
Why is it important? Canonical tags on the new site must reference the new site and not the old. Canonicalising to the old site can be disastrous, as it may prevent the new site from being indexed.
What to do? Self-canonicalise all pages on the new site (except pages that should canonicalise to another page).
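
A quick spot-check of canonical tags on a handful of key pages can catch this early. The sketch below uses the third-party requests library and a naive regex (it assumes rel appears before href in the tag); the domains and URLs are placeholders:

```python
# Spot-check that canonical tags on the new site reference the new domain,
# not the legacy one. Naive regex parsing keeps the sketch dependency-free
# beyond requests; a proper HTML parser would be more robust.
import re
import requests

NEW_DOMAIN = "www.new-example.com"
pages = ["https://www.new-example.com/", "https://www.new-example.com/services/"]

for page in pages:
    html = requests.get(page, timeout=10).text
    match = re.search(r'<link[^>]+rel="canonical"[^>]+href="([^"]+)"', html)
    canonical = match.group(1) if match else None
    print(page, "->", canonical, "OK" if canonical and NEW_DOMAIN in canonical else "CHECK")
```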

11. Schema.org Markup
Why is it important? Structured data in the form of schema.org markup helps Google better understand your organisation, your site and your products, and helps you gain richer results on the SERP for higher CTRs.
What to do? Add schema.org structured data markup to your website to explicitly tell Google about your site's association with specific topics, brands, products, services, people and other entities.
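
As a minimal illustration, the snippet below emits an Organization JSON-LD block; the company name and URLs are placeholders, and you would extend this with the types relevant to your site (Product, Article, BreadcrumbList and so on):

```python
# Emit a minimal Organization JSON-LD block for embedding in page <head> markup.
# The name, URLs and profiles are placeholders.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Ltd",
    "url": "https://www.example.com/",
    "logo": "https://www.example.com/images/logo.png",
    "sameAs": ["https://www.linkedin.com/company/example"],
}

print(f'<script type="application/ld+json">{json.dumps(organization, indent=2)}</script>')
```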

12. Custom 404 Page
Why is it important? A custom 404 page allows users to easily navigate your site and find something useful when they land on a page that no longer exists.
What to do? Ensure that when a page does not exist, the user is shown the custom 404 page while the URL they tried to access is maintained. Make the 404 page engaging and user friendly, and make sure it returns a non-redirected 404 response code.
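
To confirm the error page behaves correctly, request a URL that cannot exist and check the response code; the domain is a placeholder and the check uses the third-party requests library:

```python
# Verify that a made-up URL returns a genuine 404 (no redirect to the homepage,
# no "soft 404" that answers 200).
import requests

response = requests.get(
    "https://www.example.com/this-page-should-not-exist-12345",
    allow_redirects=False,
    timeout=10,
)
print(response.status_code)  # expect 404, not 200 or 301/302
```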

Check out the other blog posts in this series on SEO for a Website Migration:

  1. Technical SEO Checklist for a Website Migration
  2. Keyword Research and Existing Content Audit for a Website Migration
  3. 301 Redirect Planning and Redirect Mapping for a Website Migration
  4. Crawl Depth and Internal Linking Review (of your UAT site) for a Website Migration
  5. Staging Site Performance Review for a Website Migration
  6. SEO Go Live Checkup for a Website Migration

Get the SEO expertise you need for either a single project or a long-term partnership. Let's talk!
