Fixing SEO Issues Post-Launch

Auditing a site post-launch lets you catch and fix SEO issues before it's too late. Our guide shows you how to find these issues and which tools to use.

Around 90% of the search engine optimisation (SEO) work during a site migration takes place prior to launch. But the remaining 10% – post-launch – is critical. After all, just one or two small missteps can obliterate your digital footprint in organic search.

Thoroughly auditing a website post-launch allows you to catch and fix any of the SEO issues that may have slipped through the net – and ultimately protect your SERP real-estate.

Below are step-by-step instructions on how to:

  • Audit redirects
  • Assess internal links
  • Identify duplicate content
  • Monitor NAP data (name, address and phone number)
  • Validate hreflang attributes
  • Test schema markup
  • Spot-check the XML sitemap
  • Examine the robots.txt file
  • Identify broken external links
  • Bulk analyse and optimise title tags and meta descriptions
  • Perform page speed tests

For a full checklist of all the pre-migration, during-migration and post-migration SEO tasks as well as a list of some of the useful tools referenced within this guide, please download our site migration SEO checklist.

Auditing redirects

Immediately following the launch of a website, it’s essential to audit all website redirects to identify any instances of rogue 404s as well as redirect chains that may have occurred due to the migration.

Redirect chains can significantly affect page load speed and a high volume of 404s can impact search engine visibility. Addressing issues with redirects will not only give visitors a more consistent and high quality user experience – it also has the potential to improve rankings.
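
If you prefer to script this check, the chain detection itself is straightforward once you have a source-to-destination redirect map (exported from your crawler, for instance). A minimal Python sketch – the URLs here are made up for illustration:

```python
def find_redirect_chains(redirects):
    """Given a {source: destination} redirect map, return chains of
    more than one hop, plus any redirect loops."""
    chains, loops = [], []
    for start in redirects:
        path = [start]
        current = start
        while current in redirects:
            current = redirects[current]
            if current in path:          # redirect loop detected
                loops.append(path + [current])
                break
            path.append(current)
        else:
            if len(path) > 2:            # start -> hop -> ... -> final
                chains.append(path)
    return chains, loops

# Hypothetical legacy URLs from a migration
redirects = {
    "/old-page": "/interim-page",   # first hop
    "/interim-page": "/new-page",   # second hop -> a chain
    "/a": "/b",
    "/b": "/a",                     # a loop
}
chains, loops = find_redirect_chains(redirects)
```

Each chain should be collapsed into a single 301 from the original URL straight to the final destination.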

Below are two of the most useful tools to use when auditing redirect chains – Screaming Frog and IIS SEO Toolkit. Screaming Frog is an SEO spider tool which can crawl websites and fetch key website data in real-time. And IIS SEO Toolkit is a free optimisation tool which performs a detailed analysis of your website and will provide suggested fixes for the specific migration issues we’ve identified.

Screaming Frog

To find redirect chains using Screaming Frog, follow the below steps:

  1. Navigate to 'configuration' in the top navigation > select 'spider' > select 'advanced' tab > tick 'always follow redirects'

  2. Navigate to 'mode' > 'list' > select 'upload' and then either upload all legacy URLs from a file, enter them manually or paste all URLs > hit 'start' and run the crawl

  3. Once it reaches 100%, navigate to 'reports' within the top navigation and select 'redirect chains' from the dropdown menu

[Screenshot: Screaming Frog redirect chains report]

IIS SEO Toolkit

Redirect chains can be found in the following view within SEO Toolkit:

  1. Select 'create a new analysis' > name the crawl & input the domain into the 'start URL'

  2. Within the 'advanced settings', customise the settings to suit your needs

  3. Navigate to 'violations' > 'violations summary' and look out for "the redirection response results in another redirection" (as shown below) > double click to open the violation and find full details of the issue and recommended action

  4. Alternatively, navigate to 'report' > 'export all violations' > open Excel spreadsheet > apply filter > filter column 'violation code' > filter to 'RedirectToRedirect' > find the full details of the issue and recommended action within the column 'violation description' and 'violation recommendation'

[Screenshot: IIS SEO Toolkit violations summary]

Assessing internal links

Updating internal links following any URL changes may seem like an arduous task but it’s essential.

A strong internal linking structure helps to establish information hierarchy and improves crawl efficiency – as well as improving user experience by aiding navigability. The flipside, of course, is that broken links or internal redirect hops can have a negative effect on UX and page speed.
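
For a rough idea of what the tools below automate, here is a minimal stdlib sketch that pulls a page's internal links out of its HTML – each link could then be fetched to check its status code. The domain and HTML are illustrative:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect every href on a page, resolved against the page URL."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.page_url, href))

def internal_links(html, page_url):
    parser = LinkExtractor(page_url)
    parser.feed(html)
    site = urlparse(page_url).netloc
    # Keep only links pointing at the same host
    return [l for l in parser.links if urlparse(l).netloc == site]

html = '<a href="/about">About</a> <a href="https://example.org/x">Ext</a>'
links = internal_links(html, "https://www.example.com/")
```

Any internal URL that returns a 3xx or 4xx when fetched is a candidate for a CMS-level link update.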

To provide a faster and clearer path to content, follow the below steps to identify and eliminate redirect hops.

Screaming Frog – option 1

  1. Crawl site in Screaming Frog

  2. Select 'bulk export' > 'all in-links'

  3. Save the export & open in Excel

  4. Filter out external destination URLs

  5. Review errors within 'status code' column

  6. Edit internal links within CMS

[Screenshot: Screaming Frog internal links export]

Screaming Frog – option 2

  1. Crawl site in Screaming Frog

  2. Navigate to the 'response codes' tab and within the drop-down menu, select 'client error (4xx)'

  3. Select 'inlinks' within the bottom navigation to view which pages link out to the 404s

[Screenshot: Screaming Frog response codes – client error (4xx)]

IIS SEO Toolkit

  1. Select 'create a new analysis' > name the crawl & input the domain into the 'start URL'

  2. Within the 'advanced settings', customise the settings to suit your needs

  3. Select ‘content’ in the left-hand navigation > select ‘pages with broken links’

  4. By double clicking on each URL, you can see the broken link violation details

  5. For a complete view of all broken links, select ‘report’ in the top navigation > ‘export all violations’

  6. Once the Excel spreadsheet is open, apply a filter within the ‘violation code’ column and look for ‘HasBrokenLinks’

  7. The ‘violation description’ column will then provide the full broken link details

[Screenshot: IIS SEO Toolkit broken links report]

Identifying duplicate content

Duplicate content should be avoided at all costs. The reason for this? Google’s algorithms are designed to filter out duplicate content from the search results – and that could mean entire pages losing search visibility as a result.
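
As a rough illustration of what these tools measure, a quick-and-dirty similarity score can be computed locally with Python's difflib. The page text below is invented, and the real tools use far more sophisticated comparisons:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Rough 0-1 similarity score between two pages' body text."""
    return SequenceMatcher(None, text_a.lower().split(),
                           text_b.lower().split()).ratio()

# Two near-identical hypothetical pages
page_a = "red widgets for sale in london and manchester"
page_b = "red widgets for sale in london and birmingham"
score = similarity(page_a, page_b)   # close to 1.0 -> near-duplicate
```

Scores approaching 1.0 flag pages that probably need consolidating, canonicalising or rewriting.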

Here are some of the best tools to help you to identify and address any duplicate content issues that may exist on your website:

Copyscape

Copyscape is a useful free tool. Its free comparison function, for example, lets you compare the similarity between two documents.

Siteliner

Siteliner is another free online tool that can help you to identify duplicate content. Simply input the domain and hit ‘go’.

Once the report is ready, you can get a view of the duplicate content that exists on the site (as the screenshot below demonstrates). Simply click ‘here’ to see further details about your duplicate content and which pages are affected.

[Screenshot: Siteliner duplicate content report]

SearchMetrics

SearchMetrics is an online marketing platform which offers a holistic consulting approach to help businesses develop and execute intelligent digital marketing strategies.

To perform a similar content check, follow the below steps:

  1. Select the website project

  2. Navigate to ‘optimization’ in the top navigation > ‘content optimization’ (as the below screenshot demonstrates)

  3. Fill in the relevant fields, select the appropriate search engine & select ‘start the analysis’

  4. Scroll down to the ‘similar content check’ (as the second screenshot shows) > select ‘detail analysis’ to dive further into the data

[Screenshots: SearchMetrics content optimization and similar content check]

IIS SEO Toolkit

Follow the below steps to find duplicate content that exists across your website:

  1. Select ‘create a new analysis’ > name the crawl & input the domain into the ‘start URL’

  2. Within the ‘advanced settings’, customise the settings to suit your needs

  3. Select ‘content’ in the left-hand navigation > then ‘duplicate files’

  4. Select ‘actions’ in the top right-hand side > then ‘open query’

  5. The duplicate content issues can be viewed here or can be exported into a CSV file by pressing ‘export’

[Screenshot: IIS SEO Toolkit duplicate files query]

Monitoring NAP data (name, address and phone number)

Local NAP (name, address and phone number) data is extremely important for brick-and-mortar businesses.

Consistency of data is one of the top local ranking factors and inconsistency of data is a negative ranking factor.

With that in mind, it’s imperative to monitor your local citations and maintain consistency. This is especially important for companies that have undergone a rebrand.
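
A simple way to script a consistency check across citations is to normalise each listing before comparing. Something along these lines – the business details are invented:

```python
import re

def normalise_nap(name, address, phone):
    """Reduce a citation to a comparable form: lower-case, strip
    punctuation, collapse whitespace, keep only the phone digits."""
    def squash(s):
        cleaned = re.sub(r"[^a-z0-9 ]", "", s.lower())
        return " ".join(cleaned.split())
    digits = re.sub(r"\D", "", phone)
    return (squash(name), squash(address), digits)

# The same hypothetical business, listed slightly differently
a = normalise_nap("Acme Ltd.", "1 High St,  Leeds", "0113 496 0000")
b = normalise_nap("ACME Ltd", "1 High St, Leeds", "(0113) 4960000")
consistent = a == b
```

Listings that still differ after normalisation are the ones worth correcting first.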

An efficient way of monitoring your website citations is to use one of the well-respected local SEO tools. Each of these tools offers a local citation finder – which can scrape all of your local listings across the web (below is a screenshot of Whitespark's local citation finder) – and each also offers a cleanup service at an additional cost.

[Screenshot: Whitespark local citation finder]

Checking local rankings

As well as ensuring all local citations are accurate and up-to-date, monitoring local rankings is essential. BrightLocal has a local search results checker. Here, you can view your localized search rankings from any location in the world.

Google My Business insights

Your Google My Business account should also contain the correct and up-to-date NAP data. Once this has been updated, you can monitor how customers are finding and interacting with your Google listing by navigating to the Insights section within GMB.

Below are examples of some of the insights that Google My Business offers currently:

[Screenshots: Google My Business insights]

Validating hreflang attributes

When hreflang is implemented, Google can serve the correct language or regional URL in search results.

Incorrectly marked-up hreflang and canonical tags can result in duplicate content issues and Google won’t know which page to serve in the search results. If users land on the wrong page, bounce rates will increase and organic performance may be significantly impacted.
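
The shape of an hreflang value can be sanity-checked locally before reaching for the tools below. A small sketch – note that the region whitelist here is a deliberately short, illustrative subset, and a shape check alone won't catch every mistake:

```python
import re

# Illustrative subset -- extend with the regions your site actually targets
VALID_REGIONS = {"GB", "US", "FR", "DE", "ES"}

def invalid_hreflang(codes):
    """Return hreflang values that are malformed or use an unknown region."""
    bad = []
    for code in codes:
        if code == "x-default":
            continue
        # ISO 639-1 language, optionally followed by a region code
        m = re.match(r"^([a-z]{2})(?:-([A-Z]{2}))?$", code)
        if not m or (m.group(2) and m.group(2) not in VALID_REGIONS):
            bad.append(code)
    return bad

codes = ["en-GB", "fr-FR", "x-default", "en-UK", "english"]
bad = invalid_hreflang(codes)   # "UK" is not a valid ISO region (GB is)
```

A common real-world error this catches is `en-UK`, which looks plausible but uses a non-existent region code.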

There are a few different ways that incorrect hreflang attributes can be identified, as follows:

Hreflang Ninja

Hreflang.ninja is a free online tool, created by Distilled, that allows you to verify whether the hreflang attributes on a page are correct.

Simply input the page URL, hit ‘ninja go’ and a page similar to the one below will be returned.

[Screenshot: Hreflang.ninja results page]

Screaming Frog – option 1

  1. Crawl site in Screaming Frog

  2. Navigate to ‘hreflang’ tab > within the ‘filter’ feature, select your preference e.g. ‘all’, ‘incorrect language codes’ or ‘missing X-default’

  3. Save the export & open in Excel

[Screenshot: Screaming Frog hreflang tab]

Screaming Frog – option 2

  1. Crawl site in Screaming Frog

  2. Navigate to ‘reports’ in top navigation > select ‘hreflang’

  3. Save the relevant export & open in Excel to view the full error details

[Screenshot: Screaming Frog hreflang reports]

Testing Schema markup

It’s important to regularly monitor your existing schema markup and stay up-to-date with Google’s structured data markup guidelines. If your site doesn’t follow these guidelines then it’s possible that Google will issue the site with a spammy structured data markup manual action which may affect how your site is displayed within search results.
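
For JSON-LD snippets, a rough local pre-check for missing required keys can be scripted before running Google's tool. A sketch, not a substitute for proper validation:

```python
import json

def check_json_ld(snippet, required=("@context", "@type")):
    """Parse a JSON-LD snippet and report any missing required keys."""
    try:
        data = json.loads(snippet)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    return [key for key in required if key not in data]

# A hypothetical LocalBusiness snippet
snippet = '{"@context": "https://schema.org", "@type": "LocalBusiness", "name": "Acme"}'
errors = check_json_ld(snippet)   # empty list -> nothing obviously missing
```

This only catches structural problems; Google's testing tool remains the authority on whether the markup is eligible for rich results.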

Google’s Structured Data Testing Tool

As well as regularly reviewing Google’s guidelines, schema markup should be tested on a frequent basis. To test the code snippets, use Google’s structured data testing tool – it’s a free testing tool which will alert you to any errors within the code (as the below screenshot shows).

[Screenshot: Google’s structured data testing tool]

Spot-checking the XML sitemap

Another essential task in a site migration is to ensure that the XML sitemap is up-to-date and error-free.

XML sitemaps inform search engines about the structure of a website and they exist so that search engines can crawl a website logically and with ease.
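
Since the sitemap format is plain XML, its URLs can also be pulled out programmatically for spot-checking. A stdlib sketch with an illustrative two-URL sitemap:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap so each one can be
    crawled and checked for non-200 responses, noindex tags, etc."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/contact</loc></url>
</urlset>"""
urls = sitemap_urls(sample)
```

Any extracted URL that redirects, 404s or is blocked from crawling should not be in the sitemap.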

As a site will benefit from having an accurate XML sitemap, please see Google’s Search Console Help on how best to manage your sitemaps, as well as which sitemap errors to look out for.

The below step-by-step instructions demonstrate how to identify whether there are pages included within an XML sitemap that shouldn’t be.

Screaming Frog

  1. Navigate to your site’s live XML sitemap e.g. https://www.yourdomain.co.uk/sitemap.xml

  2. Right click and save the XML sitemap as an XML Document

  3. Open Excel and navigate to the ‘data’ tab > select ‘from other sources’ > ‘from XML data import’ > open the saved XML file (as the screenshot below demonstrates)

  4. Open Screaming Frog and choose ‘list’ mode

  5. Upload the XML sitemap URLs and hit ‘start’

  6. View errors and export the results

[Screenshot: importing the XML sitemap into Excel]

Examining the robots.txt file

The robots.txt file lives at the root of a website and it exists to inform search engine crawlers which parts of a website they should avoid crawling.

An accurate robots.txt file is essential – for example, accidentally blocking important files could have serious repercussions.
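
Python's standard library can also test URL paths against a robots.txt file directly, which is handy for scripting spot-checks. A sketch with an illustrative file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt -- in practice, fetch your live file
robots_txt = """User-agent: *
Disallow: /checkout/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether specific paths are blocked for all crawlers
blocked = not parser.can_fetch("*", "https://www.example.com/checkout/basket")
allowed = parser.can_fetch("*", "https://www.example.com/products/widget")
```

Running your key landing pages through a check like this quickly surfaces anything important that has been blocked by accident.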

One of the best ways to test the robots.txt file is via the robots.txt tester within Google Search Console. Alternatively, you can use Screaming Frog as the below instructions demonstrate.

As well as manually reviewing the robots.txt file, there are domain monitoring tools such as Robotto which can alert you to any changes to the robots.txt file.

Screaming Frog – option 1

  1. Crawl site in Screaming Frog

  2. Select ‘bulk export’ > ‘response codes’ > ‘blocked by robots.txt inlinks’

  3. Save the export, open in Excel and review list of pages blocked by the robots.txt file

[Screenshot: Screaming Frog bulk export – blocked by robots.txt]

Screaming Frog – option 2

  1. Select ‘configuration’ from top navigation > ‘robots.txt’ > ‘custom’

  2. Select ‘add’ > input domain > select ‘ok’

  3. Once the robots.txt file is returned, enter a URL path to test and select ‘test’ (Screaming Frog will then alert you as to whether this page is blocked by the robots.txt file or not)

[Screenshot: Screaming Frog custom robots.txt tester]

Identifying broken external links

Over time, a website will naturally acquire organic backlinks.

Monitoring broken external links post-migration allows you to either:

  • Implement a page-to-page 301 redirect and pass the ranking power to the new site
  • Ask the webmaster to update the link to reflect the change of URL

Of course, implementing a redirect or asking a webmaster to update a link isn’t always going to be the best course of action. For pages which receive almost no organic traffic, have minimal backlinks and where there isn’t a relevant page to redirect to – a custom 404 page can be created.
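
The decision above can be captured as a simple triage rule. A sketch – the traffic and backlink thresholds are illustrative, not recommendations:

```python
def backlink_action(monthly_traffic, backlinks, redirect_target):
    """Rough triage for a legacy URL that external sites still link to,
    following the rules of thumb above (thresholds are illustrative)."""
    if redirect_target:
        return "301 redirect to " + redirect_target
    if monthly_traffic < 10 and backlinks < 3:
        return "serve custom 404"
    return "ask webmaster to update link"

# A low-value page with no relevant new equivalent
action = backlink_action(0, 1, None)
```

Applied across a backlink export, a rule like this turns a long list of broken links into a short, prioritised to-do list.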

Ahrefs

  1. Input the domain and select the search button

  2. Navigate to ‘backlinks’ in left-hand navigation > select ‘broken’ to view a list of the broken backlinks, the referring page and details of the anchor text

[Screenshot: Ahrefs broken backlinks report]

Bulk analysing and optimising title tags and meta descriptions

A well optimised title tag and meta description can set you apart from your competitors and improve click-through rates and organic traffic.
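
The same audit can be scripted for bulk checks: parse out each page's title and meta description and flag anything missing or over-length. A stdlib sketch – note the character limits are rough approximations, since Google actually truncates snippets by pixel width:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Pull the <title> and meta description out of a page's HTML."""
    def __init__(self):
        super().__init__()
        self.title, self.description, self._in_title = "", "", False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html, max_title=60, max_desc=155):
    """Flag missing or over-length titles/descriptions (approximate limits)."""
    p = MetaAudit()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing title")
    elif len(p.title) > max_title:
        issues.append("title too long")
    if not p.description:
        issues.append("missing meta description")
    elif len(p.description) > max_desc:
        issues.append("meta description too long")
    return issues

html = "<html><head><title>Red Widgets | Acme</title></head></html>"
issues = audit(html)   # this hypothetical page lacks a meta description
```

Run over a full crawl export, this gives you the same missing/duplicate/over-length view that Screaming Frog's filters provide.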

Screaming Frog offers an easy way to review and optimise your site’s existing title tags and meta descriptions en masse:

Screaming Frog – Title Tags

  1. Crawl site in Screaming Frog

  2. Select ‘page titles’ > within the ‘filter’ feature, select your preference e.g. ‘all’, ‘duplicate’ or ‘missing’ title tags

  3. Select ‘SERP snippet’ in footer > select a URL from the list > edit the title tag to the preferred title tag

  4. Once all relevant pages have been optimised with new title tags, select ‘export’

  5. Open the exported Excel document and a full list of the optimised title tags will be found within this export

[Screenshots: Screaming Frog SERP snippet editor]

Screaming Frog – meta descriptions

To view the existing meta descriptions and edit these within Screaming Frog, follow the same steps as above but instead of navigating to ‘page titles’, navigate to ‘meta description’ (as the below screenshot shows).

[Screenshot: Screaming Frog meta description tab]

Performing page speed tests

Page speed is a ranking factor for desktop and will soon become a ranking factor for mobile. To ensure that your website is loading at an optimal time, use one or more of the below tools:

Simply submit the URL and wait for the results. Each of these tools offers advice on how to fix the issues that are affecting page speed.

That’s a wrap!

That wraps up our step-by-step guide on how to perform the all-important post-migration SEO checks.

If you’ve meticulously followed a thorough and tailored website migration checklist (like this one here) and have performed all of the above SEO checks, then your website should be well placed to reach its full potential within the SERPs.

Ready to put your feet up? You deserve it – but don’t get too comfortable. Remember that it doesn’t stop there! For long-term organic success, you’ll need to stay attuned to the forever-changing search landscape and continually optimise your SEO campaign.

Google waits for no one. And that’s why websites need to stay ahead of the curve – or risk falling into the deep dark depths of the search engine results pages.

Have you just undergone a site migration? Or are you stuck in an SEO rut? Drop us a line and find out how our savvy SEO team with over 45 years’ experience can help!