Our Head of SEO, Jade Coleman, outlines our top 6 regular technical website checks for the New Year.
Use a tool such as Xenu or Screaming Frog to scan your website. If changes happen frequently, we recommend completing this once a month to ensure there are no technical issues that could hinder performance. When big changes happen to the website, we recommend analysing these scans in more detail, looking for:
Having a record of regular scans will allow you to look back and compare, should anything happen to performance.
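As a lightweight illustration of the kind of check these crawlers run, a short script can request a list of URLs and report their HTTP status codes. This is only a sketch using Python's standard library (the function name and URLs are our own, not part of any tool):

```python
import urllib.request
import urllib.error

def check_status(urls):
    """Return a {url: status_code} map. Codes of 400 or above, or None
    for unreachable pages, are the broken links a scan should flag."""
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as err:
            results[url] = err.code  # server responded with an error code
        except urllib.error.URLError:
            results[url] = None  # DNS failure, refused connection, etc.
    return results
```

A crawler such as Screaming Frog does far more than this (redirect chains, duplicate titles, canonical tags, and so on), but checking response codes is the core idea.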
Following the Penguin algorithm updates, many backlinks are now potentially harmful to Google rankings.
It’s more important than ever to review the websites that link to you. It might be that, historically, activity was carried out with good intentions that is now against Google’s Link Scheme guidelines (such as keyword-rich anchor text with followed links), or that other sites have been linking to you over the years without you even knowing.
A full backlink audit of the website is recommended, especially now that Penguin is part of Google’s core algorithm and runs in real time. We recommend the following is completed at least every quarter:
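One part of such an audit is easy to script: tallying the anchor-text distribution from a backlink export, so that an unnatural concentration of keyword-rich anchors stands out. The sketch below assumes a CSV export with an `anchor_text` column; the actual column name varies from tool to tool:

```python
import csv
from collections import Counter

def anchor_text_distribution(csv_path):
    """Count how often each anchor text appears in a backlink export.

    Assumes a CSV with an 'anchor_text' column, as produced by most
    backlink tools (the column name is an assumption; adjust to suit)."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            counts[row["anchor_text"].strip().lower()] += 1
    return counts
```

If one commercial keyword dominates the counts while brand-name and bare-URL anchors are rare, that profile is worth a closer manual review.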
It is not uncommon to find content on websites that has been copied from other sites, or to find that other websites have copied some of your content. It is therefore important that each page on your website is, for the most part, unique: websites with substantial levels of page similarity often find it difficult to rank in search engines.
To check this, take a snippet of text from one of your pages and paste it into Google (in quotation marks) to see whether any pages other than the one you’re searching from are indexed. Alternatively, a tool such as Copyscape can be used.
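For a rough in-house measure of page similarity, one common approach is to compare overlapping word sequences ("shingles") between two pages. This is an illustration of the general idea, not how Google or Copyscape measure similarity:

```python
def shingles(text, n=5):
    """Break text into the set of overlapping n-word sequences."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity between the shingle sets of two texts:
    1.0 means identical wording, 0.0 means no shared phrases."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

Run against the body text of two pages, a high score suggests the pages are near-duplicates and should be consolidated, canonicalised, or rewritten.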
When Google crawls your website, it records any errors it finds in Search Console. We recommend checking this regularly to keep errors to a minimum. If you see a sharp (or even steady) increase in errors, investigate and rectify the cause before the issue gets out of hand.
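If you export daily error counts, spotting a sharp increase is simple to automate. The 1.5x threshold below is an arbitrary assumption to tune for your own site:

```python
def flag_error_spike(daily_counts, threshold=1.5):
    """Return True if the most recent daily error count exceeds the
    average of the preceding days by `threshold` times.
    The 1.5x default is an assumed cutoff, not an industry standard."""
    if len(daily_counts) < 2:
        return False
    *history, latest = daily_counts
    baseline = sum(history) / len(history)
    return latest > baseline * threshold
```

For example, `flag_error_spike([10, 12, 11, 30])` flags the jump to 30 errors, while a stable series passes quietly.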
This check shouldn’t take longer than 60 seconds (unless you have a complex robots.txt file), but it’s an important SEO housekeeping check we recommend completing every month and after any major site changes, particularly if your website isn’t completely under your control.
At the very least you need to make sure that your robots.txt file isn’t blocking all crawler access. This is a common mistake made by developers when making website changes.
There’s a huge difference between:
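The two variants in question can be demonstrated with Python’s standard-library robots.txt parser (the example URL is ours):

```python
from urllib.robotparser import RobotFileParser

def can_crawl(robots_txt, url="https://example.com/some-page"):
    """Parse a robots.txt body and report whether crawlers may fetch the URL."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

# Example 1: an empty Disallow rule blocks nothing.
example_1 = "User-agent: *\nDisallow:"

# Example 2: "Disallow: /" blocks the entire site.
example_2 = "User-agent: *\nDisallow: /"

print(can_crawl(example_1))  # True  - crawlers are allowed everywhere
print(can_crawl(example_2))  # False - the whole site is blocked
```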
Example 1 disallows nothing; Example 2 disallows everything! Make sure you check yours regularly.
There are all sorts of things that can slow a website down. Server speed is an important factor in the usability of a website and in customer satisfaction, and it is also a factor in Google’s ranking algorithm. We therefore recommend checking the speed of your website every two months. In an ideal world a page should load in around a second, but above all it’s important that the site doesn’t gradually slow down.
There are many tools out there to check the speed of your website. One of our favourites is GTmetrix.
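A quick spot check is also easy to script. Note that this measures only the server response and download time for a single URL, not the full page rendering that GTmetrix and similar tools report in far more detail:

```python
import time
import urllib.request

def measure_load_time(url, timeout=10.0):
    """Fetch a URL and return the total time, in seconds,
    taken to download the response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.perf_counter() - start
```

Logging `measure_load_time("https://www.example.com/")` on a schedule gives you the trend line that matters: not whether the site is fast today, but whether it is gradually slowing down.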
Do you need support with technical SEO site checks? Would you like advice regarding your SEO marketing strategies? Discover how our team of SEO experts can support the maintenance and management of your site.