
What is Technical SEO? Fault-finding Ranking Drops

Date: 7th March 2019
Reading Time: 8 minutes
Author: philipgraves

Attract more clients with better SEO: Here's how

Part 3: What is technical SEO, and how do I fault-find ranking drops?

What is technical SEO?

Technical SEO covers the actions carried out to help the web-crawling spiders that search engines deploy to index websites. It is a very broad topic, and can include any of the following actions:

  • Uploading an accurate sitemap to your website in a standard format that can easily be read by the major search engines like Google and Bing, and where necessary notifying them of its location in their respective Webmaster Tools utilities. Ideally the sitemap should be an XML file and should include all the important pages in the site. It can exclude less important pages, such as author, tag and category pages, or pages that contain the same content sorted in a different order. (A sketch of generating a simple sitemap appears after this list.)
  • Asking search engines to crawl or recrawl new and improved pages when you have created or updated them, instead of waiting for the search engines to discover the changes later (which can sometimes take months). Although they won’t necessarily respond straight away, this helps ensure that you gain benefits from your page additions and improvements faster. Don’t abuse this for every little change, however.
  • Checking in search engines’ Webmaster Tools utilities that every element of every page in your site can be fully read and displayed (‘rendered’) by their bots and does not contain errors. Technical issues in your website code, perhaps caused by your content management system or by your web developers, can unintentionally hurt the way search engines perceive individual pages or your website as a whole; and this can count against the visibility and ranking of affected pages. In these cases, you will need someone technically skilled to diagnose and rectify the problems; or you may need to consider a new CMS.
  • Checking that you have not taken any actions that actively block spiders’ access to your site and its contents. The instructions in your robots.txt file can be critical, as they can bar spiders from accessing your site content (a quick way to test this is sketched after this list). In some cases, the security settings of third-party firewalls and similar add-on utilities may also need to be checked in case they have been set too restrictively and stop robots from accessing your site.
  • Specifying in search engines’ Webmaster Tools utilities the preferred prefix for your domain name. Provided that you have a valid https:// certificate to allow secure browsing, this should generally be https://www, but in some cases may simply be https:// without the ‘www.’ element. If you do not have an https:// certificate, it will be the http:// equivalent of one of these forms.
  • Further specifying the preferred form of the URL (often known as the ‘canonical form’) for every individual page in the site, in a way that can be read by spiders. (A sketch of checking a page’s declared canonical URL appears after this list.)
  • Making sure that every published page in your website that you want to be visible can easily be reached by spiders simply by following the internal navigation structure of your website. (NB: the depth at which an item sits in the navigation can help search engines decide how important it is, with the pages most immediately accessible from the home page tending to carry the next highest weight after the home page itself.)
  • Adding a type of code called breadcrumb markup to each page to suggest to search engines the ideal sequential navigational route from the home page to that page, as described in Google’s breadcrumb tutorial. While you may assume that spiders can figure this out on their own, and that adding breadcrumb markup manually is an optional extra, it can help SEO. Explicitly showing spiders the shortest route to each page helps to ensure that pages are not unfairly penalised for how deeply they are nested beneath the home page. (Breadcrumb markup is sketched, together with the structured data described next, after this list.)
  • Adding a type of code called structured data to the invisible part of each page to help spiders understand it. This is not strictly necessary, but it is recommended as a technical SEO measure, and can be used on blog and article content (it can also be used with video content). The elements of structured data for a typical web page include author, date published, headline, description, image URL, publisher name, and publisher logo.
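
To make a few of the points above concrete, here is a minimal sketch, in Python, of generating a sitemap in the standard XML format. The page URLs, dates and output path are placeholders for illustration; a real sitemap would list every important page in your site.

```python
# A minimal sketch of generating an XML sitemap with the Python standard
# library. The URLs, dates and output path are placeholders.
import xml.etree.ElementTree as ET

PAGES = [
    ("https://www.example.com/", "2019-03-01"),
    ("https://www.example.com/services/", "2019-02-14"),
    ("https://www.example.com/blog/what-is-technical-seo/", "2019-03-07"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # optional, but useful to spiders

# Write sitemap.xml so that it is served from the web root, e.g. /sitemap.xml
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```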
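
On the robots.txt point, Python’s standard library includes a robots.txt parser, which makes it easy to check whether a given crawler is blocked from a given URL. The domain and path below are again placeholders.

```python
# A quick check, using the standard library, of whether robots.txt blocks
# named crawlers from a URL. The domain and path are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt file

for agent in ("Googlebot", "Bingbot"):
    allowed = rp.can_fetch(agent, "https://www.example.com/blog/")
    print(f"{agent} may crawl /blog/: {allowed}")
```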
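
The canonical form of a page’s URL is normally declared in a link tag with rel="canonical" in the page’s head. This sketch, again standard library only, fetches a placeholder URL and reports the canonical URL the page declares; in practice you might prefer a dedicated HTML parsing library.

```python
# A sketch of extracting the <link rel="canonical"> URL that a page declares.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

html = urlopen("https://www.example.com/blog/").read().decode("utf-8")
finder = CanonicalFinder()
finder.feed(html)
print("Declared canonical URL:", finder.canonical)
```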
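
Finally, breadcrumb markup and article structured data are both usually embedded as JSON-LD script tags, following the schema.org vocabulary that Google’s documentation describes. The names, dates and URLs below are placeholders.

```python
# A sketch of building breadcrumb and article structured data as JSON-LD.
# All names, dates and URLs are placeholders.
import json

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Blog",
         "item": "https://www.example.com/blog/"},
        {"@type": "ListItem", "position": 3, "name": "What is Technical SEO?"},
    ],
}

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What is Technical SEO?",
    "author": {"@type": "Person", "name": "A. Author"},
    "datePublished": "2019-03-07",
    "description": "An introduction to technical SEO.",
    "image": "https://www.example.com/images/technical-seo.jpg",
    "publisher": {
        "@type": "Organization",
        "name": "Example Ltd",
        "logo": {"@type": "ImageObject",
                 "url": "https://www.example.com/images/logo.png"},
    },
}

# Each object goes into its own script tag in the page markup:
for data in (breadcrumbs, article):
    print(f'<script type="application/ld+json">{json.dumps(data)}</script>')
```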

Some of the above may seem daunting to approach on a D.I.Y. basis, so unless you are confident about making technical changes, you might be best advised to leave them to an SEO specialist.

Some people also class as technical SEO certain technical and structural actions that do not strictly assist spiders but that can help with ranking signals. These include:

  • Ensuring that you have a valid https:// security certificate enabled and that all non-https:// URLs are automatically redirected to the https:// form (a quick redirect check is sketched after this list)
  • Optimising the form of each page’s URL so that it reflects the page’s position in the site navigation, includes at least one descriptive keyword relevant to the page’s title, and is nevertheless as concise as possible (a slug-building sketch also follows this list)
  • Optimising your website’s code for speed of page-loading. Google’s PageSpeed Insights utility gives valuable feedback on technical impediments to page-loading speed. A developer will understand these and can adjust the code to improve page-loading performance. We are presuming here that you have already optimised your images for file size as part of the on-page SEO measures outlined earlier!
  • Implementing the Accelerated Mobile Pages (AMP) protocol to create a parallel version of your website that is optimised for page-loading speed on mobile devices. Again, a developer can help you to undertake this work. Thanks to modern responsive web development standards, AMP is not strictly necessary for pages to function fully on mobile devices, but it is nonetheless strongly encouraged by Google: its adoption is growing, and the speed benefits are a positive ranking signal. Google’s emerging mobile-first indexing philosophy has only increased the importance of how websites perform on mobile devices.
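
As a quick sanity check on the first point, the sketch below requests the http:// form of a placeholder URL and reports where the server ultimately sends it; urlopen follows redirects, so the final URL should be the https:// form.

```python
# A sketch of verifying that http:// URLs redirect to their https:// form.
# example.com is a placeholder domain.
from urllib.request import urlopen

final_url = urlopen("http://example.com/").geturl()
if final_url.startswith("https://"):
    print("OK: redirected to", final_url)
else:
    print("Warning: no https:// redirect; request ended at", final_url)
```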
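
And on the URL optimisation point, a hypothetical slug-building helper like the one below illustrates the idea: lower-case the title, strip punctuation, and keep the result concise.

```python
# A sketch of deriving a concise, keyword-bearing URL slug from a page title.
import re

def slugify(title: str, max_words: int = 6) -> str:
    """Lower-case the title, strip punctuation, and hyphenate the first
    few words, e.g. 'What is Technical SEO?' -> 'what-is-technical-seo'."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(slugify("What is Technical SEO? Fault-finding Ranking Drops"))
# -> 'what-is-technical-seo-fault-finding'
```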

How do I fault-find and respond to drops in ranking after SEO work?

Although SEO benefits are generally long-lasting, if you stop making improvements to your SEO, you may find after a time that your visibility starts to drift downwards. Any of the following may happen while your website stays the same:

  • The website content or SEO of one or more of your competitors improves, with the result that they overtake you for certain queries. In this case, you may find that you need to respond by further improving your content and/or SEO in order to maintain or regain your positioning. This is one reason why SEO work tends to need to be ongoing rather than one-off: a static website can slip over time simply because the competition is improving while you are standing still.
  • Search engine algorithms change and favour other websites over yours. To some degree this is out of your control, but there are sure to be underlying reasons for the algorithmic change, and if, with the help of a digital marketing agency, you are able to identify what improvements can be made, you are part-way to overcoming the problem. A change of SEO strategy after an algorithmic position drop can often help positions to recover, but auditing the quality of your website content and links should always be the first port of call. Major Google algorithm updates are commonly reported on popular websites dedicated to sharing search engine optimisation knowledge, such as Search Engine Land, Search Engine Roundtable and Search Engine Watch.
  • Where search engine algorithms have not changed, they may have detected a relative lack of new and updated content on your site, and therefore downgrade the ‘freshness’ element in their calculations. Your best response to this may be to update and improve existing content, or to add new content. If your site covers news, its inherently time-sensitive nature means you will need to keep adding relevant new content.
  • Search engine algorithms stay constant, but detect reduced user engagement with your content. In this case, you may need to think about updating design and layout elements, i.e. the visual side of your website, so that users’ engagement is retained for longer. Today’s web users, with a sea of alternatives to choose from, have very short attention spans when the user experience is lacking; and typical design expectations have risen significantly over the years and will surely continue to do so.
  • Search engine algorithms stay constant, but detect reduced social media engagement with your content compared with previously. If this applies, your positions may fall away somewhat, as social media sharing and liking is an ancillary ranking signal. It is worth remembering to take a few minutes to share all sufficiently interesting new and updated content on the social media channels associated with your website, whether this be Facebook, Twitter, LinkedIn, Instagram, or others. Just be careful not to overdo this.
  • One or more important external links to your content are lost altogether when the linking page is deleted or an editor or administrator removes the link to your content. This can happen when sites close down or if they have a policy of periodically removing older content. Your best response may be to work with a digital marketing agency to identify and pursue new sources of links that would benefit your website, replacing the ones lost and building new ones.
  • One or more external links to your content are downgraded in importance in the eyes of search engines. This may be because of algorithmic changes affecting all links of a certain kind or from certain types of website, or because a particular linking website is judged to be of poorer quality than previously. Occasionally, a linking website may be so bad that a link from it is a liability to you. This is only likely to apply to websites that are spammy, worthless, or havens for unsafe content. In this case, you can potentially disavow the link so that it does not count against you (the disavow file format is sketched after this list).
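
For reference, a disavow file is a plain-text list, uploaded through Google’s Disavow Links tool, with one full URL or one domain: entry per line and # comments allowed. The sketch below writes such a file; the domains and URLs are placeholders.

```python
# A sketch of writing a disavow file in the plain-text format that Google's
# Disavow Links tool accepts. The domains and URLs are placeholders.
lines = [
    "# Spammy directory; removal requested, no response",
    "domain:spammy-directory.example",               # disavow a whole domain
    "https://low-quality.example/links/page1.html",  # disavow a single page
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```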

 

Summary:

Where all other factors are equal, well-executed SEO should give your website content a permanent boost in visibility. Unlike pay-per-click and other direct advertising strategies, SEO does not disappear when you stop paying for it. Improvements to your search engine visibility, once undertaken, are long-lasting.

As we have seen in the examples above, there are many factors involved, and some can weigh against the benefits of the work you are doing. However, it’s important to note that most of those countervailing factors will still take effect from time to time even if you carry out no SEO work at all, leaving you in a worse ranking position than you would have been in if you had invested in sensible SEO measures.

SEO is your friend. It may be tedious and time-consuming to undertake on your own when you are busy with your business, but you should embrace it as an ally in your marketing; and provided that you have a marketing budget, you can potentially employ people or agencies to provide professional assistance with your SEO.


At GWS Media, we have been helping businesses, charities and other organisations with their SEO for more than eighteen years. You can always get in touch with us for a free initial SEO consultation at our Bristol-based offices in the UK.
