Author: Oliver Sissons
Last updated: Oct 22nd 2019

Diagnosing Hidden Content Issues with the Fetch & Render Tool


When trying to diagnose content issues on enterprise websites (or any website, for that matter), it is important to consider not only what you see on the front-end as a user, but also how a crawler sees the page even after rendering it. Innocent and well-intentioned changes (or mistakes) made to robots.txt files, a reliance on third-party sources to populate page content and/or poorly coded websites can all cause serious crawling, indexing and ranking problems if not found and addressed quickly.

Making use of a fetch and render tool to view your website through the eyes of a crawler can help you:

  • Spot vital issues which would otherwise go unnoticed.
  • Identify why some of your content isn’t being crawled, indexed and/or cached.
  • Find out what content Googlebot can’t see when crawling your site (content it can’t take into account when deciding where to rank your pages in the SERPs).
  • Diagnose content issues and identify problems with your code and/or robots.txt setup.
  • Work out what is holding your site back when it is otherwise technically sound and has a better link profile than your competitors.

If Google can’t crawl and index some of the content found on your pages, it could be missing valuable information and context that could help your site’s rankings. Such issues can be particularly troublesome for enterprise-scale websites which rely on templates to generate and publish many pages. These problems often appear when content is generated or loaded dynamically (for example, through JavaScript in a script hosted off-page or on another page).
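As a quick first check (a minimal sketch of our own, not one of the tools covered in this post), you can fetch the raw, unrendered HTML of a page and search it for a piece of content you can see in the browser. The URL, the snippet and the use of Python’s requests library are all illustrative assumptions:

    # Minimal sketch: is a visible piece of content present in the raw, unrendered HTML?
    # The URL and snippet below are hypothetical placeholders.
    import requests

    url = "https://www.example.com/courses/project-management/london"
    snippet = "Next course date"  # something you can see on the page in a browser

    raw_html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text

    if snippet in raw_html:
        print("Found in the raw HTML - a non-rendering crawler can see it.")
    else:
        print("Missing from the raw HTML - it is probably injected by JavaScript.")

If the snippet is missing from the raw HTML, the content is being added client-side, and you will want to confirm that crawlers can fetch and render whatever script or data source produces it.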

Using one of the tools below, you should test the different types of content and pages found on your site and compare the rendered versions to the page you see when you visit it in your browser. Any differences you notice should be investigated further. You should determine whether giving bots (and especially Googlebot) the ability to see and read this content would demonstrate that the page offers more value to users/searchers and/or better optimise the page for its target keyword(s).

Some tools you can use to fetch and render your site content include:

  • The fetch and render tool built into Google Search Console.
  • The TechnicalSEO.com fetch & render tool.
  • Screaming Frog.
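If you would rather script a basic comparison yourself, the sketch below is our own illustration (it is not one of the tools listed above) using Python with the requests and Playwright libraries; the URL is a hypothetical placeholder and both libraries are assumed to be installed (pip install requests playwright, then playwright install chromium):

    # Illustrative fetch vs. render comparison (not one of the tools listed above).
    # Assumes: pip install requests playwright  and  playwright install chromium
    import requests
    from playwright.sync_api import sync_playwright

    url = "https://www.example.com/courses/project-management/london"  # hypothetical page

    # 1. Fetch: the raw HTML a non-rendering crawler receives.
    raw_html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text

    # 2. Render: the DOM after JavaScript has run, closer to what a rendering crawler sees.
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        rendered_html = page.content()
        browser.close()

    print(f"Raw HTML length:      {len(raw_html)} characters")
    print(f"Rendered HTML length: {len(rendered_html)} characters")
    # A large gap between the two suggests content is being injected client-side
    # and is worth checking in a proper fetch and render tool.

Any page template where the rendered version contains noticeably more content than the raw HTML is a good candidate to put through one of the tools above.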

How we diagnosed a large-scale content issue using Google’s fetch and render tool

A few months back we were carrying out one of our technical SEO audits on a large-scale training provider’s website. The business offers a huge range of training courses to thousands of professionals each year and, as such, has tens of thousands of pages indexed.

With pages for each course (of which there are thousands) and for each location where the course is offered across the globe, we knew that any incremental improvements on-page had the potential to cause large organic increases when applied to tens of thousands of URLs at a time.

We decided to run a few of the page templates used (subject area list pages, main course pages and course location pages) through the fetch and render tool to see if we could spot any opportunities or issues. We were surprised by what we found.

When we put the main course pages and the location-specific course pages through the tool, we noticed that the date, venue, location and pricing information was not being seen by Google. The tool also showed us why: AJAX was being used to populate the data, but the file holding that data was being blocked by the robots.txt file.
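You can verify this kind of blocking programmatically. The sketch below uses Python’s built-in urllib.robotparser to check whether Googlebot is allowed to fetch the file that populates the page; the domain and AJAX path are made-up placeholders rather than the client’s actual setup:

    # Check whether a specific resource is blocked for Googlebot by robots.txt.
    # The domain and AJAX path below are hypothetical placeholders.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # download and parse the live robots.txt

    ajax_url = "https://www.example.com/ajax/course-data.json"
    if rp.can_fetch("Googlebot", ajax_url):
        print("Googlebot is allowed to fetch the data file.")
    else:
        print("Googlebot is blocked - content loaded from this file is invisible to it.")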

Google was respecting the robots.txt directive, which meant it was not loading any of the course location and pricing information specific to each page. Without this data, a lot of valuable information and context was lost, leaving the location pages less optimised for their target areas and (perhaps in Google’s eyes) less valuable to searchers looking for courses in those locations.

By adding a simple ‘Allow:’ rule to the robots.txt file so that Google could crawl, render and index the relevant date, location and price data, we were able to better optimise thousands of pages and show Google that they were even more relevant and useful to searchers. The dates now being seen by Google could also lead it to view the information as more up to date and relevant. Furthermore, by applying the same process to other pages, we believe this could eventually have led Google to view the whole site as higher quality and helped the entire site’s rankings.
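To illustrate the kind of change involved (the rules and paths below are hypothetical and simplified, not the client’s actual robots.txt), here is a small Python sketch showing how a more specific ‘Allow:’ line opens up the blocked data file. Python’s parser applies the first matching rule, so the ‘Allow:’ line is placed before the broader ‘Disallow:’; Google itself applies the most specific matching rule regardless of order:

    # Before/after robots.txt rules (hypothetical and simplified), parsed with Python's
    # built-in robotparser to show the effect of the added 'Allow:' line.
    from urllib.robotparser import RobotFileParser

    ajax_url = "https://www.example.com/ajax/course-data.json"

    before = [
        "User-agent: *",
        "Disallow: /ajax/",
    ]

    after = [
        "User-agent: *",
        "Allow: /ajax/course-data.json",  # listed first: Python's parser uses the first match
        "Disallow: /ajax/",
    ]

    for label, rules in (("Before", before), ("After", after)):
        rp = RobotFileParser()
        rp.parse(rules)
        print(f"{label}: Googlebot can fetch the data file -> {rp.can_fetch('Googlebot', ajax_url)}")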

Results

Although we can’t be sure that this one change led to the results below, we can be confident that improving the content across this many pages at once would have some positive effects. For example, on a few occasions over the next few months we noticed that the number of crawled and indexed pages increased substantially (according to Google Search Console coverage reporting, over 20,000 URLs were indexed overnight on one occasion!).

Given the number of URLs involved, it makes sense that it would take some time for Google to crawl enough pages to really take notice of the changes and the new content it was finding. However, once it did, it is possible Google felt it was now worth indexing a lot more URLs, which could explain the dramatic increase in submitted and indexed URLs.

Note: Over the past few years this client has worked with the Reboot team to make consistent technical and content improvements. They have also been securing the highest quality white hat links through regular digital PR campaigns. All of this has meant that the site has been showing organic progress for many years and it is possible (if not highly likely) that these efforts also contributed to the more recent improvements.

Not only did the number of indexed URLs dramatically increase but the last core update saw the site experience some great gains in organic traffic and rankings. This reinforces the points that:

  • Core updates don’t only target one strategy or technique.
  • Improving all aspects of a site (everything from the website code and on page content to the links pointing at the domain) through high-quality, white hat strategies can help websites see consistent organic progress over years, not just weeks or months.
  • Some relatively small and easy-to-fix mistakes and/or issues can have dramatic and profound effects on organic performance.
  • Even established websites generating a considerable amount of organic traffic can suffer from quite widespread technical issues (we have seen this in many of our industry SEO studies, like our law firm SEO audits and even our marketing SEO issues study!).
  • It is always worth running your pages through a fetch and render tool to see if any valuable content is being hidden from crawlers.

The chart above, taken from SEMrush, shows an increase of almost 5,000 organic visitors each month when comparing the estimated organic traffic for March 2019 with June 2019. Although impressive, the organic traffic reported in Google Analytics shows a more consistent increase since the issue was first found and a fix implemented around December 2018.

Considering this is a B2B industry with a high average order value, the almost 55,000 additional organic visitors each month shown in the screenshot above could be making a huge difference in terms of new business generated.

This experience goes to show just how important it is to audit your website objectively, and it is a reminder that a fetch and render tool can help diagnose hidden content issues that could be having seriously damaging effects on even the highest quality SEO campaigns.

If you are concerned that something is holding your site back in the SERPs, get in touch and find out how we can help.
