Last week saw some of the biggest websites, including Amazon, Spotify and eBay, taken down by a bug that was triggered by a single customer updating their settings. Here at Reboot, we crunched the numbers and estimated the blackout could have cost Amazon $32m in sales - but did it also risk tanking their SERP rankings?
Ultimately, no. Industry giants, like those affected by the Fastly bug, are unlikely to experience a drop in their rankings as they’ve built trust and authority with Google over many years via high-quality content, authoritative backlinks and impressive user experience. But not all websites have this digital clout to help them withstand technical glitches, so we’re here to explain how website downtime impacts SEO performance - and how you can prevent it.
To index a website and decide on its ranking, Google uses a website crawler called Googlebot that collects data from all over the internet. Also called a ‘spider’, Googlebot explores sites via links, sitemaps and other methods to find and read new content that will (sometimes) be added to its index. Other search engines have their own website crawlers which behave similarly; however, we’re focusing on Google for now.
Googlebot will come back to crawl your site regularly to check for new content and links. When these are found, they might also be added to the index - but if a link is broken, that will be noticed too. Several factors determine how often the spider will return to your site, but first you need to check your site’s crawlability to ensure that Google is indexing it correctly.
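One quick crawlability check is confirming your robots.txt isn’t accidentally blocking Googlebot. A minimal sketch using Python’s standard library - the rules and URLs shown here are hypothetical examples, not a recommended configuration:

```python
from urllib import robotparser

# Parse a (hypothetical) robots.txt and check what Googlebot may crawl.
rules = robotparser.RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /admin/",   # keep private areas out of the index
])

# Public pages should be crawlable; blocked paths should not be.
print(rules.can_fetch("Googlebot", "https://example.com/products/"))  # True
print(rules.can_fetch("Googlebot", "https://example.com/admin/"))     # False
```

In practice you would point the parser at your live robots.txt with `set_url()` and `read()`, but parsing the rules directly makes the logic easy to test.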
If your site experiences downtime when Googlebot is attempting to crawl it, the bot will be faced with an error code, just as a normal internet user would be. And although this creates a negative experience for both user and bot, the search engine crawler will try again whereas a user may just find another site that answers their query. If you’re lucky, the downtime will be negligible and Googlebot will find a happy and healthy site when it returns to crawl it. But, if Googlebot discovers a site is regularly down, this is ultimately going to impact the user experience and Google will eventually reflect this in its rankings if the issue is not addressed in time.
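If you control how your server responds during planned downtime, the error code itself matters: Google’s guidance is to serve a 503 (Service Unavailable), which tells crawlers the outage is temporary, rather than a 404, which suggests the page is gone for good. A minimal sketch of building such a response - the handler shape is an assumption for illustration, not tied to any particular framework:

```python
def maintenance_response(retry_after_seconds=3600):
    """Build a temporary-outage response for a site in maintenance mode."""
    status = 503  # Service Unavailable: a *temporary* problem, not a dead page
    headers = {
        # Hint to crawlers (and well-behaved clients) when to try again.
        "Retry-After": str(retry_after_seconds),
        "Content-Type": "text/html; charset=utf-8",
    }
    body = "<h1>We'll be back shortly</h1>"
    return status, headers, body

status, headers, body = maintenance_response()
```

The Retry-After header is optional, but it gives crawlers a sensible hint about when to come back rather than leaving them to guess.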
Obviously, any site downtime is bad and will cost you sales or leads. But if it does happen, how long will it be before Google attempts to crawl your site again - and will you avoid being penalised in the rankings? SEO experts will tell you there is no single number: exact crawl frequency depends on the individual site and is something of a moving target, but several factors influence it.
If you’re regularly updating your site with fresh content and optimising existing pages, this can affect how often it is crawled. Google is responsive and wants to serve its users the best and newest information available, so it will crawl new content as soon as possible to keep its index up to date.
That’s not all, however: domain authority and popularity play a big part too. To provide its users with the best results, Google will crawl sites that drive a lot of traffic and provide engaging content more often than others. Sites with a high number of top-quality backlinks can be favoured by Google too, as these show they are an authority in their field - and the spider will simply come across more links pointing back to the site.
So, when Googlebot crawled Amazon last week and discovered it was down, odds are it recognised this was a high-profile site with top-notch links, regularly updated content and high volumes of traffic. As such, Googlebot will have returned to recrawl it much faster than it would for, say, a local plumbing site with lower domain authority and fewer new pages.
Whether you’re Amazon or that local plumber, the most important thing is the length of time your site is actually down for. You may be lucky and only be offline for a few minutes, or it may be the result of more serious issues that take hours, or even days, to resolve.
Google isn’t able to predict when a site will be back up, so it will return to crawl it again periodically, and while this is happening, you may experience some drops in your rankings. How serious these drops are will depend entirely on how quickly you are able to get it live once more, and how swiftly Google re-crawls it.
The frequency of your downtime will also impact your performance. If, for some reason, your site is regularly going down, Google could see that as a terrible experience for the user and you could suffer in the rankings as a result. It may even cause you to be removed from the index altogether.
The Fastly issue was detected almost immediately and saw 95% of all sites back to normal performance within 50 minutes. This swift action, coupled with the domain authority of many of the sites, will have protected their rankings. But if you’re not paying close attention, longer delays to getting your site back online can seriously harm your appearance in the search listings.
A wide range of website monitoring tools (like StatusCake) will alert you in real time if there is an issue with your site, saving you a lot of stress when running your online business. It’s important to do your research when it comes to these tools, as you will want to be able to rely on them while you’re focused on other day-to-day needs.
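For a rough idea of what these tools do under the hood, a single availability probe is just a request plus a verdict on the status code. A minimal sketch using only the Python standard library - the health rule and function names are illustrative assumptions, not how any particular tool works:

```python
import urllib.request
import urllib.error

def is_healthy(status):
    """Treat 2xx/3xx as up; 4xx/5xx, or no response at all, as down."""
    return status is not None and 200 <= status < 400

def check_site(url, timeout=10):
    """Probe a URL once and return (is_up, status_code)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return is_healthy(resp.status), resp.status
    except urllib.error.HTTPError as err:
        return False, err.code   # server answered, but with an error code
    except (urllib.error.URLError, OSError):
        return False, None       # DNS failure, timeout, connection refused

# e.g. up, code = check_site("https://example.com")
```

A real monitoring service would run probes like this on a schedule, from several locations, and only alert after repeated failures to avoid false alarms.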
Keeping an eye on the speed of your site can also give you a heads-up if something is wrong. A slowdown may not be a problem in itself, but if pages are unexpectedly getting slower, it could indicate that a crash is imminent or that there are unresolved server issues. Many monitoring tools track website response time and latency, but you can also check manually with Google’s PageSpeed Insights tool.
Frustratingly, there is no hard and fast rule when it comes to downtime and its impact on SEO. Matt Cutts, a former senior employee at Google, stated that a day of downtime is unlikely to bring any negative effects on your search rankings. However, Google engineer John Mueller has suggested that a day of downtime can cause fluctuations in the rankings. Confusing, isn’t it?
As with all SEO, it depends: on crawl frequency, your domain authority and how long the site is down. And as with all SEO, the answer is quality content, top-notch links and a speedy site. If you keep that as your goal, you should be able to survive website downtime with few, if any, ranking losses - provided you catch it quickly enough.