It has been a good few days since our latest SEO experiment (this one) caused some debate within the SEO industry. Much of the debate was healthy, well-intentioned and welcome. Besides the somewhat frustrating arguments presented by people who hadn’t actually read the study (but that is just Twitter, I guess), many people (including John Mueller from Google) respectfully voiced their opinions on the results, which we appreciated. So, to start, we wanted to say thanks to all those who shared and discussed the results respectfully.
However, we did take offence to one piece published on the Search Engine Journal (SEJ) website (which you can read here). The initial published article was a great deal less objective than the current one and, seeing as there is no way for us to comment directly on it, we wanted to address some of the points here.
Below we have added our responses to the individual points raised in the article (some also reference points raised in the originally published article, which may or may not have been removed since some Twitter users pointed out how unfair and inaccurate they were).
All Shared Hosting is Bad
This is a quick point to reiterate that, despite being misquoted, we never claimed that all shared hosting was bad. The SEJ article claims:
Besides the fact that every hypothesis is pre-existing, the problem is that we never claimed that this is what we discovered. Our study found that, within our testing environment, sites sharing an IP with toxic and spammy websites ranked less strongly than those on a dedicated IP.
This point about the meaning of the phrase ‘bad neighbourhoods’, and how we used it in our write-up, was something the author took strong issue with. First, we clearly state in our definition that we are talking about ‘bad neighbourhoods’ in the context of hosting. The fact that the phrase has been widely associated with link patterns outside of this hosting context is beside the point.
The author also quotes a source later on in their own article that says that the concept of ‘bad neighbourhoods’ has been associated with hosting for years:
Quote from SEJ article: “Hosting and the bad neighborhood theory has kicked around SEO circles for years but I never felt like it held much water.”
Even if the author hadn't quoted a source that confirmed the definition we used, we clearly explained our definition of the term in the opening paragraphs. Anyone reading the rest of the experiment and interpreting the results would know that this was the context (hosting not linking patterns) that we were discussing.
Confidence in Our Findings
Another key issue, as mentioned above, was the article misquoting us and saying that we proved that all shared hosting was bad. This was not the case at all and we never claimed this.
In fact, the author of the SEJ article says “The SEO research authors hedge their statements with words like “could very well” which makes the statements less conclusive.” - this was us presenting our findings in a more objective way and stressing that the results can’t be taken to mean that all shared hosting is bad. Had we not done this, I’m sure there would have been criticism that we didn’t phrase it in this way.
We made it clear in our write-up that one of the key limitations with these kinds of studies is that they don’t show how things would be affected in an actual SERP, given the many other factors Google’s algorithms take into account when determining where to rank a website. To present it as if we didn’t make this same observation in our write-up is incredibly misleading.
Our experiment proved exactly what we said it did in our initial write up - that, in this controlled environment, sites sharing an IP address with low-quality and spammy sites ranked less strongly than those on a dedicated IP.
Sounding the Alarm
Next, the article says:
Yet again our comments are being misquoted and misconstrued as we never mentioned the word “wreck” and we are not "sounding the alarm" as claimed. Instead, we attached a warning that there is no way to know within the remit of this one experiment the strength such a signal possesses.
Again, this quote taken directly from the SEJ article suggests that we didn’t make this same point in our publication of the results. We did and we went as far as to identify this as a limitation of these studies in our own write up.
The idea that we had a confirmation bias was thrown around a lot, but I have yet to see a single example of how this was the case. We went to great lengths to avoid biasing the experiment towards any particular outcome.
I came across another article on SEJ (you can read it here), which was linked to and which critiques a number of other SEO studies, claiming that they all suffer from this same bias. I think the article makes some valid points, raises some good concerns and was written with good intentions. It focuses on the potential negative repercussions SEO experiments and studies can have if they’re used to sell products or mislead consumers (although I do not agree that the experiments given as examples in that article were published for these reasons). Unfortunately, it seems as if the SEJ critique of our recent experiment suffered from its own confirmation bias in trying to make our study fit this narrative.
A great article was published by Russ Jones which argues against the SEJ article linked above and demonstrates why SEO studies and experiments can be helpful to SEOs. You can read it here.
In the article's introduction, the author assumes that our experiment was based on opinions:
However, the quote about how shared hosting has no effect on a site’s ability to rank, based on the author’s twenty years of professional experience, is exactly that: an opinion, not a fact.
Unless the author has data showing that they have tested the effect that shared hosting did (or did not) have on a site’s ability to rank (when compared to a site of the same quality on a dedicated host), this can’t possibly be a fact. To present this opinion as fact in the way that it has been (to discredit our study, which was actually based on data) is completely unfair.
To add to this, the author is suggesting that SEOs should only test things based on official statements from Google, which is absurd. The whole point of many SEO experiments is to disprove or find evidence for widely held beliefs, best practices and/or opinions in the industry, regardless of whether Google has mentioned or confirmed them as a ranking factor. If Google had confirmed the hypothesis in advance, there would be no need to test it.
In a since deleted part of the SEJ article, the author strongly suggests (near enough says) that we ran this experiment purely as a link bait tactic. In the context that it was said, this unfairly discredited the 3 months of hard work and resources that went into the study. This point came across as particularly discourteous and the author subsequently added a link to our experiment once confronted by some people on Twitter.
Please note: we have requested the original SEJ article, which has since been edited, as we believe it demonstrates the context and motives behind the piece, as well as how dismissive and discourteous it was, even better than the current, watered-down version does.
These are the main points we wanted to address. As there was no option for us to comment directly on the Search Engine Journal article to defend ourselves and offer the correct context for the quotes taken from our write-up, we had to publish our response here instead.
We have requested that SEJ give us the opportunity to publish a direct reply, but we have had no response to our request.
Debate surrounding any SEO ideas, concepts or experiment results is a great thing and should be encouraged. In this case, however, what could have been an interesting and useful debate was instead presented unfairly: quotes were used out of context, we were not credited as the source of the data (despite the whole article being a criticism of our study), and our ability to respond to any of the criticisms was limited.
No matter what industry they are operating in, journalists should present their critique fairly, transparently and respectfully. Using comments out of context whilst misquoting study authors, without giving them the ability to provide the correct information and context, is not quality journalism.