Despite being deathly ill with some child-brewed illness, I can’t help but offer advice when I see people with the wrong ideas. On an email list I’m a part of, I came across someone earlier today in that category. They claimed that the minuscule page speed improvements you get from having a hosting server located closer to your users have no discernible impact on your rankings. I couldn’t help myself.
Here is the reply I just had to make.
“The evidence I’ve seen shows that 20-40ms response times from hosting in Australia, compared to 200ms+ from hosting in the US, do affect users, and this is why…
I concede 200ms is not long, or at least it doesn’t seem like long, but when a web page loads it takes quite a few elements to make it work, so you’ve got to consider what is happening at a micro level.
The number of elements that need to download depends on the complexity of your website, but in most cases I’ve seen it’d be 10-20, and as many as 50 or even 100 in some (terrible) cases.
Each element is loaded by the visitor’s browser making a call to the website’s server, which then responds.
I’ll try and do this like a “role-play”:
Visitor – give me the home page “/”
Time passed from US: 200ms
Time passed from Oz: 20-40ms
The page’s HTML downloads.
The download time MIGHT be less from Oz, but let’s pretend it isn’t.
Time passed from US: n/a
Time passed from Oz: n/a (but probably less than US)
The page’s HTML is parsed by the browser. The parsing time won’t be any different with US or Oz hosting.
Time passed from US: n/a
Time passed from Oz: n/a (both the same)
Download the rest
Say the page has 10 elements in total: the HTML we already have, plus 9 more. The browser then makes those 9 additional requests to the server, and the round trip on these is delayed by the 200ms (versus 20-40ms).
Time passed from US: 200ms
Time passed from Oz: 20-40ms
These 9 requests can happen in parallel, all at once, but only because there are fewer than about 10 of them (browsers limit how many requests they make at once), so the result is…
Total time passed from US: 400ms + download
Total time passed from Oz: 80ms + download
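The round-trip arithmetic in the role-play can be sketched as a tiny model. This is a deliberate simplification: it ignores download and parse time, and the 10-requests-per-batch parallel limit is an assumption for illustration, not exact browser behaviour.

```python
import math

def load_time_ms(rtt_ms, extra_requests, parallel_limit=10):
    """Round-trip latency only: one trip for the HTML, then one
    per parallel batch of extra requests. Ignores download time."""
    batches = math.ceil(extra_requests / parallel_limit)
    return rtt_ms + batches * rtt_ms

print(load_time_ms(200, 9))  # US hosting: 400ms
print(load_time_ms(40, 9))   # Oz hosting: 80ms
```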
That equates to about half a second versus under a tenth of a second. Most users will notice this; Google specifically ran experiments to assess users’ reactions to results being slowed down, and as a result they announced in 2010 that speed was a ranking factor (probably not a very big one).
BUT if I told you many web pages have tens of requests, and at that volume they are queued one batch after another… that means the 200ms delay is magnified massively, because each new batch of requests is 10x slower than it needs to be. There are workarounds for this in speed optimisation, but before those, some half-decent hosting in Oz is the way to go.
There is plenty of good stuff out there about page speed. We (at my company) never host someone very far from their target customers, mostly for user experience, and it’s a bonus that this is also a plus for SEO.”
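The point in the reply about tens of requests being queued batch after batch is where distant hosting really bites. A quick illustration, using the same simplified model (round-trip latency only, with an assumed batch size of 10 parallel requests and illustrative round-trip figures):

```python
import math

RTT_US, RTT_OZ, PARALLEL = 200, 40, 10  # illustrative figures

for extras in (9, 30, 50, 100):
    batches = math.ceil(extras / PARALLEL)
    us = RTT_US * (1 + batches)  # HTML fetch plus each queued batch
    oz = RTT_OZ * (1 + batches)
    print(f"{extras:>3} extra requests: US {us}ms vs Oz {oz}ms")
```

Each additional batch adds another full round trip, so the US/Oz gap widens as pages get heavier.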