Why Sites Lose Rankings if Unavailable

SEO Tutorials | April 18th, 2008

In yesterday's post I explained how your web hosting can ruin your site's rankings. Let me first clarify that a web hosting company would not do this intentionally; the problem is rather a host's incompetence in managing the server and keeping it up and alive 24/7. By incompetence I mean failing to "act quick and resolve faster" on all the possible server issues you could name: DNS issues, DDoS attacks, power issues, and so on.

As a follow-up, I want to explain why your website might lose search engine rankings if it is unavailable and the search engine's spider bot is unable to crawl your website's content.
[Image: Spiderbot Crawler]

The image above shows how the spider bot behaves when it finds an unreachable website that is already indexed in the search engine's database.
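If you want to see roughly what a crawler sees, you can send a request to your own site and inspect the response. Below is a minimal Python sketch; the function name and the user-agent string are mine for illustration, not any real bot's:

```python
import urllib.request
import urllib.error

def check_availability(url, timeout=10):
    """Request a URL the way a crawler might and report the
    HTTP status code, or the error if the server is unreachable."""
    req = urllib.request.Request(
        url,
        headers={"User-Agent": "Mozilla/5.0 (compatible; example-bot)"},
        method="HEAD",
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status  # e.g. 200 means the crawler can reach you
    except urllib.error.URLError as exc:
        # DNS failure, refused connection, timeout, etc. --
        # this is the "Server Not Found" situation from the image.
        return f"unreachable: {exc.reason}"
```

A status of 200 means the bot can fetch your page; an "unreachable" result is exactly the condition that triggers the re-crawl behavior described below.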

Practically, the spider bot attempts to crawl your website's content on a schedule. When the bot tries to reach your site and start crawling, your website (or rather your server) may respond with a message like "Server Can't Resolve" or "Server Not Found". The people who coded the spider bot understand that servers sometimes go offline, so they have programmed the bot to return to the site on the same day, at a different hour, and re-attempt the crawl. If the spider bot again receives a server-not-found message, it sets a larger timeframe before coming back to re-crawl the site.

So, practically, each time the spider bot gets a server error message it sets a longer timeframe before coming back to crawl the site. I don't know the exact number of failed attempts a spider bot will make before it deindexes the site from the search engine's database.
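This "wait longer after every failure" behavior can be sketched as a simple back-off schedule. The base interval, doubling factor, and cap below are my own illustrative guesses, not any search engine's actual numbers:

```python
from datetime import timedelta

def next_recrawl_delay(failed_attempts, base_hours=6, max_days=30):
    """Double the wait after each consecutive failed crawl,
    capped at max_days. Purely illustrative numbers."""
    delay = timedelta(hours=base_hours * (2 ** failed_attempts))
    return min(delay, timedelta(days=max_days))

# After the first failure the bot retries later the same day (6 hours);
# each further failure stretches the interval: 12h, 24h, 48h, ...
schedule = [next_recrawl_delay(n) for n in range(6)]
```

The point is only the shape of the curve: a short outage costs you one quick retry, while repeated failures push the next crawl further and further away.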

What is for sure is that if your site had good rankings for certain keywords, and it keeps answering the spider bot that the server can't resolve, the search engine is forced to reorganize its SERPs and list higher the sites with content similar to yours for those keywords (yeah, they won't wait forever for you to get your server issues resolved), so that searchers are served with content and not with "server not found" sites.

Aside from the quick loss of search engine positions, the hard SEO work you put into reaching those rankings, and the earnings this can affect, there is an even worse consequence. Since your site was unreachable and the search engine was forced to reorganize its SERPs, it will take a long time for your site to recover and regain its old status in the SERPs (not to mention that you might even lose some backlinks, as webmasters don't like linking to sites that fail to resolve for a long period). I have to thank Paul for reminding me about the recovery time in a comment on yesterday's post.

Hope this was helpful and that you enjoyed reading. Don't forget to subscribe to my RSS feed so you can read my posts in your preferred feed reader, or have them delivered via email to your inbox the same day they are published.
