Blocking access to Googlebot?

Can someone please confirm that nothing is being done by Jodohost to block access from Google's web crawler to the wincf server?

About a week ago shared SSL was turned off on the server, which, according to the Live Chat tech, was the reason my web site was unreachable for multiple days before I discovered the problem. (I don't understand why shared SSL prevented my web site from working, particularly as I have my own SSL cert, but the tech was unable to explain this to me.)

Since then, Google has not resumed crawling my web site. When I use Google Webmaster Tools, I get the error "Missing robots.txt" whenever I attempt to "Fetch" any page as Googlebot. I've run the robots.txt file through numerous online checkers that verify its existence and validity, and it passes every one: the file is there, and it's formatted correctly. It's only Google that can't access it -- or any other file on my web site.
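
For whoever looks into this, here is one quick check I'd suggest for User-Agent-based filtering -- a rough sketch only (Python, standard library; www.example.com is a placeholder for the real domain). It requests robots.txt once with a browser-style User-Agent and once with Googlebot's, so a difference between the two responses would point to the server treating Googlebot's User-Agent differently:

```python
# Fetch robots.txt with two different User-Agents and compare the results.
# www.example.com is a placeholder; substitute the real domain.
import urllib.request

URL = "http://www.example.com/robots.txt"
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{name}: HTTP {resp.status}, {len(resp.read())} bytes")
    except Exception as exc:  # HTTP errors, timeouts, refused connections
        print(f"{name}: failed ({exc})")
```

Note that this only catches User-Agent filtering; a block on Googlebot's IP ranges at the host would not show up when run from my own machine, which is why I'm asking about the server side below.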

From researching Google's Help forums, the most frequent cause of this (apart from an incorrectly formatted robots.txt file, which I've confirmed mine is not) is that the host has blocked access to Googlebot -- my guess being that it was incorrectly flagged as abusive for hitting the server too frequently.
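
For reference, Google's documented way to confirm that a visitor claiming to be Googlebot really is Googlebot is a reverse-DNS lookup followed by a forward-DNS confirmation, which is also how a host could verify whether the "abusive" hits were genuine before blocking them. A rough sketch (the address below is illustrative, from Googlebot's usual space, not taken from any actual log):

```python
# Reverse-resolve the visiting IP, confirm the hostname ends in
# googlebot.com or google.com, then forward-resolve that hostname and
# confirm it maps back to the same IP (Google's published procedure).
import socket

def is_googlebot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)            # reverse DNS
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]    # forward confirm
    except (socket.herror, socket.gaierror):
        return False

print(is_googlebot("66.249.66.1"))  # illustrative Googlebot-range address
```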

I would appreciate it if someone could look into this and confirm that access from Google is not being blocked -- or, if it is, tell me why and when the problem will be resolved.

Thanks,

Laurie
 
I am not seeing anything blocking Google. There was one block of too large a net range, on the wincf server side only, but I don't think Google was in it. It is possible, though, that they are now crawling from a new location, with different IPs than what has been 'normal' -- they have so much IP space that it's possible. In any case, that block has been removed.
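
If anyone wants to check their own logs against a range like that, testing whether a crawler IP falls inside a blocked net range is a one-liner -- the CIDR below is a placeholder only, not the actual range that was blocked:

```python
# Test whether a crawler IP falls inside a blocked net range.
# Both values are placeholders; the real blocked range isn't shown here.
import ipaddress

blocked = ipaddress.ip_network("192.0.2.0/24")   # placeholder CIDR
crawler = ipaddress.ip_address("66.249.66.1")    # typical Googlebot address

print(crawler in blocked)  # True would mean Googlebot was caught in the block
```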
 