Saturday, April 24, 2010

Spider Crawl Data

So I've been blogging about page speed, loading time, and download time, and how Google rates web sites. Well, what is up with Google's spider? Googlebot has been hitting my site pretty hard. Now as I look at the Crawl Errors report I find 12 pages that Google flags as 'unavailable' ~ maybe the server is getting overloaded.

Now I have to ask if Google is the one slowing down my site as it continually pulls down page after page. AWStats indicates that Googlebot has used up 198.13 MB of bandwidth so far this month. The Yahoo spider was the next worst offender, with 125 MB of bandwidth used.
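For what it's worth, here's a rough sketch of how those per-bot bandwidth numbers could be checked straight from the server's access log instead of relying on AWStats. It assumes an Apache-style "combined" log format, and the log path and user-agent tokens (Googlebot, Yahoo's Slurp) are just placeholders to adjust for your own setup.

```python
# Sketch: tally bandwidth used by each crawler from an Apache "combined"
# access log. LOG_FILE and the BOTS tokens are assumptions -- adjust them
# to match your own server.
import re
from collections import defaultdict

LOG_FILE = "access.log"  # hypothetical path
BOTS = {"Googlebot": "Googlebot", "Yahoo! Slurp": "Slurp"}

# combined log format: ... "REQUEST" STATUS BYTES "REFERER" "USER-AGENT"
LINE_RE = re.compile(r'"\S+ \S+ \S+" \d{3} (\d+|-) "[^"]*" "([^"]*)"')

bytes_by_bot = defaultdict(int)
with open(LOG_FILE) as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        size, agent = m.groups()
        for name, token in BOTS.items():
            if token in agent:
                bytes_by_bot[name] += 0 if size == "-" else int(size)

for name, total in sorted(bytes_by_bot.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total / (1024 * 1024):.2f} MB")
```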

Was this a Catch-22 situation? I was updating hundreds of pages to reduce download times by adding a new search bar. Googlebot picked up on the page updates and tried to retrieve them all within a few days, ultimately slowing down my server. What is up with that? It may well be that I'll have to wait a few weeks before the data settles back to normal.
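One way to check whether the crawl really did spike right after the bulk update is to count Googlebot requests per day in that same access log. Again, the log path is a placeholder and the combined log format is assumed.

```python
# Sketch: count Googlebot requests per day to spot a crawl spike after a
# bulk page update. Assumes the same combined log format as above.
import re
from collections import Counter
from datetime import datetime

LOG_FILE = "access.log"  # hypothetical path
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [24/Apr/2010

hits_per_day = Counter()
with open(LOG_FILE) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        m = DATE_RE.search(line)
        if m:
            hits_per_day[m.group(1)] += 1

for day in sorted(hits_per_day, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(f"{day}: {hits_per_day[day]} requests")
```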

I mean, look at the spread in the time required to download a page: a low of 68 ms all the way up to 1,608 ms. And if that data is true, then why does the site performance tab indicate a 3.7-second average load time while the crawl stats indicate a 186 ms average [0.186 seconds] ~ that's a pretty big difference. I can only conclude that the page HTML takes 0.186 seconds to download, but the JavaScript for Google Analytics and any image files hosted on Google Picasa take up the rest of the time.
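To get a rough feel for where that gap comes from, one could time just the raw HTML fetch (which is closer to what the crawl stats measure) and compare it with the 3.7-second full-page figure, which also has to pull the external JavaScript and images. The URL below is only a placeholder, and 3,700 ms is simply the number the site performance tab reported.

```python
# Sketch: time the HTML-only fetch and compare it with the reported
# full in-browser load time. URL is a placeholder.
import time
import urllib.request

URL = "http://www.example.com/somepage.html"  # placeholder URL
FULL_LOAD_MS = 3700  # average reported by the site performance tab

start = time.time()
html = urllib.request.urlopen(URL, timeout=30).read()
html_ms = (time.time() - start) * 1000

print(f"HTML only : {html_ms:.0f} ms ({len(html)} bytes)")
print(f"Full load : {FULL_LOAD_MS} ms (reported)")
print(f"External assets (JS, images, etc.): ~{FULL_LOAD_MS - html_ms:.0f} ms")
```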
