I check Google's data on the Engineering Site all the time, and today I rechecked the Site Performance report. At least for the last few weeks, the time it takes to download a page has decreased [always a good thing]. The attached graphic shows how long it takes Google to download pages on the site. The numbers don't represent a single page, but the average over some number of pages Googlebot tried to read. I don't really think the data is that accurate, because it never takes 4 seconds to download a page, but I watch it anyway because that's what Google is looking at to determine site performance.
There is no way to tell why my site is coming in faster. The server could be working faster, Google could have pulled fewer pages, or Google could have pulled smaller pages; who knows? Well, I can check how many pages Google crawled per day, and it's about the same now as in mid-June. Google Crawl Stats indicate an average of 673 pages per day over the last few months.
However, I've been working on making the site faster for months without really getting anywhere. The basic problem is that any time I update a page and make the HTML smaller [less code], I also add more data, making the page larger. So I may make the HTML code more efficient [decreasing download time], but then I add more human-readable text, increasing the download time again.
The current data [below] indicates that the average page on the site takes 2.6 seconds to download. This is only important because Google rates sites by download speed, so it's Search Engine Optimization [SEO].
Related Blog Posts;
Page Download Times [7-15-2010] 3.1 seconds to download
Speed Performance Overview [6-16-2010] 2.8 seconds to download
Web Site Performance [4-22-2010] 3.7 seconds to download
Google now ranks pages by speed [4-14-2010] No speed data.
Website Speed Performance [4-3-2010] 3.7 seconds to load.
Monday, August 30, 2010
Download Speed vs Site Performance
Posted by Leroy at 10:50 AM
Labels: Bandwidth, Google, Search Engine, SEO
4 comments:
A few weeks ago I added a robots.txt line to block Google from reading my server stats. Google was reading these large text files that had nothing to do with my site [pages] ~ they were just server statistics. I use the stats to see how the site is doing and to check on visitors, but Google was reading these 9MB files off my server ~ the point is they should be blocked. So Google should be reading my 14k HTML files and not those 9MB files any longer ~ that should make my site seem faster to Google.
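For anyone who wants to do the same thing, the rule is only a couple of lines; "/stats/" here is just a placeholder, not the actual directory my server reports live in.

```
# Keep Googlebot out of the server-statistics directory [placeholder path]
User-agent: Googlebot
Disallow: /stats/
```

Using `User-agent: *` instead would block every well-behaved crawler, not just Google's.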
9/1/10: New Google data as of 8/28 now says it takes 2.9 seconds to download files ~ guess I can't win this fight. I'm still faster than 50% of the sites on the web, at least as of the 28th.
12-19-10: Another update from December 18th indicates that the average load time is 2.8 seconds, faster than 52% of all other web sites.
Now there was a big spike between September and October, reaching 5 seconds. I don't think it was my web site, so it must have been some issue with my server. Looks like the problem lasted around two weeks and then went back to normal.
1-10-11: I just noticed over the last week that Google is updating the Site Performance numbers each day; that's new ~ it used to be updated only about twice per week. The speed has been hovering around 2.7 seconds to load over the last several days.
As of the first of the year I started adding in the new Google Analytics code, but I've only added it to 150 pages ~ random pages I was working on anyway. They say the new 'web counter' code loads faster than the current JavaScript I'm using.
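For reference, the asynchronous ga.js snippet Google was handing out around this time looked roughly like this; UA-XXXXXXX-X is a placeholder account ID, not mine.

```
<script type="text/javascript">
  // Async ga.js tracking snippet, roughly as distributed circa 2010.
  // UA-XXXXXXX-X is a placeholder account ID.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

The point of the async version is that the ga.js download no longer blocks the rest of the page from rendering, which is why it should load faster than the older snippet.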
The site performance speed shown in Google Webmaster Tools is weighted by page views. So a page with heavy traffic counts more than a page getting no traffic, even if they load in the same amount of time.
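To show what the weighting does with some made-up numbers [these are not my pages or Google's figures], a busy page dominates the average:

```javascript
// Made-up figures showing how a page-view-weighted average load time
// differs from a simple per-page average.
var pages = [
  { name: "popular.html", views: 1000, seconds: 2.5 },  // heavy traffic
  { name: "quiet.html",   views: 10,   seconds: 5.0 }   // almost no traffic
];

var totalViews = 0, weightedSum = 0, plainSum = 0;
for (var i = 0; i < pages.length; i++) {
  totalViews  += pages[i].views;
  weightedSum += pages[i].views * pages[i].seconds;
  plainSum    += pages[i].seconds;
}

var weighted = weightedSum / totalViews;  // ~2.52 s: the busy page dominates
var simple   = plainSum / pages.length;   // 3.75 s: naive per-page average
console.log(weighted.toFixed(2) + " vs " + simple.toFixed(2));
```

The slow page barely moves the weighted number because almost nobody views it ~ which is why speeding up the high-traffic pages matters most.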
Because I've only added the code to 150 pages, which have yet to be crawled, I don't expect to see any improvement till the end of the month ~ by then I should have another 100 changed out.
Of course, any time I update a page I also reduce the code size if possible, but it normally only amounts to a few hundred bytes, maybe 1k. However, I add text at the same time, so the overall page size stays about the same most times.
I have started to remove the code for the Amazon banners on most of the pages I've updated; that code also amounts to 200 bytes to 1k of code.
I don't make any money from those Amazon banners, and the code just slows down the page loading time. Not that anyone could tell the difference between loading a 12k or a 13k file, but I've tried for a year now and those banners just don't bring in any money.