Saturday, March 19, 2011
What is Google's Farmer Update
The Farmer update, also called the Panda update, was one of the few hundred algorithm changes Google makes to its search results each year. The Farmer update got its name because the change was aimed at removing content farms from the search results.
A content farm is a web site set up just to trap keyword searches while offering little or no real content: maybe hundreds or thousands of articles designed to bring in search traffic, usually poorly written and stuffed with keywords.
The Farmer algorithm update went live in the US on February 24th, with an update to non-US searches to follow at some later, as yet unannounced, date. So if you have a site in the US and saw a decline in search traffic from Google starting on or after the 24th of last month, you could be affected by the Farmer update.
One of the few comments Google has made regarding the update is that a weak page on your site can affect the entire site [which is somewhat new]. In the past it seemed that a weak page with little or no content would simply never be indexed by Google, but never seemed to drag down other pages on the web site. Anyway the forums are full of people giving advice, but the posts are just guesses.
So anyway I wanted to detail a few changes that I made to the web site, not that I know I've been affected.
1. Deleted a dozen orphan pages from the server. The orphan pages had been out there for years and were misspelled page addresses or abandoned pages that were no longer used. At one point they had newsgroups or whatever linking to them, so I saw no reason to show a 404 page-not-found. But after a few years, I'm sure no one is reading those old newsgroup posts and clicking in.
2. Redirected another dozen pages to something else. These pages were still being used, but had more weight years ago. As time has passed they slowly lost content due to links going bad, or no interest from me in trying to fix them; however 10 years ago they were valid pages.
3. I removed about a dozen pages of advice on using AdSense; their page views had been falling over the last few years and they had no relation to an engineering topic.
4. I removed a few ads on some of the pages, only because the content had decreased over time as links on the pages stopped working. Removing an ad was part of a normal page review and had nothing to do with the algorithm update. However some people in the forums are saying too many ads could affect a page's ranking, along with ads above the fold ~ I think that's garbage.
5. I fixed a number of bad links on the site, but I do that all the time. Outgoing links are always going bad.
6. I removed dozens of blog postings from the early years of this blog. I've done this from time to time in the past. Sometimes I'll write a blog post just to keep the blog going, but in the big picture the post is of no interest to anyone but the few people following the blog. But if Google is going to rank the entire blog based on these thin posts, then it's better just to remove them. This blog had 534 posts in December, but now has 512.
7. I've been optimizing HTML code on the site to reduce loading time, but I've been doing that for a year now.
8. Moving pic files off Google Picasa and storing them on the server, which should help with page loading; I've also been doing that for a year now.
9. I removed the Google Toolbar because it seemed to be tracking me more than the site visitors. The toolbar measures page loading time and was showing wide variations when the load times should have been more even. I figured the toolbar was tracking my own page loads as I brought up slow pages to optimize. In other words, I would open a 'slow' page to fix it, and in doing so show Google a slow page. So Google was seeing more slow pages because of my toolbar usage than it normally would.
10. Updating pages with the new, faster Google Analytics tracking code. I was doing that anyway, and again to decrease page loading time.
11. Updating or enhancing page content ~ I do that anyway.
12. Deleted dozens of outgoing links that were valid but seemed outdated ~ maybe the target page had a copyright of 2004. Normally the page had too many links anyway, so no harm done.
13. I added a few more nofollow tags to some outgoing links, again because their pages seemed outdated.
I've been making these changes both before and after the 24th of last month and I've seen no difference in page views. Any decrease in page views that has occurred has really been limited to a few high page-view pages; other pages are up or down as normal.
Comparing Feb to March, the site is down 10%.
Comparing Jan to March, the site is down 0%.
However March is normally higher than either of the previous months.
At the same time it could be that my site was not hit by the algorithm change, and some other sites are just doing better in the SERPs now ~ there's no way to tell. In any event these changes are normal Search Engine Optimization [SEO] type changes.
Tuesday, February 08, 2011
Increasing Blog Traffic
First you would need to write about what people are either searching for or wanting to read about.
Although I started writing this blog in 2005, I've only just started tracking visitors. Google added a 'stats' section in July of 2010. Before that, I would only track the number of referrals being sent to my Engineering Data Base. So since 7/2010 this blog has had 15,334 pageviews, according to the data in the blog Dashboard.
Over the last 24 hours the blog received hits for postings made in 2005, '06, '07 and 2010, which indicates that older pages still bring in traffic. That's not to say stop writing, because one of the highest page-view posts is a review I wrote just 3 months ago. Then again, the counter itself only started working about 6 months ago.
So sometimes I'll blog about a new product I just purchased, or review a software package I just tried. It seems that a blog posting reviewing a product brings in a lot of visits; I guess people want to find out if they should buy it as well. Many of my postings center around some electrical engineering topic, standard, or interface bus, which all bring in traffic [but to a lesser degree].
Adding a graphic might also help increase your blog traffic, as the blog may then show up in a pic search, in addition to a web or blog-only search.
The attached photo of a graphic equalizer has nothing to do with this posting topic, but it serves to introduce the links I'm adding.
Some posts are more self-serving, as when I post about a new page addition to the web site; however some of those posts might also serve to introduce a new product as well.
For example, a week ago I added a few new pages to the engineering site, which have yet to be spidered. So what better way to get them found than by adding them to a blog posting [and they relate to the audio pic].
A Passive Audio Bass Control Circuit.
A Passive Treble Control Schematic.
A Midrange Audio Control Example.
In fact these pages have been indexed, because they come up in an on-site search, but they have yet to receive any page views. Of course that could just be that no one is interested in a simple audio control schematic.
This blog sends about 100 visitors a month over to the engineering site as referring traffic. That's compared to the 2,000-plus page views the blog receives. However in the past I was promoting the web site a bit more at the bottom of this blog than I am now. Back in 2008 the blog would refer 300 web visitors per month.
Posted by Leroy at 12:24 PM 0 comments
Friday, January 28, 2011
Unique Visits to the Web Site
The smaller insert also shows unique web visits, but with the years overlaid on one another as a year-to-year comparison.
The variations from month to month indicate some fluctuation in the incoming traffic. However the site receives several thousand visits per day, so many of those dips and peaks represent just one day of traffic, and some may just be the site counter going off-line.
Some months perform better or worse than others only due to the number of days in the month or holidays occurring during the month. A work day brings in several thousand visits, while a weekend day may only bring in 50% of that. A holiday only drops the visits to about 75% of normal, mainly because 45% of the visits come from other countries.
This posting allows the blog to serve the graphic instead of my server, and the web site points to this page.
Click the graphic to see the larger view.
Although it's not that relevant any longer, the large dip in visits during 2005 was due to changing the web counter mid-year.
Posted by Leroy at 9:31 PM 0 comments
Tuesday, January 18, 2011
Year End Site Performance
The graph shows the number of visits to the site. The number of visits counts both first-time and returning visits from the same person, as opposed to unique visits, where a returning visitor is not counted. The graph compares month-to-month visits and year-to-year visits. Except for December, this year outperformed every month in all the previous years.
Around mid-year I finished implementing the new Google search bar. They say it's better [?], but the code is smaller, so I changed out the code. In 2008 I changed out all the Google banners for the new code too, which was also smaller.
Any time I update a page I look for white space I can remove to reduce page loading time. White space here means 'space' characters in the HTML code that serve no function. Normally I can reduce a 10k HTML file by 200 bytes, and a 50k page by 2k, by deleting [not visible] white space. The downside is that I also add new content when I update a page, so in many cases removing the white space evens out with the new text.
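As a sketch of the idea, the two snippets below render identically; the second just drops the indentation and line breaks the browser ignores [the page name is made up]:

```html
<!-- Before: indented markup full of white space the browser never displays -->
<table>
    <tr>
        <td>
            <a href="usb-pinout.htm">USB Pinout</a>
        </td>
    </tr>
</table>
<!-- After: the same markup with the non-functional white space stripped out -->
<table><tr><td><a href="usb-pinout.htm">USB Pinout</a></td></tr></table>
```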
So it's been seven months now of optimizing the web site to improve Google's Site Performance data. Now, 7 months later, the site's performance is 2.8 seconds [average load time], which happens to be what the loading time was back in June. However the running average now stays closer to 2.8 than before, when it hovered in the 3-second range.
Two weeks ago I started trading out the Google Analytics tracking code for the new version, which they also say is faster. So I hope to see a speed improvement in a few more weeks as I get more pages running the new code. Currently just over 300 pages are running the new Analytics code, but the Site Performance data is weighted so heavily trafficked pages count more. I update pages based on need, not traffic.
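For reference, the newer asynchronous snippet looks roughly like this [UA-XXXXX-X stands in for a real account ID]; the speed gain comes from loading ga.js without blocking the rest of the page:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // placeholder, not a real account ID
  _gaq.push(['_trackPageview']);

  (function() {
    // Build the script element and load ga.js asynchronously,
    // so the tracker doesn't hold up page rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```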
Anyway I'm always adding new content, regardless of any SEO things done to the site.
Posted by Leroy at 7:31 PM 0 comments
Labels: Analytics
Friday, December 24, 2010
ReadyBoost Compatibility with Windows 7
ReadyBoost is a procedure for using a USB flash drive to augment system memory in a Windows PC. ReadyBoost was introduced with Windows Vista and continues with Windows 7.
I've already written a Google Knol giving a basic introduction to ReadyBoost on Vista, so this posting really relates to Windows 7. Read more on the USB Interface, part of an Engineering Web Site.
Basically ReadyBoost lets a user plug a USB thumb drive [or other flash memory card] into a USB slot and set that drive to act like available system memory [boosting main memory]. I did try using ReadyBoost with Windows Vista on my previous computer but never really noticed any change. However three weeks ago I purchased a new computer, which I'd like to check out.
The new PC is a Dell Studio XPS 8100, with an Intel Core i7-870 processor running at 2.93GHz, 8GB of DDR3 SDRAM system memory and 8MB cache. Like most personal computers, the Dell uses revision 2.0 of the USB standard. Although there are USB 3.0 flash drives, there do not seem to be any computers supporting revision 3.0 of the USB spec yet.
Anyway I figured I would investigate how Windows 7 handles ReadyBoost. First off, I see that Windows 7 does support ReadyBoost, but recommends adding a flash drive with a minimum of twice the system memory [16GB in my case].
Doing a quick check at one retailer, I find that a 16GB USB thumb drive costs from $22.99 to $79.99. By comparison, four 4GB DDR3 memory sticks [16GB] cost $290. So main memory might cost more than 3 times as much as the same-size thumb drive, but it also operates much faster than the USB drive. So USB and ReadyBoost are a quick and cheap way to enhance system operation. Not that it matters; I don't need to purchase more memory for a PC I just received, but I would like to see how ReadyBoost works. I only use 24% of the PC's memory now anyway [indicated by some gadget on the desktop].
The price range of thumb drives brings up the next issue Windows 7 has with USB drives. Microsoft indicates that ReadyBoost will only work with "fast" flash memory, and will not function with "slow" flash memory. The help file goes on to say that some flash memory devices may contain a combination of both. So USB thumb drives using slow flash memory will not work at all, while 'faster' USB drives may not be able to use all their available memory [the 'slow' portion]. The definition of, or difference between, fast flash memory and slow flash memory eludes me. The difference between fast and slow may explain the wide range in prices between USB drives; that being the $22.99 drive is slow memory and the $79.99 one fast memory.
Most of the USB drives I just looked at did not indicate any transfer speed, although I do list speeds from one particular manufacturer below. Having no transfer speed data, and no knowledge of what a 'fast' transfer speed is anyway, indicates that I should only purchase a USB thumb drive marked as ReadyBoost compatible, and maybe even Windows 7 compatible [in case this fast/slow issue is new to Windows 7].
Of the four different 16GB thumb drives available from one particular store, only one indicates that it supports ReadyBoost. Others do indicate that they perform fast transfers, but that seems more of a comparison to USB version 1 than any indication of true transfer speed. Looks like I'll be ordering a 16GB USB 2.0 Flash Drive from PNY in the next few days to test out ReadyBoost.
USB 2.0 Transfer Speeds:
Read speeds: 10MB/sec., 24MB/sec., 25MB/sec., 30MB/sec.
Write speeds: 5MB/sec., 8MB/sec., 10MB/sec., 20MB/sec.
Note that the USB standard does use terms like Low-speed, Full-speed and High-speed, but how many people read a technical specification? Regardless, how do the terms used in the USB spec relate to the fast and slow used by Microsoft?
I'll update the post when I receive the new thumb drive; let's hope I don't lose it at the rate I'm losing all my other thumb drives.
Posted by Leroy at 5:24 PM 0 comments
Labels: Computer, Hardware, Memory, Product, ReadyBoost, USB, Windows
Saturday, December 18, 2010
What Video Card Should I Get
As of 2010 there are 4 major video interfaces used on computers, PC monitors, televisions, or some combination of these. The oldest is the VGA interface, which started to appear on PCs in 1987.
The VGA interface has been upgraded a number of times and is now called the SVGA interface, although everyone still uses the generic term VGA to describe it [see note]. Even with its age, and the fact that it's the only analog interface left on a PC, it can still be found on the latest monitors and TVs for compatibility.
The search trend below tells the story: search interest in the term VGA [orange] has only dropped off slightly in the last 6 years. Now, there is no way to tell because the data is normalized, but even that slight drop could represent a large reduction in the number of searches.
The DVI connector is the second-oldest video interface. DVI was introduced in 1999 as a replacement for the analog VGA interface and was capable of both analog and digital operation. However the introduction of HDMI has made adoption fall off in the last few years. In fact the organization that developed the DVI specification disbanded in 2006. The graph shows a drop in DVI interest [red], falling at about the same rate as the VGA interface. But at this point I think everybody knows what a VGA interface is; it's been twenty years.
The two interfaces showing an increase in searches are the HDMI output and the DisplayPort interface. From the graph, interest in HDMI appears to be growing faster than interest in DVI or VGA is falling off; but that makes sense with so many fielded systems using either VGA or HDMI. The large spikes represent increased searches, probably due to news articles or new products being introduced.
The blue line down at the bottom of the graph represents Google searches for the term DisplayPort. It may appear that there is no interest in the new DisplayPort video interface, but that is only in comparison to the vast number of searches being conducted for the other video interface types. If you zoom in, or re-normalize the graph without the other video interfaces, the increase becomes apparent [shown below].
Interest in DisplayPort has doubled over the last few years, and is up fourfold since its release.
Many times when an interface standard is released it takes another year before products begin to hit the market. In the case of a video standard you need at least two different manufacturers producing products: one selling a video card and a different manufacturer selling a computer monitor. Then there's the issue of demand; a company making PC monitors may not want to go to the expense of adding a new interface when the mating interface is not yet available on any video card. So in some cases it may take a few years for a new video standard to take hold.
For example, DisplayPort may have been released in 2007, but video cards with a DisplayPort interface did not appear until 2008, followed sometime later by computer monitors.
Anyway, the best video monitor interface to use is DisplayPort, or HDMI on a TV. They're both digital interfaces, but HDMI is more common than DisplayPort.
Note: the search trend for the term VGA shows a dramatic drop in search usage from 2004 to 2007 and a steady decline afterward. But when compared to the term SVGA, SVGA does not even register on the same graph. In other words, the term VGA is being searched for 20 to 25 times more often than the term SVGA.
In general I keep my PCs for around two years and monitors for about four years. The two AOC 2436 monitors I'm currently using [AOC 2436-vh review] have both HDMI and VGA interfaces. So they don't have the newest DisplayPort interface or the outdated DVI interface, but a good mix of both analog and digital connectors. My Dell SX8100 computer uses an ATI Radeon HD 5770 video card with dual-link DVI, DisplayPort and HDMI outputs. Note the lack of a VGA interface.
Posted by Leroy at 2:17 PM 0 comments
Tuesday, December 14, 2010
Browser Compatibility
[Pie chart: 2010 Browser Usage]
Primarily I would be looking for layout changes between the different browsers: small changes in spacing or page breaks. As a rule the pages always worked across the different programs. But all that was a long time ago; I stopped checking page issues half a dozen years ago. Back then I would load all the different web browsers on the PC and check pages for compatibility.
Now I just run the three major browsers: Internet Explorer, Firefox and Chrome. However I don't run them looking for problems; I run them with each one opening different windows and/or passwords.
Anyway, over the last few months I've noticed page layout issues with my custom 404 page [the page that displays on a mistyped web address]. I sort of looked at it but never really did anything about it, because I couldn't find the problem. Now I realize that the layout problem was only showing up in Google Chrome, which I hadn't noticed before. With three different browsers it's kind of random which pages show up in which browser, depending on what I'm working on.
So I have three different styles of pages, and two of those styles use the HTML 'div' tag to put a column on either the right or left side of the page. The pages that use the div tag to put a column on the right side of the page seem to work in all three browsers. However the pages that use a div to add a left-hand column don't seem to work in Google Chrome. Any text or graphic that uses a 'center' tag is forced out of the column and below the content of the main section. So left-margin text remains in the div column, while a centered graphic is pushed out into the adjacent column.
Lucky for me, the pages with the issue only account for 1% of the page views, or around 37,000 pageviews. Ten percent of those are from people using Google Chrome, viewing only 370 pages in these sections. The 14 pages that account for most of the issue [0.67%] were just fixed, as well as some of the others. The thing is, if you never scrolled down to the bottom of the page, the layout problem might never have been noticed.
Oh, the fix is to just remove the center tag around the graphics in the left column. That was the quick fix; there must be an HTML coding issue with the div tags, but the point is to get them working right now....
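As a rough sketch of the kind of markup involved [the styling and file names are made up, not my actual pages], the quick fix amounts to dropping the center tag inside the left-hand column:

```html
<!-- Layout that misbehaved in Chrome: the centered graphic gets pushed
     out of the left column and below the main content. -->
<div style="float: left; width: 160px;">
  Left column text stays put...
  <center><img src="sidebar-graphic.gif" alt="sidebar graphic"></center>
</div>
<div style="margin-left: 170px;">Main page content.</div>

<!-- Quick fix: remove the center tag around the graphic. -->
<div style="float: left; width: 160px;">
  Left column text...
  <img src="sidebar-graphic.gif" alt="sidebar graphic">
</div>
<div style="margin-left: 170px;">Main page content.</div>
```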
So the advice would be to check your pages in all the browsers.
The pie chart represents 4,221,889 pageviews year-to-date for the web site.
239,690 of those page views were from people using Google Chrome.
Posted by Leroy at 2:01 PM 3 comments
Labels: Chrome
Tuesday, December 07, 2010
Google Sites Free Web Page
[Photo: V22 Osprey]
Each sub-page has the address of the upper-level pages in it, so the URLs get pretty long. You really need to plan ahead when generating new pages; however, I don't know what page I'm going to add at any given time, so some of the page addresses are really long. I guess it doesn't matter whether a person comes to the site via a search engine or navigates via the site map.
The graphic is part of a report from Google Analytics, like many of the graphs in this blog, comparing last year's hits or visits to this year's. This year the site has had 14,532 visits and 20,903 pageviews over 165 different URLs. However only 133 pages are included in Google's index.
Many of the pages really just hold some large graphic that didn't fit on the main site, but in most cases there is a small amount of text. I don't add pages as often as on the other site, but much of the increase in visits is due to new pages, or maybe new pictures.
Posted by Leroy at 12:28 PM 1 comment
Saturday, December 04, 2010
Increasing Page Views
I don't really track the number, but I add around two or three new pages to the Engineering Site each month.
Of course I would hope that the new page(s) would generate more incoming visitors, but there's little chance of that. Remember it takes three months just to get a Page Rank and start to show up in the search engine results pages [SERPs]. Add maybe another month before that for Google to spider and read the page. In other words, any page I generated in the last half of the year had no chance of receiving any page views.
So the table below compares pageviews from this year to last year. It might seem like there's not much of a change, but most pages increased in page views. However many pages still hover around the first three rows. So many of the pages I added only received a few page views, with almost no page views during their first three months.
However there's another way to look at the situation. A new page may bring in a new visitor, who might then revisit the site.
The increase in pages in the first row comes from new pages. Most of the rest of the pages are stuck where they are.
There are 50 pages that make up the 6-year-old Index of Semiconductor Manufacturers section that refuse to get a Page Rank, no matter how many times I link to them. Most of the 115 pages in the 3-year-old Transistor Derating Curves section have also never received a Page Rank, regardless of what I've done.
I just updated a 3-year-old page that only received 8 page views this year. I pulled a 90k pic file off Picasa, reduced it to 50k and stored it on my server. I deleted another Picasa graphic altogether because it could be reached by an on-page link. I also changed a few important keywords. The point was not to get more page views; its transistor part number [the page topic] must be obsolete, so the page will never receive many page views.
Instead, what I did was make the page look less like the other 115 similar pages in that section, and make it load faster [for Google] to keep them happy. Files that are stored external to the site require a DNS look-up, which Google thinks slows down a site ~ part of Google's rating system measures page loading speed. The 50k file on my server should not hurt, because the page is never viewed except by the web spiders. Having this page appear different may also help the pages that do get page views, because one less page looks the same.
Anyway, if a page received 1,000 page views last year it would have to see over a 900% increase to move up to the next row. So understandably most pages stay in the same row they were found in last year, even if they had a 100% increase in views.
The site is currently receiving 7.21% more pageviews than last year, with 3,978,019 pageviews so far. So the site has already passed the number of page views received last year. The table above is pageviews; the chart to the left is number of visits.
The program that generates the data is Google Analytics, and the report is Top Content, ordered from highest pageviews down [which I then just count]. Using Google Analytics requires a small amount of JavaScript code on each of your web pages.
Many of the pages in my report with only one page view for the year are really just misspelled page addresses. The viewer just sees the site's 404 page-not-found, but the report records the address that the server received. Many types of wrong addresses can be filtered out, but many can't; sometimes I can't even tell if it's a real page address without pulling up the page. Opening the page gives it another count, which could make it look like a real page address when I check it later. So the first row of data is an estimate, because I started to get down into the misspelled pages in the report.
This is a long post just to say that adding new pages, or worrying about getting those pages included in the Google index, may not help the overall web site. You still need to come up with content that people are looking for.
Posted by Leroy at 7:39 AM 2 comments
Friday, December 03, 2010
Internet Browser Usage
With the end of the year here, I figured I would post which browsers visitors use on the engineering data base. Some people watch these numbers, and even small changes make it into one of the technical papers. But like me, once in a while they probably just need something to post about.
I couldn't care less which browser is doing better than another, because I use all three at once. I have some pages open in Firefox, others in Explorer and still others in Google Chrome. I just wish they would put the buttons and options in the same place in each of the programs.
I don't use either Opera or Safari, but years ago I did try Opera for a while. But like I said, as soon as the PC boots up, the other three browsers open.... once. The data for 2010 accounts for 3,094,865 unique visits, so it's a big data sample. Counting the return visitors would make the data sample larger, but what's the point; it would just make the browsers on the increase look better and the ones on the decline look worse.
There are other browsers being used to access the web site, but their usage is all below 1% ~ below 20,000 web visits, or two days of web visits. The site receives about 9,000 hits per day.
Posted by Leroy at 12:07 PM 1 comment
Thursday, December 02, 2010
URLs in Web Index
This is the third revision or update to the posting of the web stats showing URLs in the web index.
The stats relate to this Engineering Web Portal.
The table of data below shows the number of URLs, or pages, in Google's web index. First of all, I don't like to generate new site maps that often because of the bandwidth requirements of the sitemap program. That is, the program I run needs to check every page I have on the server, so it's a big hit to my server. But if you look at the data and the increasing number of URLs in the web index, you will notice that generating a sitemap over and over again is not required.
However I do recommend running a new sitemap if you have a large number of new pages; a sketch of what a sitemap file looks like follows the table below. The number of URLs in the web index always increases after a sitemap is generated. However you'll notice that in some cases the number of indexed pages falls off a few days later. I assume the program reading the sitemap finds the new pages and adds them to the web index, but sometime later another computer algorithm determines the page should not be indexed and it drops off the index.
01/21/11 = 1,938 URLs in web index. [Site-map loaded]
11/30/10 = 1,875 URLs in web index.
11/04/10 = 1,854 URLs in web index.
10/23/10 = 1,805 URLs in web index.
07/24/10 = 1,779 indexed pages. [Site-map loaded]
07/01/10 = 1,535 indexed pages. [Site-map loaded]
06/30/10 = 1,504 URLs in web index.
06/24/10 = 1,455 URLs in web index.
06/18/10 = 1,426 URLs in web index.
05/31/10 = 1,400 URLs in web index.
05/22/10 = 1,394 URLs in web index.
04/07/10 = 1,309 URLs in web index.
03/27/10 = 1,322 indexed pages. [Site-map loaded]
12/19/09 = 1,481 indexed pages. [Site-map loaded]
12/13/08 = 1,318 indexed pages. [Site-map loaded]
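For anyone who hasn't seen one, the sitemap behind those '[Site-map loaded]' entries is just an XML file listing page addresses; a minimal sketch follows [the address is made up, not one of my pages]:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one url entry per page on the site -->
  <url>
    <loc>http://www.example.com/logic/74l121-multivibrator.htm</loc>
    <lastmod>2010-11-30</lastmod>
  </url>
</urlset>
```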
There are two reasons why the number of URLs in the web index keeps increasing. First, I generate at least two or three new pages every month, sometimes many more. So you would expect to see an increasing number of pages included in the web index as those new pages are found. Secondly, I'm always updating preexisting pages already residing on the web site. Over time the updates may allow a page to be indexed, usually because it has more content [text]. Many times I'll add a new page topic that is little more than a graphic and a small description. But over time I get back to updating the page to include more data and so on. Once the page gets 'the required amount' of content, Google includes the URL in the web index.
How soon your new page shows up in the web index, without a sitemap, depends on how often your pages are crawled. The chart shows that my site has about 500 pages crawled every day, although you can't tell from that how many pages are re-crawled. So for my site any new page added is found within a few days, 15 at the most I assume ~ so I don't really need a sitemap.
Although it has little to do with the number of URLs in the web index, over and over I see new pages not receiving any incoming hits for months. That is, even as a new page gets included in the web index, it takes three months for the page to really start getting any visits. Webmasters call that the Google sandbox. So don't think that just because a page gets indexed it will bring in a larger number of hits the next day. Also, I have groups or sections of pages that have dropped off the index, for years now. Currently I have 1,946 URLs submitted and 1,875 URLs included. That three-month "down-time" is also the same amount of time it takes a page to get a Google Page Rank [if it ever gets one].
2N3485 Transistor Derating. 3-year-old page with zero Page Rank.
Semiconductor Manufacturers 'L'. 5-year-old page with zero Page Rank.
Electron Tube Classification. 2-week-old page with zero Page Rank.
74L121 Monostable Multivibrator IC. 1-week-old page with zero Page Rank.
Note: always run your sitemap generator program at night, or when you expect low incoming traffic. Remember that the program is talking to your server, in direct competition with your visitors.
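On a typical Linux host that could be a cron entry; a sketch, assuming the generator lives at /usr/local/bin/sitemap_gen [a made-up path]:

```
# Run the sitemap generator Sundays at 3:30 AM, when incoming traffic is low.
30 3 * * 0 /usr/local/bin/sitemap_gen
```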
Also, I have no idea how to determine which pages of my site are not included in the web index; otherwise I might fix those pages.
The number of URLs in the web index changes every week or so; the numbers listed above are just the ones I wrote down.
Click the title to read the comments if you didn't come directly to this topic; the blog compresses the comments section unless you're on that particular page. The comments are updates to the blog post.
Posted by Leroy at 7:57 AM 9 comments
Wednesday, December 01, 2010
Mobile Device Usage Statistics
As part of updating the FAQ for the Engineering Site, I'm inserting a posting about which mobile devices are accessing the web site. I haven't checked this data since the last post a year ago, and it looks like there's been a change. Incoming visits from mobile devices have doubled over the last year. However doubling a small number is still a small number. The Frequently Asked Questions pages don't get many hits, but I still like to keep the information updated. In order to save server bandwidth I redirect some of the FAQ data out to this blog, but it's still an interesting data point for blog readers.
I tried to check for a percentage change year-to-year in the mobile phones using the site, but for whatever reason Google didn't have it [or said it was estimated]. So the only thing I'll show is this year's data and which phones are being used the most, with no year-to-year change. Of course most of the visits [46%] are coming from an iPhone, which is understandable.
This graph shows visits, which are increasing. However all the other metrics are worse than for normal computer visitors, except percent new visits. Meaning most people coming in from mobile are new to the site, but they leave faster and view fewer pages than normal computer 'people'.
Posted by Leroy at 10:09 PM 0 comments
Friday, November 26, 2010
How to get more incoming links
I'm looking at the Crawl Errors report for other web sites pointing to my Engineering Site. There must be 100 external links pointing to my site that are misspelled [some may have */ or other stray characters]. The incoming links point to about 40 different pages on my site. Many of the links appear to be copies of each other, as in 6 different web sites using the same bad web address. Some of the incoming links are from web forums or blogs, which would explain why more than one page uses the same bad link. I still get the 404-page hit regardless, but I'll bet most people just click away.
But why would different web pages use invalid links? It just makes their page look inaccurate. In some cases it appears that a page is using my data but not really giving me proper credit, with an invalid link. You would think they would check their links.
Anyway it seems like Google is increasing their web directory, although I can't tell if these pages show up in a Google search. One forum posting was made in 2007, but Google indicates it was found in November of 2010 [Google uses the term Discovery Date]. Same thing for another page link: two incoming links were discovered in 2007, but five more bad links [copies] were found in 2010.
Posted by Leroy at 11:40 PM 1 comment
Wednesday, November 17, 2010
AOC 2436vh LCD Monitor Review
I purchased a new Dell PC a few weeks ago, which I won't review for a few more days. However I do want to review the AOC monitor I purchased at the same time. The AOC was a separate purchase that did not come with the Dell.
The AOC monitor is a 24-inch widescreen. It seems like a nice display, but the problem I have with it is the displayed image size. Currently I have almost a 1-inch black border around the display, which means I really have a 22-inch monitor and not the 24-inch I purchased. The border is blank, as in no information is displayed there, so it's unusable.
I've tried using the soft keys on the display to correct the problem, but either the monitor will not fill in the display or I can't figure out the correct key sequence.
AOC has a video [animated] help sequence showing the button sequence for a similar monitor, but I just couldn't follow it. To get to the soft-screen buttons I need, I have to turn other functions off [I think]. But if I can't figure out how to change the screen size in 20 minutes, then my opinion is they got it wrong. For whatever reason the AOC site does not have any help files under the 2436vh selection.
Now I could have written a review at Best Buy, maybe to help someone else thinking about buying this monitor, but I didn't want to create an account. Nor did I want tons of e-mail from Best Buy for the rest of my life. There was a comment in their review section that mentioned the soft keys, but it was only one out of 5 reviews. There are also a few requests for help out on the web with the same border issue, but as usual there are no answers, just more comments.
I would not recommend buying the AOC 2436vh, because I can't figure out how to adjust the screen size. If an electrical engineer can't figure out how to adjust the screen size, I wouldn't expect a normal PC user to solve the problem either.
I'm using:
Operating System: Windows 7 Home Premium.
Video Card: ATI Radeon HD 5770, with 1 GB GDDR5 video memory [HDMI connection]
Screen Resolution: 1920 x 1080 [which is recommended], Landscape
I also have another 18-inch monitor connected to the same video card [at a different resolution], but that shouldn't matter. I plan to replace the 18-inch too, but not with an AOC monitor. I'm glad I only purchased one monitor with the computer. Maybe next week I'll go out and get a replacement monitor, after I do a bit more research. Also see Companies making PC Monitors.
At least I got a blog post out of this. I'll append an update when I get the next monitor. I'll look around for a factory reset also.
About three hours after I wrote this post I changed the monitor resolution and the display filled the screen. I had tried that before, but only the next lower resolution, which the monitor indicated was not supported; this last time I took it down one more notch. Two hours after I got the screen resolution right, I received an e-mail from AOC asking if I had tried to reduce the screen resolution, which is good, because I had only posted a question to their support a few hours before. Regardless, I still don't like how they do their soft-screen controls.
So the reason the 2436vh does not fill the screen over HDMI is the computer's resolution setting. I'm using HDMI for this monitor, and all the search phrases I've seen include the term HDMI [including searches finding this page].
The monitor defaults to the highest resolution the display will handle, which in turn leads to dead space around the visible display. The highest resolution is 1920 x 1080. However the resolution I'm using now is 1680 x 1050, but I'm also using a screen with an aspect ratio of 16:9. So I can't tell if it's a monitor issue, an aspect ratio issue or an HDMI issue.
Thursday, October 14, 2010
CPU Heat Sink Fan Stopped Spinning
So I'm working on the computer last night and the PC just shuts down. At first I figured I had a power glitch, but the TV was still on. When I turned it back on, the BIOS indicated my fan was not turning. How did I not notice that? The PC sounds like a plane taking off.
Anyway, I powered off and back on to get the fan to start again, but that didn't work. So I turned the PC on its side and pulled the cover, flicked the fan a few times and tried the power on/off again. I reconnected the fan cable thinking it might be a loose connection, but no luck.
This is the CPU fan that connects to a brick of a heat sink sitting on the CPU. It took a few tries before I could disconnect the fan from the heat sink, but I did get it off, because other than banging on the fan I really couldn't do anything. It's not like I could somehow fix a fan, but I should have used that time to write down the model number.
So the next day, the fan is spinning but I didn't do anything. The internal case temperature is 92.8F and the room temperature is 80.1F. My office is always hotter than the rest of the house: too much electronics in the room, and far away from the AC [which is set at 76F]. The PC has been on for about 15 minutes now, after doing a quick backup to a USB thumb drive, as the last backup was more than three weeks ago.
The weird thing is that I was just looking online at new PCs yesterday; maybe it is time. However, other than moving from DDR2 to DDR3, I couldn't find a good reason to upgrade. Now maybe I should start looking for a fan, and at the same time get another 2GB of memory ~ why not, if I'm popping the case anyway.
Well, I didn't write down the fan type; I just wanted the time to do a backup. But I did note this is an AMD Socket 939 PC. I don't really want the heat-sink/fan combo, I'll just take the fan. Not because of the extra $40, but because I don't want to mess around with the CPU-to-heat-sink connection point. I haven't built a PC from scratch in 15 years. Why build one when I can just pick all the components online and have it delivered?
Posted by Leroy at 10:13 AM 1 comment
Tuesday, October 05, 2010
Domain Name Expired
So my web site went off-line today from around 8am to 5pm. At first I figured it was something I had done, because I was adding a few more passwords late last night. But it turns out that my domain name had expired.
Now the strange thing is that the domain was one of the first things I checked when I could not get my hosting company to answer their phone: Whois indicated it would expire next year. However I did find an e-mail from my hosting company last year saying it would expire this year ~ so I'm not sure what is up.
What I'm not happy about is that it took about 8 hours for my hosting guy to answer my trouble ticket; their phone was never answered over the entire 8 hours either. I never even received a warning e-mail that I was about to have an issue, which turned out to be an old credit card on file with them.
Their home page indicates that because of all the great stuff they offer and their low price, they do not offer support. What? I've been with these guys for years and nothing is new, except for their lack of support. So it's time to start getting ready to find a new hosting company.
Posted by Leroy at 6:15 PM 0 comments
Wednesday, September 01, 2010
Disallow access to server stat files
So after 10 years on the web, the spiders have found my server stat files, in this case AWStats. This is the program output I use to see data regarding site visitors: number of visits, page views, visit duration, keywords and so on. The program stores the data in large text [txt] files, but outputs the data to me in the form of charts and graphs. In fact the txt files are very large, multi-megabyte files; the graphs are somewhat smaller and more readable.
I would recommend that every webmaster add a 'Disallow' line to the robots.txt file to stop the web spiders from reading your stat files. In my case the line looks like this: Disallow: /awstats/.
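As a minimal sketch, assuming the stat files live under /awstats/ at the site root, the whole robots.txt entry is:

```
# Keep well-behaved spiders out of the stats directory
User-agent: *
Disallow: /awstats/
```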
The bottom curve is server bandwidth ~ the increase occurs as my stat files started showing up in search results. Because the graph is set up to show number of visitors, the bandwidth is normalized: the 100,000 horizontal bar, which indicates 100,000 visits or page views, indicates 10GB of server bandwidth for the bottom curve. The 200,000 line indicates 20GB of bandwidth.
I only noticed the server text files showing up in search results about a month ago, because normally I don't need to search my own engineering site ~ right, I wrote it. So I only added a 'block' to the robots file a few weeks ago; however I recommend that you block access now even if you don't have an issue. It only takes a few minutes, and if you pay for bandwidth, or get blocked when you exceed it, it may well be worth the time.
If you look back to 2006/2007 you can see that bandwidth tracked unique visits, but by 2008 the gap started to widen. Two years before Google started to rank pages on download speed, I had already started to make the web site more efficient.
Unfortunately the bandwidth data is now meaningless, because it mostly shows these large txt statistic files being downloaded. For example, one 2.52MB txt file was downloaded 230 times last month; a 4.11MB file was downloaded 98 times. That's 328 visitors who used the search bar and received bogus results; are they going to come back for a second visit? Really it's much worse: before I stopped counting, there were 1,206 people last month who thought one of those text files was a valid search return.
Monday, August 30, 2010
Download Speed vs Site Performance
I check Google's data on the Engineering Site all the time; today I rechecked the Site Performance. At least for the last few weeks, the time it takes to download a page has decreased [always a good thing]. The attached graphic shows the amount of time it takes Google to download pages on the site. The numbers don't represent a single page, but the average of some number of pages Googlebot tried to read. I don't really think the data is that accurate, because it never takes 4 seconds to download a page, but I watch the data because that's what Google is looking at to determine site performance.
There is no way to tell why my site is coming in faster. The server could be working faster, Google pulled fewer pages, or Google pulled smaller pages; who knows? Well, I can check how many pages Google crawled per day, and it's about the same between now and mid-June. Google Crawl Stats indicate an average of 673 pages per day over the last few months.
However, I've been working on making the site faster for months without really getting anywhere. The basic problem is that any time I update a page and make the HTML smaller [less code], I also add more data, making the page larger. So I may make the HTML code more efficient [decreasing download times], but I add more human-readable text, increasing the download time.
The current data [below] indicates the average page on the site takes 2.6 seconds to download. This is only important because Google rates sites by download speed, so it's a Search Engine Optimization [SEO] issue.
Related Blog Posts;
Page Download Times [7-15-2010] 3.1 seconds to download
Speed Performance Overview [6-16-2010] 2.8 seconds to download
Web Site Performance [4-22-2010] 3.7 seconds to download
Google now ranks pages by speed [4-14-2010] No speed data.
Website Speed Performance [4-3-2010] 3.7 seconds to load.
Posted by Leroy at 10:50 AM 4 comments
Labels: Bandwidth, Google, Search Engine, SEO
Sunday, August 22, 2010
IE 8 takes forever to shut down
I tried looking up reasons on the web, but no luck. Just a bunch of really old IE crashing issues or out-of-date info.
Anyway, here is how the top browsers are doing on the engineering site: Explorer usage continues to decline, while Firefox seems to have stalled. Both browsers seem to be losing ground to the increase in Chrome usage. Chrome has only been around since late 2008.
1/1/2010 to 8/21/2010
Internet Explorer = 49.08% [800,493 visitors]
Mozilla Firefox = 35.15% [573,338 visitors]
Google Chrome = 8.66% [141,218 visitors]
Posted by Leroy at 9:39 PM 0 comments
Wednesday, August 18, 2010
Link Checker Firefox Extension
I went ahead and hand-checked about 20 pages in the OEM Manufacturers section. Now, I did just check the entire web site the other day, but there's a catch: those automated programs do not always find bad links.
If a site goes out of business and their web site payment comes due, ads can pop up instead of the old engineering information. It makes sense: once the payments run out, the domain registrar takes over and runs their own ads, and the site still appears "good" to the automated software checker [the link still returns a page]. So you still need to hand-check links. Anyway, I only found one 'bad' link that pointed to an ad, out of 20 pages and 2 hours of checking.
I found Link Checker on the web today and downloaded the program (but have not used it yet). What I have used is the Firefox browser extension for Link Checker, which checks links on a per-page basis. Yes, it's another automated program, but now if needed I can check a single page without checking the entire site. Link Checker seemed to work well. I used it to check each of the 'A' pages in the OEM Company section.
While testing it I noticed that it checked all the internal and external links on a page. However, when I went to the next page all my internal links went 'green' right away, telling me that the program was not rechecking them ~ which is good. Link Checker uses color to indicate whether a link is good or bad, so it's somewhat easy to use. I just tried three other pages, but all the links are coming up good. Maybe I'll post a comment in a few days if I do find a bad link with the program, or if I load and run the full .exe version. The only non-green result I've gotten is a yellow link, which I assume means 'no robots allowed' ~ a link that wasn't checked.....
So it's not all about adding new pages; sometimes you have to ensure that the pages already out there are good, or are otherwise not hurting the web site. I should also have said that the Link Checker program is fast, but only as fast as the site it's checking; right, because it has to wait until it gets a response from the external site.
Posted by Leroy at 4:13 PM 0 comments