Friday, December 24, 2010

ReadyBoost Compatibility with Windows 7

ReadyBoost is a method of using a USB flash drive to augment system memory in a Windows PC. ReadyBoost was introduced with Windows Vista, and continues with Windows 7.

I've already written a Google Knol giving a basic introduction to ReadyBoost on Vista, so this posting really relates to Windows 7. Read more on the USB Interface, part of an Engineering Web Site.

Basically, ReadyBoost lets a user plug a USB thumb drive [or other flash memory card] into a USB port and set that drive to act like available system memory [boosting main memory]. I tried using ReadyBoost with Windows Vista on my previous computer but never really noticed any change. However, three weeks ago I purchased a new computer, so I'd like to check ReadyBoost out again.

The new PC is a Dell Studio XPS 8100, with an Intel Core i7-870 processor running at 2.93GHz, 8GB of DDR3 SDRAM system memory and 8MB of cache. Like most personal computers sold today, the Dell uses revision 2.0 of the USB standard. Although there are USB 3.0 flash drives, there do not seem to be many computers supporting revision 3.0 of the USB spec yet.

Anyway, I figured I would investigate how Windows 7 handles ReadyBoost. First off, I see that Windows 7 does support ReadyBoost, but it recommends adding a flash drive with a minimum of twice the system memory [16GB in my case].

Doing a quick check at one retailer, I find that a 16GB USB thumb drive costs from $22.99 to $79.99. By comparison, four 4GB DDR3 memory sticks [16GB] cost $290. So main memory costs more than three times as much as a thumb drive of the same size, but it also operates much faster than the USB drive. That makes USB and ReadyBoost a quick and cheap fix to enhance system operation. Not that it matters; I don't need to purchase more memory for a PC I just received, but I would like to see how ReadyBoost works. I only use 24% of the PC's memory now anyway [according to a gadget on the desktop].

The price range variation in thumb drives brings up the next issue Windows 7 has with USB drives. Microsoft indicates that ReadyBoost will only work with "fast" flash memory, and will not function with "slow" flash memory. The help file goes on to say that some flash memory devices may contain a combination of both. So USB thumb drives using slow flash memory will not work at all, while 'faster' USB drives may not be able to use all of their available memory [the 'slow' portion]. The definition of, or difference between, fast flash memory and slow flash memory eludes me. The fast/slow difference may explain the wide range in prices between USB drives; that being $22.99 is slow memory and $79.99 fast memory.

Most of the USB drives I just looked at did not indicate any transfer speed, although I do list speeds from one particular manufacturer below. Having no transfer speed data, and no knowledge of what a 'fast' transfer speed is anyway, suggests that I should only purchase a USB thumb drive that is marked as ReadyBoost compatible, and maybe even Windows 7 compatible [in case this fast/slow issue is new to Windows 7].

Of the four different 16GB thumb drives available from one particular store, only one indicates that it supports ReadyBoost. Others do claim fast transfers, but that seems more a comparison to USB version 1 than any indication of true transfer speed. Looks like I'll be ordering a 16GB USB 2.0 Flash Drive from PNY in the next few days to test out ReadyBoost.

USB 2.0 Transfer Speeds:
Read speeds: 10MB/s, 24MB/s, 25MB/s, 30MB/s
Write speeds: 5MB/s, 8MB/s, 10MB/s, 20MB/s
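For what it's worth, as I understand it Microsoft's published ReadyBoost floor [for Vista at least, and I assume Windows 7 is similar] is on the order of 2.5MB/s for random 4K reads and 1.75MB/s for random writes ~ random access, not the sequential numbers manufacturers like to quote. Windows 7 also ships with a WinSAT tool that can measure a drive directly. Something like the line below, run from an elevated command prompt, should report random read throughput; the drive letter F: is just my guess at where the thumb drive lands:

```
winsat disk -read -ran -ransize 4096 -drive f
```

If the reported number comes in well above 2.5MB/s the drive should have a shot at passing the ReadyBoost test.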

Note that the USB standard does use terms like Low-speed, Full-speed and High-speed, but how many people read a technical specification? Regardless, how do the terms used in the USB spec relate to the "fast" and "slow" used by Microsoft?

I'll update the post when I receive the new thumb drive; let's hope I don't lose it at the rate I'm losing all my other thumb drives.

Saturday, December 18, 2010

What Video Card Should I Get?

As of 2010 there are 4 major video interfaces used on computers, PC monitors and televisions. The oldest is the VGA interface, which started to appear on PCs in 1987.

The VGA interface has been upgraded a number of times and is now called the SVGA interface, although everyone still uses the generic term VGA to describe it [see note]. Even with its age, and the fact that it's the only analog interface left on a PC, it can still be found on the latest monitors and TVs for compatibility.


The search trend below tells the story: search interest in the term VGA [orange] has dropped off only slightly in the last 6 years. Because the data is normalized there is no way to tell the absolute numbers, but even that slight drop could represent a large reduction in the number of searches.




The DVI connector is the second oldest video interface. DVI was introduced in 1999 as a replacement for the analog VGA interface and is capable of both analog and digital operation. However, the introduction of HDMI has made adoption fall off in the last few years. In fact, the organization that developed the DVI specification disbanded in 2006. The graph shows a drop in DVI interest [red], falling at about the same rate as the VGA interface. But at this point I think everybody knows what a VGA interface is; it's been over twenty years.


The two interfaces showing an increase in searches are the HDMI output and the DisplayPort interface. From the graph, interest in HDMI appears to be growing faster than interest in DVI or VGA is declining; but that makes sense with so many fielded systems using either VGA or HDMI. The large spikes represent increased searches, probably due to news articles or new products being introduced.


The blue line down at the bottom of the graph represents Google searches for the term DisplayPort. It may appear that there is no interest in the new DisplayPort video interface, but that is only in comparison to the vast number of searches being conducted for the other video interfaces. If you zoom in, or re-normalize the graph without the other video interfaces, then the increase becomes apparent [shown below].

Interest in DisplayPort has doubled over the last few years, up fourfold from its release date.

Many times when an interface standard is released it takes another year before products begin to hit the market. In the case of a video standard you need at least two different manufacturers producing products: one selling a video card and another selling a computer monitor. Then there's the issue of demand; a company making PC monitors may not want to go to the expense of designing in a new interface when the mating interface is not yet available on any video card. So in some cases it may take a few years for a new video standard to take hold.

For example, DisplayPort may have been released in 2007, but video cards with a DisplayPort interface may not have appeared until 2008, followed sometime later by computer monitors.

Anyway, the best video monitor interface to use is DisplayPort, or HDMI on a TV. Both are digital interfaces, but HDMI is more common than DisplayPort.

Note: the search trend for the term VGA shows a dramatic drop in search usage from 2004 to 2007 and a steady decline onward. But when compared to the term SVGA, SVGA does not even register on the same graph. In other words, the term VGA is searched for 20 to 25 times more often than the term SVGA.

In general I keep my PC for around two years and the monitors for about four years. The two AOC 2436 monitors I'm currently using [AOC 2436-vh review] have both HDMI and VGA interfaces. So they don't have the newest DisplayPort interface or the out-dated DVI interface, but a good mix of both analog and digital connectors. My Dell SX8100 computer uses an ATI Radeon HD 5770 video card with dual-link DVI, DisplayPort and HDMI outputs. Note the lack of a VGA interface.

Tuesday, December 14, 2010

Browser Compatibility

2010 Browser Usage
For years now I figured that most issues between the different web browsers had stabilized, or at least that my web pages worked fine in all of them. The HTML code used on the Engineering Web Site is pretty basic, nothing special, but there was a time when I checked the pages across the different internet browsers.

Primarily I would be looking for layout changes between the different browsers: small changes in spacing or page breaks. As a rule the pages always worked across the different programs. But all that was a long time ago; I stopped checking for page issues half a dozen years ago. Back then I would load all the different web browsers on the PC and check pages for compatibility.

Now I just run the three major browsers: Internet Explorer, Firefox and Chrome. However I don't run them looking for problems; I run them with each one holding different windows and/or logins.

Anyway, over the last few months I've noticed page layout issues with my custom 404 page [the page that displays on a mistyped web address]. I sort of looked at it but never really did anything about it because I couldn't find the problem. Now I realize that the layout problem was only showing up in Google Chrome, which I hadn't noticed before. With three different browsers it's kind of random which pages show up in which browser, depending on what I'm working on.

So I have three different styles of pages, and two of those styles use the HTML 'div' tag to put a column on either the right or left side of the page. The pages that use a div for a right-hand column seem to work in all three browsers. However, the pages that use a div for a left-hand column don't work in Google Chrome. Any text or graphic wrapped in a 'center' tag is forced out of the column and below the content of the main section. So left-aligned text remains in the div column, while a centered graphic is pushed out into the adjacent column.

Lucky for me, the pages with the issue account for only 1% of the page-views, or around 37,000 pageviews. Ten percent of that are people using Google Chrome, viewing only 3,700 pages in these sections. The 14 pages that account for most of the issue [0.67%] were just fixed, as well as some of the others. The thing is, if you never scrolled down to the bottom of the page, the layout problem might never have been noticed.

Oh, the fix is to just remove the center tag around the graphics in the left column. That was the quick fix; there must be an HTML coding issue with the div tags, but the point is to get them working right now....
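For anyone curious, the markup involved looks roughly like the sketch below. The widths and file names here are invented for illustration; my actual pages differ, but the center-tag-inside-a-left-div pattern is the same:

```html
<!-- Left-hand column div, roughly as on the problem pages -->
<div style="float: left; width: 160px;">
  Left-aligned link text stays put in the column, but...
  <!-- ...Chrome pushed this centered graphic out below the main content -->
  <center><img src="sidebar-graphic.gif" alt="sidebar graphic"></center>
</div>
<div style="margin-left: 170px;">
  Main page content here.
</div>
<!-- The quick fix: drop the center tag and let the graphic sit in the column -->
```

The right-hand-column pages use the same idea with the float reversed, which for whatever reason Chrome renders as expected.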

So the advice would be to check your pages in all the browsers.

The pie chart represents 4,221,889 pageviews year-to-date for the web site.
239,690 of those page views were from people using Google Chrome.

Tuesday, December 07, 2010

Google Sites Free Web Page

V22 Osprey
I don't often blog about how my Google Sites page is doing, but as part of an end-of-year thing I'll go ahead and add a few words. I call the site interfacebus, just like the main site. Google Sites is free, so I can't really have too many complaints, but it's a bit hard to organize the pages by topic.

Each sub-page address includes the addresses of the upper-level pages, so the URLs get pretty long. You really need to plan ahead when generating new pages; however, I don't know what page I'm going to add at any given time. So some of the page addresses are really long. I guess it doesn't matter whether a person comes to the site via a search engine or whether they're navigating the site via the site-map.

The graphic is part of a report from Google Analytics, like many of the graphs in this blog, comparing last year's hits or visits to this year's. This year the site has had 14,532 visits and 20,903 pageviews over 165 different URLs. However, only 133 pages are included in the Google index.

Many of the pages really just hold some large graphic that didn't fit on the main site, but in most cases there is a small amount of text. I don't add pages as often as on the other site, but much of the increase in visits is due to new pages, or maybe new pictures.

Saturday, December 04, 2010

Increasing Page Views

I don't really track the number, but I add around two or three new pages to the Engineering Site each month.

Of course I would hope that the new pages would generate more incoming visitors, but there's little chance of that. Remember it takes three months just to get a Page Rank and start to show up in the Search Engine Results Pages [SERPs]. Maybe another month before that until Google spiders and reads the page. In other words, any page I generated in the last half of the year had little chance of receiving any page views.

So the table below compares pageviews from this year to last year. It might seem like there's not much of a change, but most pages increased in page views. However, many pages still hover in the first three rows. Many of the pages I added received just a few page views, with almost no page views during the first three months.

However, there's another way to look at the situation: a new page may bring in a new visitor, who might then revisit the site.


The increase in pages in the first row is new pages. Most of the rest of the pages are stuck there.
There are 50 pages making up the 6-year-old Index of Semiconductor Manufacturers section that refuse to get a Page Rank, no matter how many times I link to them. Most of the 115 pages in the 3-year-old Transistor Derating Curves section have also never received a Page Rank, regardless of what I've done.

I just updated a 3-year-old page that only received 8 page views this year. I pulled a 90k pic file off Picasa, reduced it to 50k and stored it on my server. I deleted another Picasa graphic altogether because it could be reached by an on-page link. I also changed a few important key words. The point was not to get more page views; its transistor part number [the page topic] must be obsolete, so the page will never receive many page views.

Instead, what I did was make the page look less like the other 115 similar pages in that section, and make it load faster [for Google] to keep them happy. Files stored external to the site require a DNS look-up, which Google thinks slows down a site ~ part of Google's rating system measures page loading speed. The 50k file on my server should not hurt, because the page is never viewed except by the web spiders. Having this page appear different may help the other pages get page-views, because one less page looks the same.

Anyway, if a page received 1,000 page-views last year it would have to see over a 900% increase to move up to the next row. So understandably most pages stay in the same row they were found in last year, even with a 100% increase in views.


The site is currently receiving 7.21% more pageviews than last year, with 3,978,019 pageviews so far. So the site has already passed the number of page views received last year. The table above is pageviews; the chart to the left is number of visits.

The program that generates the data is Google Analytics, and the report is Top Content, ordered from highest pageviews down [which I then just count]. Using Google Analytics requires a small amount of JavaScript code on each of your web pages.
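For anyone who hasn't set it up, the tracking code Google hands out [the current asynchronous ga.js version; older pages may still carry the synchronous one] gets pasted into each page, with UA-XXXXX-X as a placeholder for your own account ID:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);  // replace with your Analytics account ID
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Since the script loads asynchronously, it shouldn't add to the page loading time Google keeps measuring.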

Many of the pages in my report with only one page view for the year are really just misspelled page addresses. The viewer just sees the site's 404 page-not-found, but the report records the address the server received. Many types of wrong addresses can be filtered, but many can't; sometimes I can't even tell if it's a real page address without pulling up the page. Opening the page gives it another count, which could make it look like a real page address when I check later. So the first row of data is an estimate, because I started to get down to the misspelled pages in the report.

This is a long post just to say that adding new pages, or worrying about getting those pages included in the Google index, may not help the overall web site. You still need to come up with content that people are looking for.

Friday, December 03, 2010

Internet Browser Usage

Being the end of the year, I figured I would post which browsers visitors use for the engineering database. Some people watch these numbers, and even small changes make it into the technical papers. But like me, once in a while they probably just need something to post about.

I couldn't care less which browser is doing better than another, because I use all three at once. I have some pages open in Firefox, others in Explorer and still others in Google Chrome. I just wish they would put the buttons and options in the same place in each of the programs.

The graphic shows the steady decline in Internet Explorer usage over the last four years, and the steady increase in Firefox usage ~ although Firefox has increased by less than 1% from 2009. It would appear that Google Chrome is taking the steam out of Firefox's growth, increasing 6% over the last year: 6% that I assume Firefox would have gotten if not for Chrome. Safari has also passed Opera usage for the first time, but only by a half percent.

I don't use either Opera or Safari, but years ago I did try Opera for a while. But like I said, as soon as the PC boots up, the other three browsers open.... The data for 2010 accounts for 3,094,865 unique visits, so it's a big data sample. Counting the return visitors would make the data sample larger, but what's the point? It would just make the browsers on the increase look better and the ones on the decline look worse.

There are other browsers being used to access the web site, but their percent usage is below 1% each ~ below 20,000 web visits, or about two days' worth of visits. The site receives about 9,000 hits per day.

Thursday, December 02, 2010

URLs in Web Index

This would be the third revision, or update, to the posting of the web stats showing URLs in the web index.
The stats relate to this Engineering Web Portal.

The table of data below shows the number of URLs, or pages, in Google's web index. First of all, I don't like to generate new site maps that often because of the bandwidth requirements of the sitemap program. That is, the program I run needs to check every page I have on the server, so it's a big hit on my server. But if you look at the data and the increasing number of URLs in the web index, you will notice that generating a sitemap over and over again is not required.

However, I do recommend running a new sitemap if you have a large number of new pages. The number of URLs in the web index always increases after a site-map is generated. You'll notice, though, that in some cases the number of indexed pages falls off a few days later. I assume the program reading the sitemap finds the new pages and adds them to the web index, but sometime later another algorithm determines a page should not be indexed and it drops off the index.
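For reference, the sitemap file itself is just an XML list of page addresses in the standard sitemaps.org format; a minimal one looks something like this [the URL below is a made-up example, not one of my actual pages]:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.interfacebus.com/example-page.html</loc>
    <lastmod>2010-11-30</lastmod>
  </url>
  <!-- one url entry per page on the site -->
</urlset>
```

The bandwidth hit comes from the generator program crawling every page to build that list, not from the file itself, which is small.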

01/21/11 = 1,938 URLs in web index. [Site-map loaded]
11/30/10 = 1,875 URLs in web index.
11/04/10 = 1,854 URLs in web index.
10/23/10 = 1,805 URLs in web index.
07/24/10 = 1,779 URLs in web index. [Site-map loaded]
07/01/10 = 1,535 URLs in web index. [Site-map loaded]
06/30/10 = 1,504 URLs in web index.
06/24/10 = 1,455 URLs in web index.
06/18/10 = 1,426 URLs in web index.
05/31/10 = 1,400 URLs in web index.
05/22/10 = 1,394 URLs in web index.
04/07/10 = 1,309 URLs in web index.
03/27/10 = 1,322 URLs in web index. [Site-map loaded]
12/19/09 = 1,481 URLs in web index. [Site-map loaded]
12/13/08 = 1,318 URLs in web index. [Site-map loaded]


There are two reasons why the number of URLs in the web index is increasing. First, I generate at least two or three new pages every month, sometimes many more. So you would expect to see an increasing number of pages included in the web index as those new pages are found. Second, I'm always updating preexisting pages already on the web-site. So over time the updates may allow a page to be indexed, usually because it has more content [text]. Many times I'll add a new page topic that is little more than a graphic and a small description, but over time I get back to updating the page to include more data and so on. Once the page gets 'the required amount' of content, Google includes the URL in the web index.

How soon your new page shows up in the web index, without a sitemap, depends on how often your pages are crawled. The chart shows that my site has about 500 pages crawled every day, although you can't tell from that how many pages are re-crawled. So for my site any new page added should be found within a few days ~ call it 15 days at the outside ~ so I don't really need a sitemap.


Although it has little to do with the number of URLs in the web index: over and over I see new pages not receiving any incoming hits for months. That is, even after a new page gets included in the web index, it takes three months for the page to really start getting visits. Web masters call that the Google sandbox. So don't think that just because a page gets indexed it will bring in a larger number of hits the next day. Also, I have groups or sections of pages that have dropped off the index, for years now. Currently I have 1,946 URLs submitted and 1,875 URLs included. That three-month "down-time" is also the amount of time it takes a page to get a Google Page Rank [if it ever gets one].

2N3485 Transistor Derating. 3 year old page with zero Page Rank.
Semiconductor Manufacturers 'L'. 5 year old page with zero Page Rank.
Electron Tube Classification. 2 week old page with zero Page Rank.
74L121 Monostable Multivibrator IC. 1 week old page with zero Page Rank.

Note: always run your site-map generator program at night, or when you expect low incoming traffic. Remember that the program is talking to your server, in direct competition with your visitors.

Also, I have no idea how to determine which pages of my site are not included in the web index, otherwise I might fix those pages.

The number of URLs in the web index changes every week or so; the numbers listed above are just the ones I wrote down.
Click the title to read the comments if you didn't come to this specific topic; the blog compresses the comments section unless you're on that particular page. The comments are updates to the blog post.

Wednesday, December 01, 2010

Mobile Device Usage Statistics

As part of updating the FAQ for the Engineering Site, I'm inserting a posting about which mobile devices are accessing the web site. I haven't checked this type of data since the last post a year ago; looks like there's been a change. Incoming visits from mobile devices have doubled over this last year. However, doubling a small number is still a small number. The Frequently Asked Questions pages don't get many hits, but I still like to keep the information updated. In order to save server bandwidth I redirect some of the FAQ data out to this blog, but it's still an interesting data point for blog readers.

I tried to check for a percentage change year-to-year in the mobile phones using the site, but for whatever reason Google didn't have it [or said it was estimated]. So the only thing I'll show is this year's data and which phones are being used the most, but no change year-to-year. Of course most of the visits [46%] come from an iPhone, which is understandable.

A few years back there was a particular way a web page had to be set up to work on a mobile device ~ I assume that's no longer required with the newer systems and bigger screens. Anyway, it doesn't matter; I don't have time to rewrite 2,000 web pages just for 3,000 hits a year. The site receives 9,000 hits a day from people on a normal computer. But it is nice to see an increase in any of these reports I pull up; makes me think I'm getting someplace.

This graph shows visits, which are increasing. However, all the other metrics are worse than for normal computer visitors, except percent new visits. Meaning most mobile visitors are new to the site, but they leave faster and view fewer pages than normal computer 'people'.

Friday, November 26, 2010

How to get more incoming links

I'm looking at the Crawl Errors from other web sites pointing to my Engineering Site. There must be 100 external links pointing to my site that are misspelled [some have */ or other characters appended]. The incoming links point to about 40 different pages on my site. Many of the links appear to be copies of each other, as in 6 different web sites using the same bad web address. Some of the incoming links are from web forums or blogs, which would explain why more than one page uses the same bad link. I still get the 404 page hit regardless, but I'll bet most people just click away.

But why would different web pages use invalid links? It just makes their pages look inaccurate. In some cases it appears that a page is using my data but not really giving me proper credit, what with an invalid link. You would think they would check their links.

Anyway, it seems like Google is increasing their web directory, although I can't tell if these pages show up in a Google search. One forum posting was made in 2007, but Google indicates it was found in November of 2010 [Google uses the term Discovery Date]. Same for another page link: two incoming links were discovered in 2007, but five more bad links [copies] were found in 2010.

Wednesday, November 17, 2010

AOC 2436vh LCD Monitor Review

I purchased a new Dell PC a few weeks ago, which I don't want to review for a few more days. However, I do want to review the AOC monitor I purchased at the same time. The AOC was a separate purchase that did not come with the Dell.

The AOC monitor is a 24 inch wide screen, longer than it is wide. It seems like a nice display, but the problem I have with it is the display size. Currently I have almost a 1 inch black border around the display, which means I really have a 22 inch monitor and not the 24 inch I purchased. A blank border, as in no information is displayed there, so it's unusable.

I've tried using the soft keys on the display to correct the problem, but either the monitor will not fill in the display or I can't figure out the correct key sequence.

AOC has an animated video help sequence showing the button sequence for a similar monitor, but I just couldn't follow it. To get to the soft screen buttons I need, I have to turn other functions off [I think]. But if I can't figure out how to change the screen size in 20 minutes, then my opinion is they got it wrong. For whatever reason, the AOC site does not have any help files under the 2436vh selection.

Now, I could have written a review at Best Buy, maybe to help someone else thinking about buying this monitor, but I didn't want to create an account ~ or rather, I didn't want tons of e-mail from Best Buy for the rest of my life. There was a comment in their reviews section that mentioned the softkeys, but it was only one out of 5 reviews. There are also a few requests for help out on the web with the same border issue, but as usual there are no answers, just more comments.

I would not recommend buying the AOC 2436vh, because I can't figure out how to adjust the screen size. If an electrical engineer can't figure it out, I wouldn't expect a normal PC user to solve the problem either.

I'm using:
Operating System: Windows 7 Home Premium
Video Card: ATI Radeon HD 5770, with 1GB GDDR5 video memory [HDMI connection]
Screen Resolution: 1920 x 1080 [which is recommended], Landscape

I also have another 18 inch monitor connected to the same video card [at a different resolution], but that shouldn't matter. I plan to replace the 18" too, but not with an AOC monitor. I'm glad I only purchased one monitor with the computer. Maybe next week I'll go out and get a replacement monitor, after I do a bit more research. Also see Companies making PC Monitors.

At least I got a blog post out of this. I'll append an update when I get the next monitor. I'll look around for a factory reset also.

About three hours after I wrote this post I changed the monitor resolution and the display filled the screen. I had tried that before, but only at the next lower resolution, which the monitor indicated was not supported; this last time I took it down one more notch. Two hours after I got the screen resolution right I received an e-mail from AOC asking if I had tried reducing the screen resolution ~ good response time, since I had only posted a question to their support a few hours before. Regardless, I still don't like how they do their soft screen controls.

So the reason the 2436vh does not fill the screen over HDMI is the computer's resolution setting. I'm using HDMI for this monitor, and all the search phrases I've seen include the term HDMI [including searches finding this page].

The monitor defaults to the highest resolution the display will handle, which in turn leads to dead space around the visible display. The highest resolution is 1920 x 1080. However, the resolution I'm using now is 1680 x 1050 [a 16:10 shape] on a screen with an aspect ratio of 16:9. So I can't tell if it's a monitor issue, an aspect-ratio issue or an HDMI issue.

Thursday, October 14, 2010

CPU Heat Sink Fan Stopped Spinning

So I'm working on the computer last night and the PC just shuts down. At first I figured I had a power glitch, but the TV was still on. When I turned the PC back on, the BIOS indicated my fan was not turning. How did I not notice that? The PC sounds like a plane taking off.

Anyway, I powered off and back on to get the fan to start again, but that didn't work. So I turned the PC on its side, pulled the cover, flicked the fan a few times and tried the power on/off again. I also reconnected the fan cable, thinking it might be a loose connection, but no luck.

This is the CPU fan that connects to a brick of a heat sink sitting on the CPU. It took a few tries before I could disconnect the fan from the heatsink, but I did get it off ~ other than banging on the fan I really couldn't do anything else. It's not like I could somehow fix a fan, but I should have used that time to write down the model number.

So the next day, the fan is spinning, though I didn't do anything. The internal case temperature is 92.8F and the room temperature is 80.1F. My office is always hotter than the rest of the house: too much electronics in the room, and it's far from the AC [which is set at 76F]. The PC has been on for about 15 minutes now, after a quick backup to a USB thumb drive, as the last backup was more than three weeks old.

The weird thing is that I was just looking on-line at new PCs yesterday; maybe it is time. However, other than moving from DDR2 to DDR3, I couldn't find a good reason to upgrade. Now maybe I should start looking for a fan, and at the same time get another 2GB of memory ~ why not, if I'm popping the case anyway.

Well, I didn't write down the fan type; I just wanted the time to do a backup. But I did note this is an AMD Socket 939 PC. I don't really want the heat-sink/fan combo, I'll just take the fan ~ not because of the extra $40, but because I don't want to mess around with the CPU-to-heat-sink connection. I haven't built a PC from scratch in 15 years. Why build one when I can just pick all the components on-line and have it delivered?

Tuesday, October 05, 2010

Domain Name expired

So my web site went off-line today from around 8am to 5pm. At first I figured it was something I had done, because I was adding a few more passwords late last night. But it turns out that my domain name had expired.

Now, the strange thing is that the domain name was one of the first things I checked when I could not get my hosting company to answer their phone. Whois indicated it would expire next year; however, I did find an e-mail from last year from my hosting company saying it would expire this year ~ so I'm not sure what is up.

What I'm not happy about is that it took about 8 hours for my hosting guy to answer my trouble ticket, and their phone was never answered over that entire time. I never even received a warning e-mail that I was about to have an issue, which turned out to be an old credit card on file with them.

Their home page indicates that because of all the great stuff they offer and their low price, they do not offer support. What? I've been with these guys for years and nothing is new, except for their lack of support. So it's time to start looking for a new hosting company.

Wednesday, September 01, 2010

Disallow access to server stat files

So after 10 years on the web the spiders have found my server stat files, in this case AWSTATS. This is the program output I use to see data regarding site visitors: number of visits, page views, visit duration, key words, and so on. The program stores the data in large text [txt] files, but outputs the data to me in the form of charts and graphs. In fact the txt files are very large, multi-megabyte files; the graphs are somewhat smaller and more readable.

I would recommend that every webmaster add a 'Disallow' line to the robots.txt file to stop the web spiders from reading your stat files. In my case the line looks like this: Disallow: /awstats/.
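The effect of that Disallow line can be checked with Python's standard-library robots.txt parser. The file contents below just mirror the line above, and the example.com URLs are hypothetical stand-ins for real pages:

```python
from urllib.robotparser import RobotFileParser

# robots.txt contents mirroring the Disallow line described above;
# this blocks every spider from the AWSTATS output directory.
robots_txt = """User-agent: *
Disallow: /awstats/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Well-behaved spiders skip the stat files but still crawl normal pages.
print(rp.can_fetch("*", "http://example.com/awstats/awstats.txt"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))           # True
```

Note this only keeps out spiders that honor robots.txt; it is a request, not an access control.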

The bottom curve is server bandwidth ~ the increase occurs as my stat files started showing up in search results. Because the graph is set up to show number of visitors, the bandwidth is normalized. So the 100,000 horizontal bar, which indicates 100,000 visits or page views, indicates 10GB of server bandwidth for the bottom curve. The 200,000 line indicates 20GB of bandwidth.

I only noticed the server text files showing up in search results about a month ago, because normally I don't need to search my own engineering site ~ right, I wrote it. So I only added a 'block' to the robots file a few weeks ago; however I recommend that you block access now even if you don't have an issue. It only takes a few minutes to add, and if you pay for bandwidth, or get blocked when you exceed your bandwidth limit, it may well be worth the time.

If you look back to 2006/2007 you can see that bandwidth tracked unique visits, but by 2008 the gap started to widen. Two years before Google started to rank pages on download speed I had already started to make the web site more efficient.

Unfortunately the bandwidth data is now meaningless, because it mostly shows these large txt statistic files being downloaded. For example, one 2.52MB txt file was downloaded 230 times last month, and a 4.11MB file was downloaded 98 times. That's 328 visitors that used the search bar and received bogus results; are they going to come back for a second visit? Really it's much worse: before I stopped counting, there were 1,206 people last month who thought that one of those text files was a valid search return.
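Just those two files account for nearly a gigabyte of the month's bandwidth; a quick back-of-the-envelope check using the numbers above:

```python
# Bandwidth consumed by the two stat files quoted above [MB]:
# a 2.52MB file downloaded 230 times plus a 4.11MB file downloaded 98 times.
stat_file_mb = 2.52 * 230 + 4.11 * 98
print(round(stat_file_mb, 1))  # 982.4 ~ almost 1GB wasted on stat files
```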

Monday, August 30, 2010

Download Speed vs Site Performance

I check Google's data on the Engineering Site all the time, and today I rechecked the Site Performance. At least for the last few weeks the time it takes to download a page has decreased [always a good thing]. The attached graphic shows the amount of time it takes Google to download pages on the site. The numbers don't represent a single page but the average of some number of pages Googlebot tried to read. I don't really think the data is that accurate, because it never takes 4 seconds to download a page, but I read the data because that's what Google is looking at to determine site performance.

There is no way to tell why my site is coming in faster. The server could be working faster, Google pulled fewer pages, or Google pulled smaller pages; who knows? Well, I can check how many pages Google crawled per day, and it's about the same between now and mid June. Google Crawl Stats indicate an average of 673 pages per day over the last few months.

However, I've been working on making the site faster for months, but not really getting anywhere. The basic problem is that any time I update a page and make the HTML smaller [less code], I also add more content, making the page larger. So I may make the HTML code more efficient [decreasing download time], but I add more human-readable text, increasing the download time.

The current data [below] indicates the average time to download a page on the site is 2.6 seconds. This is only important because Google rates sites by download speed, so it's Search Engine Optimization [SEO].

Related Blog Posts;
Page Download Times [7-15-2010] 3.1 seconds to download
Speed Performance Overview [6-16-2010] 2.8 seconds to download
Web Site Performance [4-22-2010] 3.7 seconds to download
Google now ranks pages by speed [4-14-2010] No speed data.
Website Speed Performance [4-3-2010] 3.7 seconds to load.

Sunday, August 22, 2010

IE 8 takes forever to shut down

I'm not really sure why, but Internet Explorer takes a long time to close down. When I close a single tab, the tab closes right away. However when I close the program it takes forever to shut down, with the first few tabs taking the longest amount of time. I always have three browsers open: IE, Firefox and Chrome.

I tried looking up reasons on the web, but no luck. Just a bunch of really old IE crashing issues or out-of-date info.

Anyway here is how the top five browsers are doing on the engineering site: Explorer usage continues to decline, while Firefox seems to have stalled. Both browsers seem to be getting hurt by the increase in Chrome usage. Chrome did not exist before 2009.



1/1/2010 to 8/21/2010
Internet Explorer = 49.08%  [800,493 visitors]
Mozilla Firefox = 35.15% [573,338 visitors]
Google Chrome = 8.66% [141,218 visitors]

Wednesday, August 18, 2010

Link Checker Firefox Extension

I went ahead and hand-checked about 20 pages in the OEM Manufacturers section. Now I did just check the entire web site the other day, but there's a catch: those automated programs do not always find 'bad' links.

If a site goes out of business and their web site payment comes due, then ads could pop up instead of the old engineering information. Seems right: once the payments run out the domain registrar takes over and runs their own ads, and the site appears "good" to the automated software checker. So you still need to hand-check links. Anyway, I only found one 'bad' link which pointed to an ad, out of 20 pages and 2 hours of checking.

I found Link Checker on the web today and I downloaded the program (but have not used it yet). What I have used is the Firefox browser extension for Link Checker, which checks links on a per-page basis. Yes it's another automated program, but now if needed I can check a single page without checking the entire site. Link Checker seemed to work well. I used it to check each of the 'A' pages in the OEM Company section.

While testing it I noticed that it checked all the internal and external links on a page. However when I went to the next page all my internal links went 'green' right away, telling me that the program was not rechecking them ~ which is good. Link Checker uses color to indicate if a link is good or bad, so it's somewhat easy to use. I just tried three other pages but all the links are coming up good. Maybe I'll post a comment in a few days if I do find a bad link with the program, or if I load and run the full .exe version of the program. The closest thing to a bad link I can get is a yellow link, which I assume is a 'no robots allowed', not-checked link.

So it's not all about adding new pages; sometimes you have to ensure that the pages already out there are good, or are otherwise not hurting the web site. I also should have said that the Link Checker program is fast, but only as fast as the site it's checking ~ right, because it has to wait until it gets a response from the external site.
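A per-page checker like this can be sketched with nothing but the Python standard library. This is only an illustration of the idea, not the Link Checker extension itself; the function name and the test URL are my own, and a real checker would parse the link list out of the page's HTML:

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code, or an error string."""
    try:
        # HEAD asks for headers only, so the full page isn't downloaded.
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return (url, resp.status)
    except HTTPError as e:
        return (url, e.code)          # e.g. 404 for a dead link
    except URLError as e:
        return (url, str(e.reason))   # DNS failure, refused connection, etc.

# Caveat from above: a parked domain serving registrar ads still answers 200,
# so a "good" status code does not prove the engineering content is intact.
print(check_link("http://nonexistent.invalid/"))
```

This is exactly why the hand-check still matters: the status code only says the server answered, not that the old content is still there.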

Sunday, August 15, 2010

How often should site links be checked

I normally try to check external links on the site every 3 months or so. I'd like to check more often, but I see it as a severe hit on my server, because the link checker program has to read every page. At the same time it's a lot of work to use this program, because at any given time there are always a few dozen websites off-line for whatever reason, so I have to recheck those. Of course, how many companies could go out of business or change hands in a three-month period?

Anyway, this time around there were about 20 bad links, including 3 from me, pointing to a misspelled address on my own site. I still have a few more links in my to-do box, waiting a few days to see if they come back on-line. The attached graphic shows the data produced by the link checker program. Over 99.5% of the links were good, but I didn't track all the link conditions. The report indicates I have 7,815 external links and 1,917 internal [page-to-page] links, because there are 1,917 pages that make up the web site.

So if you have external links it's always a good idea to keep them up to date. You never know who buys up a link once it's gone bad. Search engines don't like broken links, and of course visitors never like bad links; they just make the site seem abandoned. Oh, I should also say I tried using the program over the last few weeks but kept getting too many broken internal links in the report, making the report impossible to use ~ the program was overloading my server. The program and server worked OK today.

At the same time I fixed those bad links I also made whatever updates were required on the 20 pages that were 'fixed', making the process take twice as long as it needed to.

Click to enlarge the graphic.

Tuesday, August 10, 2010

Computer Generated Pages

So I don't really know how many engineers read this blog, but I really need to comment on all these auto-generated pages on the internet. Any time I try to look up a part on the internet, all the search results that come back are just PC-generated pages with nothing more than hundreds of part numbers. Now a few of the pages do have a data sheet, but they make you click through three pages to find the pdf version. So it's a bit time consuming to find a data sheet for a part that's less than common.

Up until a few months ago a search on Google would let you de-select a bogus site so it would not reappear in the search results. So by the next search I wouldn't let many of these sites show up. However, instead of the de-select function Google changed that to a 'vote-for' star. When I'm looking for these components I don't see many sites to vote for. I just tested out Yahoo and they did a bit better, but not by much. If these guys would just add some content they wouldn't need 10,000 auto-generated pages with every part number known to man....

Anyway, I vented ~ I was looking for a Zener diode, LVA62, which was on page 1 of Yahoo and page 4 or 5 of Google. But every week I'm looking up part numbers....

I should also say this occurs much more often when looking for a part that may no longer be second-sourced or is in production decline ~ a part that is no longer in full production. I don't really have the same problem with newer parts, and never if I know the current manufacturer ~ I can just go to the OEM's site and get the data sheet from there.

Wednesday, August 04, 2010

Digital Living Network Alliance Products

I mentioned this DLNA-certified BluRay player yesterday, thinking I might need it. I'm sure I would like to have DLNA, but that may be years away. Except for the BluRay player, which I haven't even purchased yet, no other gear that I have is compatible, and I see no need to change anything out until it breaks. Wireless stuff would be nice, but I don't really think I would ever use any of it. I gave away the spare PC I had connected to the HDTV because I never used it; that was wireless too, but not to the TV. I assume the BluRay player would be called a Digital Media Player [DMP].

I would like a new PC, but I've been waiting to figure out what they were going to do with USB 3.0 and for PCI-Express 3.0 to be released. So I'm about a year away from getting a new PC, but maybe I don't need one [New PC Posting 4/23/10]. The Digital Living Network Alliance indicates that I would need a DLNA-certified Network Attached Storage [NAS] device, which I could get instead of a new PC (I guess). I'm not really sure why I would need a network drive when a DLNA-certified PC should work, I assume. I'm also not sure if a NAS is the same as a DMS [Digital Media Server] [HDD Vendors].

So the 2-year-old 47" flat screen I just blogged about the other day does not appear to be compatible. The 53" or 57" floor model out in the front room has to be 10 years old, and the 37" CRT in the back room is even older, so this whole idea of caring about DLNA is out the window. I could be years away from getting a new TV.

So it would appear this blog posting is pointless, unless it helps someone like me understand that unless they're buying a whole new system, DLNA is just not required.

Monday, August 02, 2010

DVD Player No Disk

I just tried out my KLH 221 DVD player after years of having the thing sitting in my closet. The company is called either KLH Audio or KLH Audio Systems, but I'm not sure of the correct name. Anyway, I haven't used the thing for about 5 years, and I have no idea when I purchased the player. I had the DVD player set up in a back room that I never used, so it never got a lot of play time, but it was working several years ago.

I tried 3 or 4 DVDs in it yesterday, but each time the LCD screen would just display 'No disk', which I assume means it could not read the DVD. However, now that I'm thinking about it, I don't recall the thing even spinning up. I tried rebooting a few times by pulling the plug, but nothing seemed to work. Maybe I'll try it out one more time. Now I did try a quick web search for the model KLH 221, but I couldn't find much, or a manual. Most of the web postings were just 'trash' sites with the same re-posts over and over. It could be that this DVD player is almost ten years old; I think it was the first one I ever purchased. Could it be that the CMOS battery went bad and the player no longer remembers how to work ~ maybe I need some reset code? Oh, the remote was still working, that was a surprise.

I do have another DVD player in the front room that I've wanted to replace with a BluRay player. So I guess it's time for a new BluRay for the front room, moving that DVD player into the back room to replace the one that no longer works. I would have rather purchased a new BluRay in my own time, and not because some other system failed. I already have a BluRay in my office, and I may just purchase another new Sony BluRay player. I like Sony gear for the front room because it would match the other Sony gear and should work with the Sony remote I use out there.

I see a few reviews saying that the newer BluRays are loading the disk faster ~ it's about time. I also see some that have DLNA capability, which I guess I want. I'll have to look around for a few more days.


Just in case the chart in the previous post is a bit busy or hard to read, I show the same data a different way here. It should be a little easier to see that the site is still doing better than in any previous year, but down these last few months, because that's the mid-year trend. Of course the visits are way down, but I've always found that the higher the hits, the bigger the drops. Maybe one more month before hits start to increase again. Click the image for a larger view of the data.

Sunday, August 01, 2010

Server Bandwidth increasing

Even as visitors to the web site have been decreasing over the last few months, bandwidth has increased. Now it's normal for site hits to decrease during this time of the year ~ normal for my site, anyway.

In March there were 190,622 unique visitors to the site using 14.64GB of bandwidth.
In June there were 156,355 unique visitors to the site using 16.15GB of bandwidth.
In July there were 150,100 unique visitors to the site using 17.03GB of bandwidth.

Checking Googlebot Crawl Stats, the amount of data downloaded did double at the end of July, but only to a high of 28kBytes per day. However the number of pages crawled stayed about average at 812 pages per day. So the only assumption that can be made is that Google was reading pages with more pictures than normal. Some pages have more graphics than normal; also, some of the graphic files are local to the server and some are held out on Picasa. Of course the ones located on Google Picasa do not affect my bandwidth.

Now I have removed a few dozen pic files from Picasa over the last few months. Google started to rank web sites based on download speed, and they considered getting the pic files from 'Google' Picasa as slow. Not because Picasa is slow, although it may be, but because they consider looking up the DNS [address] of another web site as inherently slow. The most relevant posting was Web Site Speed Enhancements.

Last month I did upload a new sitemap to the server, which Google has been reading every few days. The sitemap is 300kBytes, which is the size of about 60 html files, or maybe 30 files if you were to count the pic files too. The reason for uploading the sitemap was to try to get more pages included in Google's index, which I have, but maybe at the cost of server bandwidth. June 30 had 1,504 pages in Google's search index [URLs in Web Index], and as of July 28 there were 1,782 files included in Google's index.
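For reference, a sitemap is just an XML list of page URLs in the sitemaps.org format, and a minimal one can be generated with the Python standard library. The second URL here is a hypothetical placeholder; the real sitemap would list all ~1,700 pages:

```python
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal sitemap.xml string for the given page URLs."""
    ET.register_namespace("", NS)  # emit the sitemap namespace without a prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for u in urls:
        url_el = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url_el, f"{{{NS}}}loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    "http://www.interfacebus.com/",
    "http://www.interfacebus.com/example-page.html",  # hypothetical page
])
print(xml_out)
```

Each <url> entry can also carry optional tags such as lastmod, but <loc> alone is enough for a valid sitemap.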

The server counter AWSTATS indicates that Googlebot used 285.97MB of server bandwidth, and the spider from Yahoo used 227.35MB of bandwidth. I have a spider from China [BaiDuSpider] that used 681.53MB of bandwidth. I noticed that a number of internet posts have had issues with the amount of data being indexed by this particular spider. It would appear that over the previous few months BaiDuSpider was only reading about 15MB/month, so maybe it just got around to reading the entire web site.

Now checking Google Analytics I see no real increase in visits from China ~ a little over 3,000 a month for the last three months. The top 10 robots used almost 3GB of bandwidth; it's too bad they're not sending me more traffic.

Even Alexa used over 329MB to spider my site; too bad my traffic rank is down 26,000 [-10% Reach], but then that would figure because my hits are down as well.
Oh, and I just noticed that Google is also reading the text files from my stats counter, so that's another 20MB of data it got off my server.

I guess I should also mention that over the last thirty days I've added maybe 30 pages and maybe that many pic files, so that would account for a bit of the increase too. Anyway, check out the attached chart; the lowest trend is bandwidth [normalized] ~ click to enlarge.

Saturday, July 31, 2010

Westinghouse TX-47F430S 47"Flat Screen Review

I purchased this 47 inch flat screen a few years ago, and for the most part I like how it works. I did add a post for the TV but never wrote a review for it. Early this year I hooked up a BluRay player and had a few issues with the TV auto-detecting the signal. I ended up turning the TV off and back on until it found the signal; actually I had to unplug the set, because just turning it off didn't help. Anyway, the TV always detects the BluRay now, and I just figured it was me not reading the directions. However, I did move the BluRay over to the HDMI 1 channel connector. I just found some reviews on the web, and other people were having the same issue of the set not detecting incoming signals [auto-source switching]. Seems to work fine now.

I like the set, the 1080p picture is great, but I really just use the set in my office while I'm working on the web site.
Over the last few months the sound has gone out, or slowed down ~ like putting your finger on an LP to slow the sound down. Again, I have to unplug the set to get it to reset. I canceled cable a few months ago, so I use a Terk antenna to watch TV, or just use a DVD. I only started over-the-air TV a few months ago, and that's when the issue started. I have to assume that when the TV starts to lose the signal it gets a bit confused.

I purchased this HDTV in 2008, but it looks like they still sell it, so the review is still valid. Maybe they have fixed these issues by now; the TV has only done this stuff a few times.

Saturday, July 24, 2010

Blogger has some new features

Blogger is finally coming out with some new settings, and it's about time.
I've been on Blogger for about 5 years now, and it seems that Google never adds anything new, at least until this year. It does add new templates from time to time, but I've never been interested in changing. Now at the beginning of the year Google allowed Amazon to display ads on Blogger, so that's different.

Now Google has added Stats, so you can tell who visits the blog and what pages the visitors view. Up until now I had used the stats from the ads that I run to tell how often the blog was being visited. I also used Google Analytics to see who then visited www.interfacebus.com and what page they came from. They're both round-about ways to tell how the blog is doing, but I never got around to adding a free counter to the Blogger template.

Blogger only just started counting pages last month, so I only have one month of data. The page with the most visits was written in 2008, followed by one from 2010, 2006, and 2005.

I just tried to upload an image and I see they changed that function too. Instead of 'upload a file' I get add from Blogger, Picasa, or from an html address ~ so I'm lost, and will not be adding a picture from the stats counter.

Wednesday, June 30, 2010

URLs in Web Index

I'm going to partially re-post a blog entry I wrote just a few weeks ago regarding web pages included in Google's index. I've only added a dozen pages in the last two months, so the increase in the number of pages indexed has to be due to the SEO changes being made to pre-existing pages already on the web. The Index History relates to an Engineering Portal on the web.

Index History:
6/30/10 = 1,504 indexed pages.
6/24/10 = 1,455 indexed pages
6/18/10 = 1,426 indexed pages
6/9/10   = 1,411 indexed pages.
5/31/10 = 1,400 indexed pages.
5/22/10 = 1,394 indexed pages.
4/7/10   = 1,309 indexed pages.
3/27/10 = 1,322 indexed pages. [Site-map loaded]
12/19/09 = 1,481 indexed pages. [Site-map loaded]
12/13/08 = 1,318 indexed pages. [Site-map loaded]


That's not too bad: 12 new pages added, but 100 more pages indexed by Google.

Does the increase in indexed pages imply that all of a sudden the site will start receiving a great many more visitors? Not really. But it does mean that the newly indexed pages will show up in a Google search for their topic ~ maybe not first in the list, but at least they will show up now.

 No Page Rank; Company index 'M'.

I think I'll run a new sitemap later tonight, once the traffic slows down. I'll get those dozen or so new pages included, and remove a few FAQ pages that have been de-linked over the last few weeks. I did have a few pages that are not being used but still had links pointing to them; Google dropped them from the index long ago ~ time to remove the links and orphan the pages.
I'll add a comment to this post once Google has time to read the new sitemap.

Graphic; Number of visits to the web site 1/1/10 to 6/29/10 [mid-year update].
Year-to-date = 17.48% increase in visits over the same time period last year [192,858 more visits].

Monday, June 28, 2010

Web Page Hits per Hour

Here's one more graphic from the FAQ section of the web site. This one shows Page Visits by hour of the day.

 

Visits by Hour, May 2010

Sunday, June 27, 2010

Comparing Site Visits

I figured I would take another look at how the site is doing compared to last year. As of last week the site is still getting 10% more site visits than the same week last year.

Overall the web site is still getting more than 17% more visits year-to-date, down 1% from 3 weeks ago. The first week of June had the lowest increase, with only 6% more visits to the site. The largest increase in visits occurred in the last week of January, with a 26% increase.

Overall the site is showing a 15% increase over 2008 visits, and about a 14% increase over 2007.

The graph compares web visits for 2010 and 2009 ~ by week.


Page-views [not shown] are down 4%, year to date over last year.

Saturday, June 19, 2010

FAQ-Incoming-Linked-Sites

I'm uploading a graphic from the site to this post, to reduce both the size of the graphic that gets loaded and a separate DNS look-up that was required to load it. The graphic is of linked sites ~ external sites linking to interfacebus.com. The graphic was stored out on Google Picasa, not on the server, which required the extra DNS look-up.

So the page that held the graphic was deleted, and a link pointing to this blog posting was added to the pages that pointed to the old page. The point was to reduce that page from maybe 12k of html text to zero, delete the external look-up for the 100k graphic, and reduce the overall load time for the site.

Last month I did the same thing to 2 or 3 other FAQ pages to reduce the overall loading time of the site. Remember, Google ranks pages by loading time as well as by key words.

Friday, June 18, 2010

Pages Crawled Per Day

Crawl Stats;
Looks like Google is finally slowing down checking my Engineering Site. After 2 months of heavy Googlebot activity the spider seems to be taking a break, which is good. There's really no reason Google should be reading 2,000 pages a day. The numbers shown in the graph are the high, average, and low number of pages spidered each day.
HTML Suggestions;
I finally got down to the point where Google does not have any suggestions for fixing meta-tags. It seems like there were a few duplicate pages that were unused that I never fixed ~ so they may have been de-listed. Here is what Google has to say:
"We didn't detect any content issues with your site. As we crawl your site, we check it to detect any potential issues with content on your pages, including duplicate, missing, or problematic title tags or meta descriptions. These issues won't prevent your site from appearing in Google search results, but paying attention to them can provide Google with more information and even help drive traffic to your site. For example, title and meta description text can appear in search results, and useful, descriptive text is more likely to be clicked on by users."

Sitemaps;
1,653 URLs submitted, 1,426 URLs Indexed.

Wednesday, June 16, 2010

Speed Performance Overview

Just a quick update to show the latest graph of site download time, or site performance. Google now indicates an average download time of 2.8 seconds, which is faster than 51% of internet sites. This is down from 3.7 seconds in the last posting. The previous posting was Web Site Performance, 4/11/10.

Saturday, June 12, 2010

Website Health Check

Because of the drop in web visits this month, which is common for June, I figured I would compare the number of visits this year to last.
 interfacebus.com Stats:

So there's an 11.78% increase in site visits between June of 2009 and June of 2010.
June 2009; 63,814 visits
June 2010; 71,334 visits

The year-to-date difference is even better, with an 18.15% increase in visits.
1/1 - 6/10 2009 = 994,854 visits
1/1 - 6/10 2010 = 1,175,422 visits

interfacebus [Google Sites] Stats:
Year to date for my Google Sites pages have a 104.56% increase in visits
1/1 - 6/10 2009 = 3,223 visits
1/1 - 6/10 2010 = 6,593 visits

Serialphy.com Stats:
Year to date for my serialphy pages have a 33.93% increase in visits
1/1 - 6/10 2009 = 1,441 visits
1/1 - 6/10 2010 = 1,930 visits

Knol pages [Google Knol] Stats:
Year to date for my Knol Site pages have a 42.9% increase in visits
1/1 - 6/10 2009 = 3,481 visits
1/1 - 6/10 2010 = 4,974 visits

Blogs [This blog and the other] Stats:
Year to date for my Blog pages have a 104.7% increase in visits
1/1 - 6/10 2009 = 3,679 visits
1/1 - 6/10 2010 = 7,532 visits

I guess I have to conclude the health of the sites is good!
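The percentages above are easy to sanity-check; a small helper reproduces the June and year-to-date figures from the raw visit counts quoted in this post:

```python
def pct_increase(old, new):
    """Percent increase from old to new, rounded to two places."""
    return round((new - old) / old * 100, 2)

print(pct_increase(63_814, 71_334))      # June 2009 vs June 2010 visits -> 11.78
print(pct_increase(994_854, 1_175_422))  # year-to-date visits -> 18.15
```

The same helper applied to the Google Sites, serialphy, Knol, and blog numbers reproduces their percentages as well.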

Friday, June 11, 2010

Why You Should Blog

At first I figured I would just insert a reminder post about some of the reasons a webmaster should also run a blog. But as I was reviewing posts, looking for the last relevant posting, I found some disturbing data as it relates to the rest of this blog post. I'll address that data at the end of the post, with a comment plus a few related earlier blog posts.

The numbers below represent visits to the Engineering Site from this blog over the last thirty days.
The dates represent when the blog posts were written, with the number of visits from those posts.
The visits from Blogger represent 84 visits from 15 different pages off Blogger [although they are just shown by year of post].
The point of this post is to show that even years after a blog posting is written, it still gets visits and provides referring hits to interfacebus.com. [Why Blog; Sep 12 2007]

Referral visits from Blogger [blogspot]
by page generation date:

42 referrals from 2010 posts
0 referrals from 2009 posts
3 referrals from 2008 posts
15 referrals from 2007 posts
10 referrals from 2006 posts
14 referrals from 2005 posts

The other blog site, which only shows new pages added to the engineering site, brings in even more traffic.
In the last thirty days that blog sent 82 referring visits from 26 different blog posts.
I did notice a spike in visits yesterday, which has me wondering why.

I should note that this blog had 771 page views over the last thirty days, while the 'what's new' blog only received 354 page views.
That's page views within the blogs themselves, not traffic sent to the main site, which is what this posting is about.
You can also read these blogs from a newsfeed, but these numbers don't reflect that.


The graphic shows incoming visits to the Engineering Site from these blogs over the last four years. Looks like there's been a steady decline in visits from the middle of last year. The middle of last year I started a newsfeed, then near the end of last year I removed some permanent links from this blog that pointed to the web site. Now the newsfeed carries all these same posts, but if someone clicks over to the web site it doesn't appear to come from this blog.

Thursday, June 10, 2010

Google Caffeine and Indexing

So Google came out with a new indexing system which updates the Google index much faster than in previous years. The new indexing system is called Caffeine [Google Caffeine], and allows for a change in the index each night.

It was common knowledge that once every thirty days Google would shuffle its indexed pages, so if you were number 2 in the search engine position [index] one month, that same page would change position the next month. Sometimes the page would get a higher position, and sometimes the index change would leave the page in a lower position.

However, at the same time, many of the new pages I generated would show up in the index the next day, because I would blog about the new page address in the New Engineering Pages blog. Google operates these blogs [blogspot], so they read or spider them all the time. Any new page mentioned on Blogger gets noticed by Google much faster than any external web site [I assume]. Before I started blogging I would wait 2 to 4 weeks before a page would get indexed, but after I started blogging any new page would get picked up within several hours.

So for Search Engine Optimization [SEO], blog about any new page you generate, because it gets picked up much faster than waiting for the spider [Googlebot] to find it, and much faster than having to generate a new site map for each new page addition.

So what is Caffeine? Well, it's Google changing your search engine position every day instead of every month. However, because your page position might already change every day as it gets displaced by one of my new pages, I'm not really sure I see the difference. Google does indicate a recent drop in the number of my pages that are being displayed in the search engine listing, or really the number of pages that are being clicked on.


So this graph depicts the number of times my pages are shown in the search engine results [blue] and the number of times somebody clicks on one of my pages [yellow] in the search results. A drop in the click-through rate [yellow] in the last few days should indicate that although my pages are still showing up in the search results, they now appear lower down in the results ~ maybe page 2 of the results instead of page 1.
But at the same time Google Analytics reports that there has been no drop in visits, and I trust the Analytics report more than I trust the report from Google webmaster tools [graphic above].

If I run the above report for the term Can Bus I see no real reduction in the click-through rate, while the graph above shows all search queries [all search terms] that relate to my website.

Wednesday, June 09, 2010

Indexed URLs by Google

I'm always watching how many of my pages are indexed in Google's search engine, and a few days ago the count reached 1,411 pages. Here are a few data points over the last few months [and years]:

6/9/10   = 1,411 indexed pages.
5/31/10 = 1,400 indexed pages.
5/22/10 = 1,394 indexed pages.
4/7/10   = 1,309 indexed pages.
3/27/10 = 1,322 indexed pages. [Site-map loaded]
12/19/09 = 1,481 indexed pages. [Site-map loaded]
12/13/08 = 1,318 indexed pages. [Site-map loaded]

The number of indexed pages mostly goes up just after I add a sitemap, and then drops back down a few weeks later after Google finishes reading it. However, as I 'fix' the pages they slowly start to re-appear as indexed, but at a slower pace. Fixing a page means adding more data so it no longer appears to be a copy of the page it was derived from.
One example of a page that is not indexed and has no page rank: Component Manufacturers.

I'm not going to show the graph, but Google indicates that on average Googlebot spiders 862 pages per day ~ every day. What is up with that? The site only has about 1,600 pages. Googlebot used 537MB of server bandwidth downloading pages from the website.
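A little back-of-the-envelope arithmetic makes those crawl numbers less mysterious; the sketch below assumes the 537MB figure covers roughly one month of crawling, which the report doesn't state explicitly.

```python
# Rough check on the crawl figures: 862 pages/day against a ~1,600
# page site. Assumes the 537 MB bandwidth figure spans one month.
pages_per_day = 862
days = 30
bandwidth_mb = 537
site_pages = 1600

fetches = pages_per_day * days                 # total fetches in the month
kb_per_fetch = bandwidth_mb * 1024 / fetches   # average size per fetch
crawls_per_page = fetches / site_pages         # how often each page is hit

print(f"{fetches} fetches, about {kb_per_fetch:.0f} KB per page,")
print(f"so each page is re-crawled about {crawls_per_page:.0f} times a month")
```

In other words, at that rate Googlebot is re-reading every page on the site about 16 times a month, at a very ordinary ~21KB per page, which would explain the bandwidth without anything being wrong.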

Tuesday, June 01, 2010

Site views and page ranking

So the numbers are in for last month. Site visits are down over the last month, but overall page views have increased a bit. If the numbers follow the same yearly trend there should be an even bigger reduction in site visits next month. I think I'll wait until next month before I re-post the page-visits graph; there's really no need to post the same graph every month.

After I noticed yesterday that a few more of my pages have been included in the Google index, I figured I would try to find a few pages that are yet to be added. It didn't take too long to find a year-old page without a page rank. The links below are the component page followed by the mention of the component in the 'whats new' blog.
How to Derate 2N4239 Transistor. [posted on 6/4/2009 When to Derate a Transistor]
How to Derate a 2N3743. [posted on 1/30/2009 Derate a BJT by Temperature]
How to Derate a 2N3485 Transistor. [posted on 2/12/2009 Recommendation on how to Derate a Transistor]
How to Derate a 2N4931 Transistor. [never posted in the blog].

I found these pages by looking at the report from Google Analytics and clicking on a few pages that were getting almost no page views. None of these four pages have a Google page rank, and each has received only a few page views this year. Of course re-posting their links here may not help these pages at all, but it may help them get re-spidered ~ as they were just updated.

Monday, May 31, 2010

Link Checking

Just a quick note to indicate that the links on the engineering site have just been checked and verified. There were a few bad links, but not that many. I still have some sites to re-check within the next few days, but I need to give those web sites time to come back on-line.

The attached graphic shows the data generated by Xenu. It appears that at a minimum 99.29% of the links on the site are good, and that number should increase as I check the remaining links in the report.

The last time I showed this report was on November 20, 2009, as Xenu Report, Statistics for Managers. That report showed 99.25% good links out of a total of 6,950 URLs checked.

Of course there is one day remaining in the month, and the site numbers look kind of low. Today should not be any different, because the site never does well on holidays. This will be the lowest-performing month of the year, though still better than any month of any previous year. I should be posting those numbers tomorrow, as my server counter does not update until about 5am.

Of course, if you have off-page links on your site you should always check them to ensure they point to a valid page address. I use a free link-checking program called Xenu.
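The core of what a link checker like Xenu does is straightforward: pull every href out of a page, then request each URL and flag anything that doesn't answer with HTTP 200. Here's a minimal sketch using only the Python standard library; the function names are my own, and a real checker would also handle redirects, retries, and rate limiting.

```python
# Minimal link-checker sketch: extract hrefs from HTML, then probe
# each URL with a HEAD request. Hypothetical names, not Xenu's code.
from html.parser import HTMLParser
import urllib.request

class LinkExtractor(HTMLParser):
    """Collect the href value of every <a> tag fed to the parser."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return True if the URL answers with HTTP status 200."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

page = '<a href="http://www.example.com/vme.html">VME</a>'
print(extract_links(page))   # the one href found in the sample page
```

Run extract_links over each page of the site, de-duplicate the URLs, and pass the off-site ones through check_link; anything returning False is a candidate bad link worth checking by hand.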

As a side note, the site visits so far this year have passed all of the visits for the entire year of 2005.

Sunday, May 30, 2010

Mezzanine Buses

There are a great many mezzanine buses, some more outdated than others. Mezzanine cards are small-form-factor boards that reside as daughter cards on a VME or cPCI mother-board, to name a few interfaces. Mezzanine cards do not directly interface to the main system backplane as, for example, VME does. Most of the mezzanine interfaces on this site are represented by two separate pages: one page carries the particular board manufacturers and another covers the description of the electrical interface and mechanical form factor.

The oldest of the current mezzanine board formats on the market is the IP card, or Industry Pack I/O Module. It's basically a non-intelligent board format used in a number of systems, a bit dated at this point in time, but there are still Companies Producing IP Modules. Page views are almost down to zero for this board format, which seems odd as it was an I/O-based module.

The M-Module, which is just as old, still receives a few page views, but there doesn't appear to be much support left for Producing M-Module Cards.

The mezzanine board format that replaced the IP board was the PMC format, or PCI Mezzanine Card. The PMC board added a controller but still allowed for any required I/O previously handled by the IP card, so there are still many Companies Producing PMC Boards. A variant of the PMC interface is the PMC-X format. Both of these board standards use the PCI bus as the electrical interface, so really they are somewhat outdated [PCI Card Manufacturers].

The follow-on to the PMC standard was the PPMC interface, or Processor PMC format. There are a few companies that Produce PPMC Boards, which are true processor-based cards on a PCI bus. Another related board format is the PTMC Interface, or Telecom PMC standard [still using the PCI electrical interface].

These are all open standards, so the board specifications could be used on any carrier card, but some interfaces were designed specifically for the VME bus, others for the cPCI Interface, and still others for the AdvancedTCA Interface. The AMC mezzanine card was designed to interface to the new ATCA standard; being relatively new, there is a small but growing number of Companies Producing AMC Boards.

Additional mezzanine boards include the FMC and XMC standards.

Thursday, May 27, 2010

Search Key Word vs. Search Key Phrase

For a number of years now I've noticed that many of my pages do not do well in a Google search when a one-word phrase is used. I've never been able to figure out why my pages do better with two or more keywords rather than one, but I guess I can't show up first for every keyword.

So one example is the search term VME, which would be a one-keyword search. The VME page shows up in the ninth position on the first page. When the phrase is changed to VME Bus, the page shows up in the fourth position [same for VMEbus]. When I add another key phrase, VME Bus Pinout, the page shows up as the first listing on the first page. It's also the first listing for the keywords VME Backplane and VME Chassis. Hmm, looks like VME Connector places first too, in both a text and image search.

I'm using the VME bus as an example, but I could have used any computer bus because the same thing happens. I just don't understand how Google determines which page should show up first when only one keyword is used.

Yes, I know the rules: those other pages must be using the key terms more often at the beginning of their articles. I don't know; my pages do show up, but I would rather have them show up higher for a few one-word searches. I guess I need to somehow optimize the VME interface page so that it shows up better in a Google search, maybe by rewriting some of the text.