Friday, December 24, 2010

ReadyBoost Compatibility with Windows 7

ReadyBoost is a feature that uses a USB flash drive to augment system memory in a Windows PC. ReadyBoost was introduced with Windows Vista and continues with Windows 7.

I've already written a Google Knol giving a basic introduction to ReadyBoost on Vista, so this posting really relates to Windows 7. Read more on the USB Interface, part of an Engineering Web Site.

Basically ReadyBoost allows a user to plug a USB thumb drive [or other flash memory card] into a USB port and set that USB drive to act like available system memory [boosting main memory]. I did try using ReadyBoost with Windows Vista on my previous computer but never really noticed any change. However, three weeks ago I purchased a new computer, which I'd like to check out.

The new PC is a Dell Studio XPS 8100, with an Intel Core i7-870 processor running at 2.93GHz, using 8GB of DDR3 SDRAM system memory and 8MB of cache. Like most personal computers, the Dell uses revision 2.0 of the USB standard. Although there are USB 3.0 flash drives, there do not seem to be many computers supporting revision 3.0 of the USB spec yet.

Anyway, I figured I would investigate how Windows 7 handles ReadyBoost. First off, I see that Windows 7 does support ReadyBoost, but recommends adding a flash drive having a minimum of twice the system memory [16GB in my case].

Doing a quick check at one retailer, I find that a 16GB USB thumb drive costs from $22.99 to $79.99. By comparison, four 4GB DDR3 memory sticks [16GB] cost $290. So the main memory costs more than 3 times as much as a thumb drive of the same size, but it also operates much faster than the USB drive. So USB and ReadyBoost are a quick and cheap way to enhance system operation. Not that it matters; I don't need to purchase more memory for a PC I just received, but I would like to see how ReadyBoost works. I only use 24% of the PC's memory now anyway [indicated by some gadget on the desktop].
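The dollars-per-gigabyte comparison works out like this; a quick sketch using the retail prices quoted above [the 2010 figures from this post, not current data]:

```python
# Cost-per-gigabyte comparison: 16GB USB thumb drive vs. 16GB of DDR3 RAM,
# using the retail prices quoted in the post.

usb_price_low, usb_price_high = 22.99, 79.99  # 16GB USB 2.0 thumb drive
ddr3_price = 290.00                           # four 4GB DDR3 sticks [16GB]
size_gb = 16

print(f"USB flash: ${usb_price_low / size_gb:.2f} to "
      f"${usb_price_high / size_gb:.2f} per GB")
print(f"DDR3 RAM:  ${ddr3_price / size_gb:.2f} per GB")
# Even against the priciest thumb drive, the RAM costs over 3x as much:
print(f"RAM / priciest drive = {ddr3_price / usb_price_high:.1f}x")
```

Of course the RAM is doing a very different job at a very different speed, so the comparison is only about what a cheap capacity boost costs.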

The price range variation in thumb drives brings up the next issue Windows 7 has with USB drives. Microsoft indicates that ReadyBoost will only work with "fast" flash memory, and will not function with "slow" flash memory. The help file goes on to say that some flash memory devices may contain a combination of both. So USB thumb drives using slow flash memory will not work at all, while 'faster' USB drives may not be able to use all their available memory [the 'slow' portion]. The definition of, or difference between, fast flash memory and slow flash memory eludes me. The difference between fast and slow may explain the wide range in prices between USB drives; that being, $22.99 buys slow memory and $79.99 fast memory.

Most of the USB drives I just looked at did not indicate any transfer speed, although I do list the speeds from one particular manufacturer below. Having no transfer speed data, and no knowledge of what a 'fast' transfer speed is anyway, means I should only purchase a USB thumb drive that is marked ReadyBoost compatible, and maybe even Windows 7 compatible [in case this fast/slow issue is new to Windows 7].

Of the four different 16GB thumb drives available from one particular store, only one indicates that it supports ReadyBoost. Others do indicate that they perform fast transfers, but that seems more a comparison to USB version 1 than any indication of true transfer speed. Looks like I'll be ordering a 16GB USB 2.0 Flash Drive from PNY in the next few days to test out ReadyBoost.

USB 2.0 Transfer Speeds:
Read speeds: 10MB/sec., 24MB/sec., 25MB/sec., 30MB/sec.
Write speeds: 5MB/sec., 8MB/sec., 10MB/sec., 20MB/sec.
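As a rough feel for what those write speeds mean in practice, here is how long it would take to fill a 16GB drive at each quoted rate [a back-of-the-envelope sketch, using 1GB = 1000MB]:

```python
# Time to fill a 16GB drive at each of the quoted USB 2.0 write speeds.

drive_gb = 16
write_speeds_mb_s = [5, 8, 10, 20]  # MB/sec, from the list above

for speed in write_speeds_mb_s:
    seconds = drive_gb * 1000 / speed  # assuming 1GB = 1000MB
    print(f"{speed:>2} MB/sec -> {seconds / 60:.0f} minutes to fill the drive")
```

So the slowest drive on the list takes roughly four times as long to fill as the fastest, which may be the kind of difference Microsoft's fast/slow distinction is getting at.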

Note that the USB standard itself uses the terms Low-speed, Full-speed and High-speed, but how many people read a technical specification? Regardless, how do the terms used in the USB spec relate to the fast and slow used by Microsoft?

I'll update the post when I receive the new thumb drive; let's hope I don't lose it at the rate I'm losing all my other thumb drives.

Saturday, December 18, 2010

What Video Card Should I Get?

As of 2010 there are 4 major video interfaces used on computers, PC monitors, televisions, or some combination of the three. The oldest is the VGA interface, which started to appear on PCs in 1987.

The VGA interface has been upgraded a number of times and is now called the SVGA interface, although everyone still uses the generic term VGA to describe it [see note]. Even with its age, and the fact that it's the only analog interface left on a PC, it can still be found on the latest monitors and TVs for compatibility.


The search trend below tells the story: search interest in the term VGA [orange] has only dropped off slightly in the last 6 years. There is no way to tell for sure because the data is normalized, but even that slight drop may represent a large reduction in the absolute number of searches.




The DVI connector is the second oldest video interface. DVI was introduced in 1999 as a replacement for the analog VGA interface and is capable of both analog and digital operation. However, the introduction of HDMI has made DVI adoption fall off in the last few years. In fact, the organization that developed the DVI specification disbanded in 2006. The graph shows a drop in DVI interest [red], falling at about the same rate as the VGA interface. But at this point I think everybody knows what a VGA interface is; it's been over twenty years.


The two interfaces showing an increase in searches are the HDMI interface and the DisplayPort interface. From the graph, interest in HDMI appears to be growing faster than interest in DVI or VGA is declining; that makes sense with so many fielded systems using either VGA or HDMI. The large spikes represent bursts of searches, probably due to news articles or new products being introduced.


The blue line down at the bottom of the graph represents Google searches for the term DisplayPort. It may appear that there is no interest in the new DisplayPort video interface, but that is only in comparison to the vast number of searches being conducted for the other video interfaces. If you zoom in, or re-normalize the graph without the other video interfaces, the increase becomes apparent [shown below].

Interest in DisplayPort has doubled over the last few years, up fourfold from its release date.

Many times when an interface standard is released, it takes another year before products begin to hit the market. In the case of a video standard you need at least two different manufacturers producing products: one selling a video card and a different manufacturer selling a computer monitor. Then there's the issue of demand; a company making PC monitors may not want to go to the expense of designing in a new interface when the mating interface is not yet available on a video card. So in some cases it may take a few years for a new video standard to take hold.

For example, DisplayPort may have been released in 2007, but video cards with a DisplayPort interface may not have appeared until 2008, followed sometime later by computer monitors.

Anyway, the best video monitor interface to use is DisplayPort, or HDMI on a TV. They're both digital interfaces, but HDMI is more common than DisplayPort.

Note: the search trend for the term VGA shows a dramatic drop in search usage from 2004 to 2007 and a steady decline afterward. But when compared to the term SVGA, SVGA does not even register on the same graph. In other words, the term VGA is being searched for 20 to 25 times more than the term SVGA.

In general I keep my PC for around two years and the monitors for about four years. The two AOC 2436 monitors I'm currently using [AOC 2436-vh review] have both HDMI and VGA interfaces. So they don't have the newest DisplayPort interface or the outdated DVI interface, but a good mix of both analog and digital connectors. My Dell XPS 8100 computer uses an ATI Radeon HD 5770 video card with dual-link DVI, DisplayPort and HDMI outputs. Note the lack of a VGA interface.

Tuesday, December 14, 2010

Browser Compatibility

2010 Browser Usage
For years now I figured that most issues between the different web browsers had stabilized, or at least that my web pages worked fine in all of them. The HTML code used with the Engineering Web Site is pretty basic, nothing special, but there was a time when I checked the pages in the different internet browsers.

Primarily I would be looking for layout changes between the different browsers: small changes in spacing or page breaks. As a rule the pages always worked across the different programs. But all that was a long time ago; I stopped checking for page issues half a dozen years ago. Back then I would load all the different web browsers on the PC and check the pages for compatibility.

Now I just run the three major browsers: Internet Explorer, Firefox and Chrome. However, I don't run them looking for problems; I run them with each one holding different windows and/or passwords.

Anyway, over the last few months I've noticed page layout issues with my custom 404 page [the page that displays on a mistyped web address]. I sort of looked at it, but never really did anything about it because I couldn't find the problem. Now I realize that the layout problem was only showing up in Google Chrome, which I hadn't noticed before. With three different browsers, it's kind of random which pages show up in which browser, depending on what I'm working on.

So I have three different styles of pages, and two of those styles use the HTML 'div' tag to put a column on either the right or left side of the page. The pages that use a div to put a column on the right side of the page seem to work in all three browsers. However, the pages that use a div to add a left-hand column don't render correctly in Google Chrome. Any text or graphic wrapped in a 'center' tag is forced out of the column and below the content of the main section. So left-margin text remains in the div column while a centered graphic is pushed out into the adjacent column.

Lucky for me, the pages with the issue only account for 1% of the pageviews, or around 37,000 pageviews. Ten percent of that traffic comes from people using Google Chrome, so only about 3,700 of those views were affected. The 14 pages that account for most of the issue [0.67%] were just fixed, as well as some of the others. The thing is, if you never scrolled down to the bottom of the page, the layout problem might not even have been noticed.

Oh, the fix is to just remove the center tag around the graphics in the left column. That was the quick fix; there must be an HTML coding issue with the div tags, but the point is to get them working right now....
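For reference, a minimal sketch of the kind of markup involved; the file names and column width are made up for illustration, and this is my reconstruction of the problem rather than the site's actual code:

```html
<!-- Before: a floated left-hand column with a centered graphic.
     In Chrome the centered image gets pushed out of the column. -->
<div style="float: left; width: 150px;">
  <p>Left column text stays put.</p>
  <center><img src="pic.gif" alt="graphic"></center>
</div>
<div>Main page content...</div>

<!-- After: the quick fix, drop the center tag around the graphic. -->
<div style="float: left; width: 150px;">
  <p>Left column text stays put.</p>
  <img src="pic.gif" alt="graphic">
</div>
<div>Main page content...</div>
```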

So the advice would be to check your pages in all the browsers.

The pie chart represents 4,221,889 pageviews year-to-date for the web site. 239,690 of those pageviews were from people using Google Chrome.

Tuesday, December 07, 2010

Google Sites Free Web Page

V22 Osprey
I don't often blog about how my Google Sites page is doing, but as part of an end-of-year thing I'll go ahead and add a few words. I call the site interfacebus, just like the main site. Google Sites is free, so I can't really have too many complaints, but it's a bit hard to organize the pages by topic.

Each sub-page carries the address of the upper-level pages, so the URLs get pretty long. You really need to plan ahead when generating new pages, but I don't know what page I'm going to add at any given time, so some of the page addresses are really long. I guess it doesn't matter whether a person comes to the site via a search engine or whether they're navigating the site via the site-map.

The graphic is part of a report from Google Analytics, like many of the graphs in this blog, comparing last year's visits to this year's. This year the site has had 14,532 visits and 20,903 pageviews over 165 different URLs. However, only 133 pages are included in the Google index.

Many of the pages really just hold some large graphic that didn't fit on the main site, but in most cases there is a small amount of text. I don't add pages as often as on the other site, but much of the increase in visits is due to new pages, or maybe new pictures.

Saturday, December 04, 2010

Increasing Page Views

I don't really track the number, but I add around two or three new pages to the Engineering Site each month.

Of course I would hope that the new page(s) would generate more incoming visitors, but there's little chance of that. Remember it takes three months just to get a Page Rank and start to show up in the search engine results pages [SERPs]. Add maybe another month before that for Google to spider and read the page. In other words, any page I generated in the last half of the year had no chance of receiving any page views.

So the table below compares pageviews from this year to last year. It might seem like there's not much of a change, but most pages increased in pageviews. However, many pages still hover around the first three rows. So many of the pages I added received only a few page views, with almost no page views during their first three months.

However, there's another way to look at the situation: a new page may bring in a new visitor, who might then revisit the site.


The increase in pages in the first row is due to new pages. Most of the rest of the pages are stuck where they are.
There are 50 pages that make up the 6-year-old Index of Semiconductor Manufacturers section that refuse to get a Page Rank, no matter how many times I link to them. Most of the 115 pages that make up the 3-year-old Transistor Derating Curves section have also never received a Page Rank, regardless of what I've done.

I just updated a 3-year-old page that only received 8 page views this year. I pulled a 90k pic file off Picasa, reduced it to 50k and stored it on my server. I deleted another Picasa graphic altogether because it could be reached by an on-page link. I also changed a few important keywords. The point was not to get more page views; its transistor part number [the page topic] must be obsolete, so the page will never receive many page views.

Instead, what I did was make the page look less like the other 115 similar pages in that section, and make it load faster [for Google] to keep them happy. Files that are stored external to the site require a DNS look-up, which Google thinks slows down a site ~ part of Google's rating system measures page loading speed. The 50k file on my server should not hurt, because the page is never viewed except by the web spiders. And having this page appear different may help the other pages get page views, because one less page looks the same.

Anyway, if a page received 1,000 page views last year it would have to see over a 900% increase to move up to the next row. So understandably most pages stay in the same row they were found in last year, even if they had a 100% increase in views.
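That 900% figure follows from the rows being order-of-magnitude bins; here is the arithmetic, where the 10,000-view cutoff for the next row up is my assumption about the table's structure:

```python
# Percent increase needed for a page to jump from one pageview bin
# to the next, assuming the rows are order-of-magnitude bins.

last_year_views = 1_000
next_row_floor = 10_000  # assumed lower bound of the next row up

increase_pct = (next_row_floor - last_year_views) / last_year_views * 100
print(f"Increase needed: {increase_pct:.0f}%")  # prints "Increase needed: 900%"
```

So even a page that doubles its traffic [a 100% increase] stays in the same row.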


The site is currently receiving 7.21% more pageviews than last year, with 3,978,019 pageviews so far. So the site has already passed the number of pageviews received last year. The table above shows pageviews; the chart to the left shows the number of visits.

The program that generates the data is Google Analytics, and the report is Top Data, ordered from highest pageviews down [which I then just count]. Using Google Analytics requires a small amount of JavaScript code on each of your web pages.

Many of the pages in my report with only one page view for the year are really just misspelled page addresses. The viewer just sees the site's 404 page-not-found page, but the report records the address the server received. Many types of wrong addresses can be filtered out, but many can't; sometimes I can't even tell whether it's a real page address without pulling up the page. Opening the page gives it another count, which could make it look like a real page address when I check it later. So the first row of data is an estimate, because I started to get down to the misspelled pages in the report.

This is a long post just to say that adding new pages, or worrying about getting those pages included in the Google index, may not help the overall web site. You still need to come up with content that people are looking for.

Friday, December 03, 2010

Internet Browser Usage

With the end of the year approaching, I figured I would post which browsers visitors use on the engineering database. Some people watch these numbers, and even small changes make it into one of the technical papers. But like me, once in a while they probably just need something to post about.

I couldn't care less which browser is doing better than another, because I use all three at once. I have some pages open in Firefox, others in Explorer and still others in Google Chrome. I just wish they would put the buttons and options in the same place in each of the programs.

The graphic shows the steady decline in Internet Explorer usage over the last four years, and the steady increase in Firefox usage, although Firefox has increased less than 1% from 2009. It would appear that Google Chrome is taking the steam out of Firefox's growth, increasing 6% over the last year ~ 6% that I assume Firefox would have gotten if not for Chrome. Safari has also passed Opera usage for the first time, but only by half a percent.

I don't use either Opera or Safari, but years ago I did try Opera for a while. But like I said, as soon as the PC boots up, the other three browsers open.... once. The data for 2010 accounts for 3,094,865 unique visits, so it's a big data sample. Counting the return visitors would make the data sample larger, but what's the point? It would just make the browsers on the increase look better and the ones on the decline look worse.

There are other browsers being used to access the web site, but their usage is below 1% each ~ below 20,000 web visits, or about two days' worth of visits. The site receives about 9,000 hits per day.

Thursday, December 02, 2010

URLs in Web Index

This is the third revision, or update, to the posting of the web stats showing URLs in the web index. The stats relate to this Engineering Web Portal.

The table of data below shows the number of URLs, or pages, in Google's web index. First of all, I don't like to generate new site maps that often because of the bandwidth requirements of the sitemap program. That is, the program I run needs to check every page I have on the server, so it's a big hit on my server. But if you look at the data and the increasing number of URLs in the web index, you will notice that generating a sitemap over and over again is not required.

However, I do recommend running a new sitemap if you have a large number of new pages. The number of URLs in the web index always increases after a site-map is generated. However, you'll notice that in some cases the number of indexed pages falls off a few days later. I assume the program reading the sitemap finds the new pages and adds them to the web index, but sometime later another algorithm determines that a page should not be indexed and it drops off the index.

01/21/11 = 1,938 URLs in web index. [Site-map loaded]
11/30/10 = 1,875 URLs in web index.
11/04/10 = 1,854 URLs in web index.
10/23/10 = 1,805 URLs in web index.
07/24/10 = 1,779 URLs in web index. [Site-map loaded]
07/01/10 = 1,535 URLs in web index. [Site-map loaded]
06/30/10 = 1,504 URLs in web index.
06/24/10 = 1,455 URLs in web index.
06/18/10 = 1,426 URLs in web index.
05/31/10 = 1,400 URLs in web index.
05/22/10 = 1,394 URLs in web index.
04/07/10 = 1,309 URLs in web index.
03/27/10 = 1,322 URLs in web index. [Site-map loaded]
12/19/09 = 1,481 URLs in web index. [Site-map loaded]
12/13/08 = 1,318 URLs in web index. [Site-map loaded]


There are two reasons why the number of URLs in the web index is increasing. First, I generate at least two or three new pages every month, sometimes many more, so you would expect to see an increasing number of pages included in the web index as those new pages are found and included. Secondly, I'm always updating preexisting pages already residing on the web-site. So over time the updates may allow a page to be indexed, usually because it has more content [text]. Many times I'll add a new page topic that is little more than a graphic and a small description, but over time I get back to updating the page to include more data and so on. Once the page gets 'the required amount' of content, Google includes the URL in the web index.

How soon a new page shows up in the web index, without a sitemap, depends on how often your pages are crawled. The chart shows that my site has about 500 pages crawled every day, although you can't tell from that how many pages are re-crawled. So for my site any new page added is found within a few days ~ 15 days at the outside, I assume ~ so I don't really need a sitemap.
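A quick sanity check on that crawl rate, using the submitted-URL count quoted later in this post and the crawl figure from the chart:

```python
# Rough estimate of how long a full crawl of the site takes,
# using the figures quoted in this post.

total_pages = 1_946     # URLs submitted in the sitemap
crawled_per_day = 500   # pages crawled per day, per the chart

days_per_full_pass = total_pages / crawled_per_day
print(f"Full crawl in about {days_per_full_pass:.0f} days")  # about 4 days
```

Even if every crawl were a fresh page, the whole site gets covered in under a week, which is why "found within a few days" is plausible without a sitemap.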


Although it has little to do with the number of URLs in the web index, over and over I see new pages not receiving any incoming hits for months. That is, even after a new page gets included in the web index, it takes three months for the page to really start getting any visits. Web masters call that the Google sandbox. So don't think that just because a page gets indexed it will bring in a larger number of hits the next day. Also, I have groups or sections of pages that have been dropped from the index for years now. Currently I have 1,946 URLs submitted and 1,875 URLs included. That three-month 'down-time' is also the same amount of time it takes a page to get a Google Page Rank [if it ever gets one].

2N3485 Transistor Derating. 3-year-old page with zero Page Rank.
Semiconductor Manufacturers 'L'. 5-year-old page with zero Page Rank.
Electron Tube Classification. 2-week-old page with zero Page Rank.
74L121 Monostable Multivibrator IC. 1-week-old page with zero Page Rank.

Note: always run your site-map generator program at night, or when you expect low incoming traffic. Remember, that program is talking to your server, in direct competition with your visitors.

Also, I have no idea how to determine which pages of my site are not included in the web index; otherwise I might fix those pages.

The number of URLs in the web index changes every week or so; the numbers listed above are just the ones I wrote down.

Click the title to read the comments if you didn't come directly to this topic; the blog compresses the comments section unless you're on that particular page. The comments are updates to the blog post.

Wednesday, December 01, 2010

Mobile Device Usage Statistics

As part of updating the FAQ for the Engineering Site, I'm inserting a posting about which mobile devices are accessing the web site. I haven't checked this data since the last post a year ago; looks like there's been a change. Incoming visits from mobile devices have doubled over this last year. However, doubling a small number is still a small number. The Frequently Asked Questions pages don't get many hits, but I still like to keep the information updated. In order to save server bandwidth I redirect some of the FAQ data out to this blog, but it's still an interesting data point for blog readers.

I tried to check for a percentage change year-to-year in the mobile phones using the site, but for whatever reason Google didn't have it [or said it was estimated]. So the only thing I'll show is this year's data and which phones are being used the most, with no change from year to year. Of course most of the visits [46%] are coming from an iPhone, which is understandable.

A few years back there was a particular way a web page had to be set up to work on a mobile device ~ I assume that's no longer required with the newer systems and bigger screens. Anyway, it doesn't matter; I don't have time to rewrite 2,000 web pages just for 3,000 hits a year. The site receives 9,000 a day from people on a normal computer. But it is nice to see an increase in any of these reports I pull up; it makes me think I'm getting someplace.

This graph shows visits, which are increasing. However, all the other metrics are worse than those of normal computer visitors, except percent new visits. Meaning most mobile visitors are new to the site, but they leave faster and view fewer pages than normal computer 'people'.