Showing posts with label SEO. Show all posts

Saturday, March 19, 2011

What is Google's Farmer Update?

The Farmer update, or Panda update, was one of the few hundred algorithm changes Google makes to its search each year. The Farmer update got its name because the change was aimed at removing content farms from the search results.
A content farm is a website set up just to trap keyword searches, but with little or no real content: maybe hundreds or thousands of articles designed to bring in search traffic. The articles are usually poorly written and simply stuffed with keywords to attract traffic.

The Farmer algorithm update went live in the US on February 24th; an update to non-US searches will come some time later but has yet to be announced. So if you have a US site and saw a decline in Google search traffic starting on or after the 24th of last month, you could be affected by the Farmer update.

One of the few comments Google has made regarding the update is that a weak page on your site can now affect the entire site [which is somewhat new]. In the past it seemed that a weak page with little or no content would just never be indexed by Google, but it never seemed to drag down other pages on the website. Anyway, the forums are full of people giving advice, but the posts are just guesses.

So anyway, I wanted to detail a few changes that I made to the website, not that I know I've been affected.
1. Deleted a dozen orphan pages from the server. The orphan pages had been out there for years and were misspelled page addresses or abandoned pages that were no longer used. At one point they had newsgroups or whatever linking to them, so I saw no reason to show a 404 Not Found page. But after a few years, I'm sure no one is reading those old newsgroup posts and clicking in.
2. Redirected another dozen pages to something else. These pages were still being used, but had more weight years ago. As time passed they slowly lost content due to links going bad, or from no interest on my part in trying to fix them; however, 10 years ago they were valid pages.
3. Removed about a dozen pages of advice on using Adsense, as their page views had been falling over the last few years and they had no relation to any engineering topic.
4. Removed a few ads on some of the pages, only because the content had decreased over time as links on those pages stopped working. Removing an ad was part of a normal page review and had nothing to do with the algorithm update. However, some people in the forums are saying too many ads could affect a page's ranking, and so could ads above the fold ~ I think that's garbage.
5. Fixed a number of bad links on the site, but I do that all the time. Outgoing links are always going bad.
6. Removed dozens of blog posts from the early years of this blog. I've also done this from time to time in the past. Sometimes I'll write a blog post just to keep the blog going, but in the big picture the post is of no interest to anyone but the few people following the blog. If Google is going to rank the entire blog based on these thin posts, then it's better just to remove them. This blog had 534 posts in December, but now has 512.
7. Kept optimizing HTML code on the site to reduce loading time, but I've been doing that for a year now.
8. Kept moving pic files off Google Picasa and storing them on the server, which should help with page loading; I've also been doing that for a year now.
9. Removed the Google Toolbar because it seemed to be tracking me more than the site visitors. The toolbar measures page loading time and was showing wide variations when the load times should have been more even. I figured the toolbar was tracking 'my' page loads as I brought up slower pages looking to optimize them. In other words, I would open a 'slow' page to fix it, and in doing so show Google a slow page. So Google was seeing more slow pages because of my toolbar usage than it normally would.
10. Updated pages with the new, faster Google Analytics tracking code. I was doing that anyway, and again to decrease page loading time.
11. Updated or enhanced page content ~ I do that anyway.
12. Deleted dozens of outgoing links that were valid but seemed outdated ~ maybe they had a copyright of 2004 on their page. Normally the page had too many links anyway, so no harm done.
13. Added a few more nofollow tags to some outgoing links, again because their pages seemed outdated.
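As a sketch of how step 1 might be automated, here's a hedged Python example that walks a local copy of the site and reports HTML files that no other local page links to. The directory layout and the href-only regex are assumptions; a real pass would also account for incoming links from sitemaps and external sites before deleting anything.

```python
import os
import re

def find_orphan_pages(root):
    """Return HTML files under `root` that no other local page links to."""
    pages, linked = set(), set()
    # match relative href targets ending in .htm/.html; skips http:// links
    href = re.compile(r'href="([^":#?]+\.html?)"', re.IGNORECASE)
    for dirpath, _, files in os.walk(root):
        for name in files:
            if not name.lower().endswith((".htm", ".html")):
                continue
            path = os.path.join(dirpath, name)
            pages.add(os.path.normpath(path))
            with open(path, encoding="utf-8", errors="ignore") as fh:
                for target in href.findall(fh.read()):
                    # resolve relative links against the linking page's folder
                    linked.add(os.path.normpath(os.path.join(dirpath, target)))
    return sorted(pages - linked)
```

Anything the function returns is a candidate for deletion or redirection, not an automatic delete.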

I've been making these changes both before and after the 24th of last month, and I've seen no difference in page views. Any decrease in page views that has occurred has really only hit a few high-page-view pages; other pages are up or down as normal.

Comparing Feb to March the site is down 10%
Comparing Jan to March the site is down 0%
However March is normally higher than either of the previous months.

At the same time, it could be that my site was not hit by the algorithm change; rather, some other sites are just doing better in the SERPs now ~ there's no way to tell. In any event, these are normal Search Engine Optimization [SEO] type changes.

Monday, August 30, 2010

Download Speed vs Site Performance

I check Google's data on the engineering site all the time, and today I rechecked the Site Performance report. At least for the last few weeks, the time it takes to download a page has decreased [always a good thing]. The attached graphic shows the amount of time it takes Google to download pages on the site. The numbers don't represent a single page but the average of some number of pages Googlebot tried to read. I don't really think the data is that accurate, because it never takes 4 seconds to download a page, but I watch the data because that's what Google is looking at to determine site performance.

There is no way to tell why my site is coming in faster. The server could be working faster, Google could have pulled fewer pages, or Google could have pulled smaller pages; who knows? Well, I can check how many pages Google crawled per day, and it's about the same between now and mid-June. Google Crawl Stats indicate an average of 673 pages per day over the last few months.

However, I've been working on making the site faster for months without really getting anywhere. The basic problem is that any time I update a page and make the HTML smaller [less code], I add more data, making the page larger. So I may make the HTML code more efficient [decreasing download time], but then I add more human-readable text, increasing the download time.
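A minimal sketch of the kind of HTML trimming described here, assuming pages where whitespace between tags is not significant [it would mangle <pre> blocks and inline spacing, so this is an illustration, not a drop-in tool]:

```python
import re

def minify_html(html):
    """Rough sketch of hand-shrinking HTML: drop comments and extra whitespace."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # strip comments
    html = re.sub(r">\s+<", "><", html)                      # whitespace between tags
    return re.sub(r"[ \t]{2,}", " ", html).strip()           # runs of spaces/tabs

minified = minify_html('<p>Hello</p>\n    <p>World</p>')
```

Every byte removed this way shrinks the download without touching the visible text, which is the tension the paragraph above describes.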

The current data [below] indicates the average time to download a page on the site is 2.6 seconds. This is only important because Google rates sites by download speed, so it's Search Engine Optimization [SEO].

Related Blog Posts:
Page Download Times [7-15-2010] 3.1 seconds to download
Speed Performance Overview [6-16-2010] 2.8 seconds to download
Web Site Performance [4-22-2010] 3.7 seconds to download
Google now ranks pages by speed [4-14-2010] No speed data.
Website Speed Performance [4-3-2010] 3.7 seconds to load.

Wednesday, June 30, 2010

URLs in Web Index

I'm going to partially re-post a blog entry I wrote just a few weeks ago regarding web pages included in Google's index. I've only added a dozen pages in the last two months, so the increase in the number of pages indexed has to be due to the SEO changes being made to pre-existing pages already on the web. The index history relates to an engineering portal on the web.

Index History:
6/30/10 = 1,504 indexed pages
6/24/10 = 1,455 indexed pages
6/18/10 = 1,426 indexed pages
6/9/10 = 1,411 indexed pages
5/31/10 = 1,400 indexed pages
5/22/10 = 1,394 indexed pages
4/7/10 = 1,309 indexed pages
3/27/10 = 1,322 indexed pages [Site-map loaded]
12/19/09 = 1,481 indexed pages [Site-map loaded]
12/13/08 = 1,318 indexed pages [Site-map loaded]


That's not too bad: 12 new pages added, but 100 more pages indexed by Google.

Does the increase in indexed pages imply that all of a sudden the site will start receiving a great many more visitors? Not really. But it does mean that the newly indexed pages will show up in a Google search for their topic; maybe not first in the list, but at least they will show up now.

 No Page Rank; Company index 'M'.

I think I'll run a new sitemap later tonight, once the traffic slows down. I'll get those dozen or so new pages included, and remove a few FAQ pages that have been de-linked over the last few weeks. I did have a few pages that are not being used but still had links pointing to them; Google dropped them from the index long ago ~ time to remove the links and orphan the pages.
I'll add a comment to this post once Google has time to read the new sitemap.

Graphic; Number of visits to the web site 1/1/10 to 6/29/10 [mid-year update].
Year-to-date = 17.48% increase in visits over the same time period last year [192,858 more visits].

Wednesday, June 16, 2010

Speed Performance Overview

Just a quick update to show the latest graph of site download time, or site performance. Google now indicates an average download time of 2.8 seconds, which is faster than 51% of internet sites. This is down from 3.7 seconds in the last posting. The previous posting was Web Site Performance, 4/11/10.

Thursday, June 10, 2010

Google Caffeine and Indexing

So Google came out with a new indexing system that updates the Google index much faster than in previous years. The new indexing system is called Caffeine [Google Caffeine], and allows the index to change each night.

It was common knowledge that once every thirty days Google would shuffle its indexed pages, so if you were number 2 in the search engine results one month, that same page would change position the next month. Sometimes the page would get a higher position, and sometimes the index change would leave the page in a lower position.

However, at the same time, many of the new pages I generated would show up in the index the next day because I would blog about the new page address in the New Engineering Pages blog. Google operates these blogs [Blogspot], so it reads or spiders them all the time. Any new page mentioned in Blogger gets noticed by Google much faster than anything on an external website [I assume]. Before I started blogging I would wait 2 to 4 weeks for a page to get indexed, but after I started blogging any new page would get picked up within several hours.

So for Search Engine Optimization [SEO], blog about any new page you generate, because it gets picked up much faster than waiting for the spider [Googlebot] to find it, and much faster than having to generate a new sitemap for each new page addition.
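For what it's worth, Google also accepts a sitemap 'ping', so a new sitemap can be announced without waiting for the spider. A hedged sketch that just builds the ping address [example.com is a placeholder; fetching the resulting URL does the actual notification]:

```python
import urllib.parse

def sitemap_ping_url(sitemap_url):
    """Build the Google sitemap-ping address for a given sitemap URL."""
    # percent-encode the whole sitemap URL so it survives as a query value
    return "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

ping = sitemap_ping_url("http://example.com/sitemap.xml")
# fetching `ping` with urllib.request.urlopen would notify Google of the sitemap
```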

So what is Caffeine? Well, it's Google changing your search engine position every day instead of every month. However, because your page position might change every day as it was displaced by one of my new pages, I'm not really sure I see the difference. However, Google does indicate a recent drop in the number of my pages being displayed in the search engine listings, or really the number of pages being clicked on.


So this graph depicts the number of times my pages are shown in the search engine results [blue] and the number of times somebody clicks on one of my pages [yellow] in the search results. A drop in click-through rate [yellow] in the last few days should indicate that although my pages are still showing up in the search results, they now appear lower down ~ maybe page 2 of the results instead of page 1.
But at the same time, Google Analytics reports that there has been no drop in visits, and I trust the Analytics report more than the report from Google Webmaster Tools [graphic above].

If I run the above report for the term Can Bus, I see no real reduction in the click-through rate, while the graph shows all search queries [all search terms] that relate to my website.

Monday, May 31, 2010

Link Checking

Just a quick note to indicate that the links on the engineering site have just been checked and verified. There were a few bad links, but not that many. I still have some sites to check within the next few days, but I need to give those websites time to come back online.

The attached graphic shows the data generated by Xenu. It appears that, at a minimum, 99.29% of the links on the site are good, and that number should increase as I check the remaining links in the report.

The last time I showed this report was November 20, 2009, as Xenu Report, Statistics for Managers. That report showed 99.25% good links, out of a total of 6,950 URLs.

Of course there is one day remaining in the month, but the site numbers look kind of low. Today should not be any different, because the site never does well on holidays. This will be the lowest-performing month of the year, yet still better than any month of any previous year. I should be posting those numbers tomorrow, as my server counter does not update until about 5 am.

Of course, if you have outgoing links on your site you should always check them to ensure they point to a valid page address. I use a free link-checking program called Xenu.
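The first step of what a checker like Xenu does, collecting every outgoing link before testing it, can be sketched in a few lines of Python [the sample page is made up]:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from a page, the first step of a link check."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="http://example.com/">ok</a> <a href="pinout.html">local</a></p>'
collector = LinkCollector()
collector.feed(page)
# each collected link would then be requested [e.g. an HTTP HEAD] to verify it resolves
```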

As a side note: site visits so far this year have passed all of the visits for the entire year of 2005.

Thursday, May 27, 2010

Search Key Word vs. Search Key Phrase

For a number of years now, I've noticed that many of my pages do not do well in a Google search when a one-word phrase is used. I've never been able to figure out why my pages do better with two or more keywords rather than one, but I guess I can't show up first for every keyword.

So one example is the search term VME, which would be a one-keyword search. The VME page shows up in the ninth position on the first page. When the phrase is changed to VME Bus, the page shows up in the fourth position [same for VMEbus]. When I add another key phrase, VME Bus Pinout, the page shows up as the first listing on the first page. It's the first listing for the keywords VME Backplane and VME Chassis too. Hmm, looks like VME Connector places first as well, in both a text and an image search.

I'm using the VME bus as an example, but I could have used any computer bus, because the same thing happens. I just don't understand how Google determines which page should show up first when only one keyword is used.

Yes, I know the rules: those other pages must be using the key terms more often at the beginning of their articles. I don't know; my pages do show up, but I would rather have them show up higher for a few one-word searches. I guess I need to somehow optimize the VME interface page so that it shows up better in a Google search, maybe by rewriting some of the text.

Thursday, May 06, 2010

Clickthrough Rate

Over the last month or so I've been updating pages to display the new Google search bar, primarily because it uses 1 kB less HTML code. So far just over a thousand pages have received the upgrade with the new code. But what about all the other page metrics I have to worry about? Maybe it's time to work on a few of those, as in:

Page Bounce Rate
Pages with low Pageviews
Pages with a high Exit Rate
And on and on; there are just so many different ways to measure how a page is doing on the internet.

One metric I don't look at that often is search-engine click-through rate, or how often a page that shows up in a Google search actually gets clicked on. The search term or query could be anything, as long as it causes my engineering website to show up in the results returned by a Google search.

Of the 6,091 queries, Google provides no data at all for the bottom 4,500 searches. I assume the page impressions are too low [as a percentage] to track; however, I really don't know. There are another 1,000 queries that display impressions but no click-through data, maybe because the click-through rate is below 1%, since the first term with click-through data shows a 1% rate; that would be the worst search term with any data. The 'worst' Google search term in the report is "DVI", which I assume relates to my page on the DVI interface. That DVI page shows up on the first page of results at #8 out of 32,500,000. The DVI search term caused my page to be displayed in the Google search results 6,600 times, with only 58 people clicking on the link [over the last month].

So there are thousands more search terms with different impressions and click-through rates. For example, the term 'Derating' had only 58 search impressions but 12 click-throughs. So the question is: should I work on pages with no or low search impressions, or on pages with a low click-through rate? Whatever; working on any page based on this data would be Search Engine Optimization [SEO].
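The click-through numbers above reduce to a one-line calculation; a small sketch using the figures quoted in this post:

```python
def click_through_rate(clicks, impressions):
    """Click-through rate as a percentage of impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# figures quoted above: "DVI" had 6,600 impressions and 58 clicks,
# "Derating" had only 58 impressions but 12 clicks
dvi = click_through_rate(58, 6600)       # under 1%, so Google shows no CTR data
derating = click_through_rate(12, 58)    # roughly 21%, despite far fewer impressions
```

Which is the whole dilemma: "Derating" converts far better but is barely seen, while "DVI" is seen constantly but barely clicked.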

Related Blog posts 
[Custom Search Bar 4/2/10]
[Web Site Speed Performance 4/3/10]
[Web Site Speed Enhancements 4/15/10]
[Web Site Performance 4/22/10]

Graphic: Top Search Queries as Page Link Impressions vs. Clickthrough Rate.

Thursday, April 15, 2010

Web Site Speed Enhancements

So, to follow up on three of the last four postings regarding page download times: I'll go ahead and detail a few of the things I've been doing to the website to decrease page download times, depending on which pages received a change. I really can't make a single change that would affect the entire website; each page is completely separate from the others.

So the new Google search bar is now on 684 pages. Each page that gets the new search code sees a reduction in HTML code of 1,480 characters [1,480 bytes] ~ I started replacing the search bar code a few days before I posted about it in Custom Search Bar. Just 1 kB may not sound like a lot of data, but it is when you consider how often these pages are downloaded, perhaps 15,000 page requests per day. It's a big saving on server bandwidth [over time], and Google sees each page as 1 kB smaller too, which was the point. People pay for server bandwidth by the month, so my reduction would be 400,000 pages x 1,480 bytes, once all the pages get switched over. The second benefit is that the old search code from Google used an engineering-site logo stored off-site, which Google saw as another DNS lookup and a drain on page loading time. So this change saves the site one DNS lookup and 1 kB per page.
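The bandwidth saving works out like this [a rough sketch using the figures above; hosts bill in different units, so treat it as an order-of-magnitude estimate]:

```python
# back-of-the-envelope saving from the 1,480-byte search-bar trim
bytes_per_page = 1480
page_requests_per_month = 400_000   # monthly figure from the post

saved_bytes = bytes_per_page * page_requests_per_month
saved_mb = saved_bytes / (1024 * 1024)
# works out to roughly 564 MB of server bandwidth per month
# once every page is switched over
```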

Using data from FeedBurner [the blog feed] and Adsense [advertiser], I determined that not many people were reading the blog as a news feed. Plus the news feed was not generating any revenue, so I decided to remove the FeedBurner banners from the website. Right, why publicize it? The banners take up space, slow the page down, and produce no income for the site. In addition, the banners required 663 characters of HTML text and an additional DNS lookup. The downside is that the banners were only on about 6 pages, so the savings are small, but those 6 pages should make the entire site appear faster [to a small degree].

Five GIF files have been removed from the site; two were reinserted into this blog. The attached graphic shows monthly traffic to the website for 2009. In addition to each page losing the graphic and seeing the size reduction, this blog gets a link from the website indicating the new location of the graphic. The 5 pages also no longer require another DNS lookup, because the graphics were out on Google Picasa. Now, the FAQ pages never received that many hits and the GIFs were out on Picasa, so my server sees no change. However, Google will see the loss of a DNS lookup and the disappearance of five 80 kB pic files.

In addition to removing those pic files, I also reduced the size of another 14 GIF files, saving between 20 and 30 kB per file. Yet another small change, but these files were local, so the server will also see a reduction.

Any single change is small, but the aggregate speed increase to the website should help. I'll find out in a few weeks when Google updates the Site Performance report again. It's hard to tell, but this is Search Engine Optimization [SEO], because Google uses download time as part of its ranking algorithm.

Thursday, April 01, 2010

SEO stuff really works

The numbers are in from last month, so I guess I should post them. The numbers seem to be on the increase, more than I figured. Although I can always predict the outcome based on the average numbers ~ maybe 10,000 visits per day, and 5,000 on the weekends. However, I did not expect to see the large jump in page views, which is finally up to the numbers from back in 2006. I guess the trick now is to keep the numbers up there~

How to read the data:


Server Bandwidth:
The lowest curve is server bandwidth and does not relate to the other numbers on the chart. The bandwidth is hovering around 148,000 [in the graph] but really equates to 14 GB, as the numbers were scaled to fit the graph. I track bandwidth just to make sure the server does not see a heavy load.

Unique Visits:
Visits from a computer within a month, where any one computer is counted only once. If a computer returns for a second visit, that return is counted by the Visits curve.

Visits:
A site visit is registered each time a person visits the site within a month and each time the person returns to the site. Site Visits should always be equal to or greater than Unique Visits.

Page Views:
The number of pages a person views per month, regardless of how many times the visitor returns to the website. Page Views should always be equal to or greater than Site Visits. Page views are really the only data point that is falling. Page Views are related to Bounce Rate, which is the percentage of visitors who view one page and then leave the site.
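The three definitions above, and the ordering between them, can be sketched against a made-up visit log:

```python
# toy visit log: (visitor_id, pages_viewed_during_that_visit)
visit_log = [
    ("alice", 3),
    ("bob", 1),
    ("alice", 2),   # alice returns: a second Visit, but not a new Unique Visit
    ("carol", 1),
]

visits = len(visit_log)                                      # every visit counts
unique_visits = len({visitor for visitor, _ in visit_log})   # each visitor once
page_views = sum(pages for _, pages in visit_log)
# bounce rate: share of visits that viewed exactly one page, then left
bounce_rate = 100.0 * sum(1 for _, pages in visit_log if pages == 1) / visits

assert unique_visits <= visits <= page_views  # the ordering described above
```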

Another way to see the same data is as site visits, or number of visits, so a comparison can be made year over year. This chart makes it easy to see that site visits are higher than in any other month of any previous year.
2005 was the year I started to follow Search Engine Optimization [SEO] techniques. I guess the SEO stuff really works.

Monday, March 29, 2010

Picasa Web Albums

So it would appear that Google's Picasa Web Albums is offline, and I'm not really sure how long it's been down. The website [Engineering Portal] uses a lot of graphics and pic files, but not all of them are local to my server. I off-load some of them to Picasa to keep server bandwidth down, which runs around 13 GB a month. Right, if someone else is serving the picture files, then my bandwidth is not affected.

It's hard to say how many pic files reside on my server, as some could still sit in one of my directories but no longer be used. So the count is an estimate, but it would appear there are 687 picture files local to my server [give or take]. Picasa, on the other hand, is serving another 919 graphic files [or not], because it appears that currently over 900 pic files are not showing up on the website.

So is it a good idea to upload your files to another server? Well, if you're doing it to show friends, sure. But what if you're trying to run a business? I guess I don't have an answer, but I am saving on my bandwidth. It's not saving me any money, because my bandwidth limit is much higher than 13 GB. What I am saving, or enhancing, is page load time [I hope]. If the page is downloaded from my server and a pic file is downloaded from Picasa, the visitor should see the page render faster. Or, if the files were all on my server, what would the bandwidth be then?
Maybe a standard graphic file is 10 to 20 kB, and there are over 360,000 page views a month [over 380,000 page views this month]. That seems like a lot of downloading [saved]......

So I see that the graphics are down in Blogger as well [this blog]. I guess that makes sense, because Google stores the blog graphics in Picasa too. I was going to attach a graphic showing server bandwidth vs. page views, so that will have to wait. I'll add a link to this posting as a comment a bit later [SEO Techniques]. And it seems like just a few days ago the website was offline for a few hours too.

Oh, SEO stuff: the website link 'Engineering Portal' points to the normal site; I'm just using a different term to describe it ~ for the search engines, to try and ensure those words are assigned to, or related to, my home page...

Saturday, March 27, 2010

Site has been assigned special crawl rate settings

So I just checked Google Webmaster Tools to see how the site was doing [looking for any issues]. Plus I had just uploaded a new XML sitemap. The last update of the XML sitemap was in December, so this new one covers the 20-odd pages added in the last few months.

Anyway, because the site was offline for a few hours the other day, I checked the crawl rate. I no longer have the option of changing the Google crawl rate [for Googlebot]. Instead of a crawl-faster or crawl-slower selection, I see this message:
Your site has been assigned special crawl rate settings. You will not be able to change the crawl rate.

Under the Crawl Errors page I don't see any issues. There are 16 pages not found, but those are all misspelled page addresses from other websites [incoming links], which I can't do anything about. So I checked how Googlebot is crawling the engineering website, and all looks well. In fact, it seems that over the last three years around 500 pages have been crawled per day. Here is the site crawl-rate history over the last few years [Engineering Blog with a search term of Crawl]. So what is the deal, and what does Special Crawl Rate mean?

I tried a Google web search for the terms 'assigned special crawl rate' and found a Google newsgroup with dozens of people asking the same thing. Well, newsgroups are not that great a place to get information, only because so many people post a reply just to post, without ever answering the question. Some people said it was for large sites; others replied that they had a small site. Then people would say that's why thousands of their pages are not indexed [which is a different issue altogether].

So I can't answer the question of why your site has been assigned a special crawl rate, because I don't know why my site received it. But I can say that, at least for my site, there has been no change in the crawl rate ~ all the way back to 2007. Click the crawl-rate graphic for a larger image of year-to-date crawl rates.

I could add that my site does not even require a sitemap; any new page gets spidered within about a month. Of course this blog gets spidered today; remember, Blogger is owned by Google. Oh, if I didn't already say: Google indicates 1,653 URLs submitted in the XML sitemap, with 1,322 URLs indexed.
Attached graphic; Google Crawl Rate for January 2010 to March 2010.

Saturday, March 13, 2010

PageViews performance

Yep, I went out and looked at the performance of the new pages that have been generated so far this year. In most cases the page views are very low, maybe one or two a day, or no views at all.

I opened three or four of the different pages and added a bit more text where possible. But a number of these pages were generated around a graphic, so either the text is already embedded in the picture or the graphic doesn't really require any additional text. So I'm kind of stuck: the pages are un-fixable because there's really nothing wrong with them, except for the fact that they bring in zero traffic.

Now, I know I need text on a page to bring in traffic from the search engines. It's standard Search Engine Optimization [SEO] stuff, day one. But these pages were generated around a graphic file, with little text required. Maybe I should stop generating a new page just because I have a picture file I want to use. Still, I'm not even getting hits from people already on the site [engineering], nor am I getting any traffic from people doing an image search.

The question is what to do? Some of these pages were generated 3 months ago and have only seen a dozen hits.
1. Well I added some additional text to a few of the pages.
2. I also commented on a few of the blog posts that announced them. I always comment on my own postings to indicate updates or changes to the original post. Of course, the comment enhances the blog page, because more text is added. Remember, a blog post is also a web page.
3. Then there is this blog posting, with links to the pages that need help.

Now, I talked about this same issue last December [New Page Generation and Page Views], only about how new pages did over the entire year. Normally I never worry about a page until it's at least 3 months old, which some of these are. But these page views are so low there's just no getting around the fact that they must have some kind of issue.

Panel Mount LED. Holds a graphic of a few LEDs. Added 1/7/10, zero page rank
Capacitor Networks. Graphics and a bit of text. Added 1/15/10, zero page rank
Via Stubs in PCBs. Definition of a Via Stub. Added 1/17/10, zero page rank
Jumper Headers. Holds a graphic of some jumpers. Added 12/31/09, zero page rank
These pages have only received between one and two dozen page views so far; subtract a few because of me. Oh, and page rank doesn't mean anything, but it wouldn't hurt if they got one soon. Of course there are more pages in the same boat, but why fill the blog post with a bunch of page links.

Tuesday, February 02, 2010

SEO Techniques that work

I must be doing something right, as last month had the highest numbers of both Unique Visitors and Visits [same person with more than one visit].

In addition to adding content and a few dozen new pages over the last few months, I have also been doing Search Engine Optimization [SEO].

As I've indicated in previous posts, the SEO techniques primarily include adding 'alt' tags to pic files and 'title' tags to internal links. Many pages already had the HTML tags, but many did not, and some pages only had part of them. I've been making these enhancements over the last six months.

So my advice is to ensure that every pic file has an 'alt' tag that describes what the picture is. In many cases a pic file also gets a 'title' tag, which describes the picture in greater detail. At the same time, any page-to-page link also gets a 'title' tag, if it didn't already have one.
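A small sketch of how you might audit a page for missing 'alt' tags with Python's standard html.parser [the file names are made up]:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Flag <img> tags that lack the 'alt' attribute recommended above."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            # record the image source so the page can be fixed by hand
            self.missing.append(dict(attrs).get("src", "?"))

page = '<img src="vme-pinout.gif" alt="VME bus pinout"><img src="logo.gif">'
finder = MissingAltFinder()
finder.feed(page)
# finder.missing now lists the src of every image with no alt text
```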

Now, you could say that the increase in visits is due to content additions, or from adding new pages. However, I would disagree, because although I have been adding a lot of new content, the per-page content added is small. Much of the content added over the last six months is spread out over several hundred pages.

Secondly, the number of new pages added over that time frame is also small, around 61 pages in six months. Now, you could say that's a great many additions to the site, but I'll try to explain why it's not.

First off, any new page added last month [and in December too] is still in the Google sandbox and is irrelevant [Trimmer Resistor; random example]. Also, any page added last month may not even have been completely spidered. So we'll forget about the 22 most recent page additions. November saw a lot of Component Package additions, pages holding GIF files with little or no text ~ another 17 pages. October had a few 'How-To' additions, which receive no hits to speak of ~ 8 pages. Same thing with September, just a few BJT outline drawings, which will receive zero visits ~ 4 pages. Now we're back to August, which is about the time frame that should start to matter, but again, a few 'How-To' additions ~ 3 pages.

So I would discount 54 of the last 61 new pages as not adding anything to the site yet [really, all of them]. I always assume 15 to 30 days before Google spiders a page, then another 30 days before Google reads the page completely, and still another 3 months after that before a page gets a Page Rank [if at all]. I'm not saying these pages aren't getting any page views, just that the few dozen page views they are getting wouldn't show up as a change on the graph.

I contend that the increase in visits is due almost entirely to optimizing the pages for the search engines. Really, I've been adding pages to the site for years and never saw a sustained increase in visits ~ but I do now. Search Engine Optimization has been the biggest change I've been making to the greatest number of pages over the last few months. The last few months are the highest for the site ever, not counting December [always a low month].

Now these 'alt' and 'title' attributes also help site visitors, either by showing text when the gif doesn't load or by providing detail when they hover over a link ~ it's win-win.

Now there is always one more point someone could make about the increase in page visits: that Google's algorithm pushed this site up near the top of the search listings while another site moved to the second page. That argument could work for one month, but not for the last four months, as Google re-orders its listings every month. More importantly, we're talking about 1,600 individual pages; Google didn't push all of them to the first page!

Finally my page on Google Sites [Thermal Impedance ~ random page] is bringing in about 60 people/month, 73% of which are new to the site ~ word of mouth traffic? Google Knol is bringing in yet another 60 visits per month at 57% new visitors [Component Derating ~ just a random page]. However neither of those referrals adds up to the 10,000 additional visits which occurred last month, but they don't hurt either.

Right, we're trying to explain 20,000 new visitors, and the only explanation I have, other than the SEO changes, is that somehow the few hundred visitors due to page additions or referrals add up into the thousands ~ I don't think so. Add the HTML tags.....

Sunday, January 31, 2010

Page Optimization

I'm always adding new content to the web site [Engineering Buses], either by inserting additional content into existing pages or by adding new pages. So far this year 10 new pages have been added to the site; I don't track content enhancement. However I can tell how many pages have been updated, whether from SEO practices or content additions, by the modified date on the PC.
Jan; 327 page updates.
Dec; 328 page updates.
Nov; 389 page updates.
Oct; 296 page updates.
And so on. Like I said, I can't tell why a page was updated. However I can say that a page either received an embedded enhancement to help the search engines or had additional content added, and maybe an SEO change as well. Normally when I fix a page for optimization reasons it just means I added 'alt' or 'title' attributes to links or gif files already present on the page.

I also add graphic or image files all the time. Currently my PC indicates over 1,600 image files [in a few different formats: gif, jpg...], however some may not be used even though they still reside on my PC. Call it maybe 1,500 different image files and 1,851 html files ~ both growing all the time.

So this month will have the highest number of visitors ever, not even counting today [numbers aren't in yet]. Here are the best 5 months [Number of Visits]:
Jan 2010: 234,085
Oct 2009: 223,689
Nov 2009: 223,327
Mar 2009: 220,620
Jan 2008: 217,694
Graphic; Google Crawl Stats, last three months. At an average of 519 pages crawled a day, you would think that Google would find the 300-odd pages that are updated each month.

Thursday, January 28, 2010

URLs Indexed by Google

This is the big question, right?
How do I get my pages indexed by Google?
I would say that the bigger question is how do I keep my pages indexed by Google.
Oh and the answer to either of these questions is of course '42'.

What I just noticed is that incoming links from other sites' pages that are not themselves indexed do not count as incoming links. Maybe I already realized that fact, as I've seen many pages over the years fade in and out of the index. That's one of the many reasons a page rank will go up or down, as pages fade in or out.

So using Google's Webmaster Tools to check the status of incoming links, I found this: most of the pages relating to the Dictionary of Capacitor Terms [the topic page with the lowest page views] only had one external link pointing to them. That might be ok, but basically all pages should have at least one external incoming link, from my sitemap [interfacebus Site-Map]. The sitemap is located out on 'Google Sites', which is external to interfacebus.com. The difference here is that I know the Capacitor Terms were only added to the site a few years ago. All new pages (page additions), in the last few years, get blogged about in the 'What's New Blog' [Engineering Pages]. So if I blogged about adding Capacitor Definitions [blog 11/17/07; Electronic Capacitor Dictionary], why is the external link from Blogger [owned by Google] not being recognized? It certainly can't be because Google has not found it; it's been out on the web for two years.

Now that blog page, providing all the links, does have a page rank of zero [really no page rank available]. So can I assume that the page is also not indexed in Google's listing?
Now I see another blog page adding two new pages to the Capacitor Terms [blog 10/26/08; Dictionary of Capacitor Terms]; that blog page also has no page rank, and the two new Capacitor page links are also not recognized in Webmaster Tools. I blog about new pages for two reasons: first to show visitors what new pages have been added, and second to set the new page up with an incoming external link [SEO stuff].

Standard Search Engine Optimization [SEO] says to try and get external links pointing to your pages; well, that's what I was trying to do. But if the pages I use drop to a page rank of zero once they fall off the front of this blog, then it's kind of pointless. I could add a 'history' page on the site, pointing to these blog pages. That would give the blog listings a page rank. Seems like a lot of effort, and kind of hard to keep track of [with all the different topics on the site]. Of course just blogging about the other blog pages helps [if that makes any sense]. I'll have to think about what SEO advice to give.

However what I can say is that just because you blog about a page, it doesn't mean the benefit will last more than a few months. So you have to find a way to keep a page rank for the pages that are used as 'external links'. One way would be to cross-link, as I just did with this posting [pointing to the older blog posts]. Of course another way would be to re-post the pages, which I'll do now to make a point. However I don't recommend listing your page links over and over again in a blog ~ why would anyone read the blog.......
Capacitor Dictionary pages, listed by the first term on each page.
Air Dielectric definition. [Page 1, main topic]
Capacitor Breakdown definition. [Page 2]
DC Leakage definition. [Page 3]
Electrode definition. [Page 4]
Farad definition. [Page 5]
Ganged Tuning definition. [Page 6]
Impedance. [Page 7]
Capacitor Manufacturers. [Page 8]
Paper Capacitor definitions. [Page 9]
Quality Factor definition. [Page 10]
Self Inductance definition. [Page 11]
Tantalum Electrolytic Capacitor definition. [Page 12]
Values, Capacitor definition. [Page 13]
..... This may appear to be a long post, but it's an important SEO topic that needs to be addressed.
You'll note that I link to the Capacitor pages with a key word, and not just 'Page 8' [for example], which means nothing to Google. Of course I also updated any of these pages as required.
Graphic; Leaded Chip Capacitor, Through-hole Capacitor.

Wednesday, January 06, 2010

Google Sites [Nofollow]

Although I do like Google Sites I did want to mention one particular issue with using the free service, which some people may find important. But first a few good things about Google Sites.

First off Google Sites is free to use, there are no fees or costs involved with the service.
Second, it's easy to generate web pages ~ like this one [ http://sites.google.com/site/interfacebus/Home ].
I think they call them web pages, rather than a web site; I assume that's because you can only integrate the web pages together in a particular way. Of course you can upload and integrate image files and text.
Finally you can use your Adsense code, or run Google Ads.

So what is the problem with Google Sites? Well, Google does not follow the links on the site. Every link you add to your web pages gets an HTML attribute called nofollow [ rel=nofollow ]. The nofollow attribute tells every search engine not to follow your link, and in Google's case not to give the page it points to any Page Rank.
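As a sketch, here's the difference between a normal link and the link Google Sites actually writes out [the URL is just an example]:

```html
<!-- A normal link: search engines follow it and pass Page Rank -->
<a href="http://www.interfacebus.com/">interfacebus</a>

<!-- What Google Sites emits: rel=nofollow tells search engines
     not to follow the link or pass any Page Rank through it -->
<a href="http://www.interfacebus.com/" rel="nofollow">interfacebus</a>
```

Both links work fine for a human visitor clicking through; the difference only matters to the search engines.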

You can use the site to generate and run a web page, but you can not use the site to promote another web site [with a search engine]. So you can start a web page there and point your visitors to another web site, but you can't point any search engine to a different site, because the links don't count.

Last year I would have advised someone just starting out to begin with Google Sites, just to get off the ground and earn a Page Rank. Now, because you can't pass a Page Rank, I would advise people to bite the bullet, buy a web address, and rent server time. It's just pointless to spend time working a web page if it doesn't help with Page Rank for any other web page generated sometime later. The best example: you want a web presence now, and six months later you come up with a business name. When you do buy that web address [your-site.com] no one points to it and it has a zero page rank [so you start from scratch again].

Now when Google Knol started they also used the nofollow attribute, for the first three months. However, because of so many complaints, they dropped it several months in. So now links from any Google Knol you write do work with search engines. Example: a Knol I started on Component Derating.

Graphic; Panel Mount LED, Red color.

Tuesday, December 22, 2009

SEO Best Practices and Tips


So first I took a look at the pages that receive the lowest page views. Most of these are new pages, so it's ok that they receive a low number of views. However some pages are much older but still receive no page views:

This one is a new page generated in October, still no Page Rank and just 11 page views?
PWB Types. It has an internal link and two external links: one from the sitemap, and one from the 'What's New blog' [Definition of PWB Terms]. There is little else I can do but wait, either for more page views or maybe a Page Rank.

This next page is more than a year old, about the Operating Temperature of a 2N2906 transistor. Looks like it was just updated last month [not that I recall]. Again no page rank, but it has multiple internal links and at least one external link. The point is that you can't make people visit your page, no matter how much work you put into it. I assume the page rank went to zero because Google knows no one ever views the page. Or the text is too similar to another page on my site.

Along the same lines, I'll also check the number of incoming external links; this related page only has 3 external links: Derating an NPN Transistor. This particular page has a ranking of one, again more than a year old. Yet one more: no page rank and only 33 page views [Derate a 2N3765].

I spent a ton of time generating these derating pages and they never generated any traffic.

So what can you do to get a page working? First, I just updated two of the pages. Second, I added a new external link to the pages via this blog posting. Finally, I generated a new way to find the pages, via the text on this page or via the graphic I attached here. Oh, you may have missed it, but I also linked to an internal page of the other blog, so that those pages don't drop to a zero page rank after a few months.

The other thing I wanted to mention is that by adding an external link to a page and increasing its page rank, any page it links to also sees an increase in page rank. This last page already has a page rank of two, but only two incoming links: Quick Connect Resistor. Now it has 3 links and is associated with the term "Quick Connect". In fact each of these links now associates the page(s) with a new term.

Graphic; Chip resistor, Thermistor, Derating Curve.

Saturday, December 19, 2009

Why do I need an XML Sitemap


I really don't know why I need an XML sitemap for the search engines, but I just generated one. The last sitemap I generated was just about a year ago; the file was always left out on my server. In fact Google would come out and read the sitemap a few times a month.

Anyway, the XML sitemap differs from the HTML sitemap I generated yesterday: the HTML file is for people, while the XML file is for the search engines.
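For reference, each page gets one entry in the XML file, following the sitemaps.org format [the URL and date below are placeholders; a generator program fills these in for you]:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.interfacebus.com/Capacitor_Terms.html</loc>
    <lastmod>2009-12-18</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <!-- ... one url entry per page, 1,600 odd in my case ... -->
</urlset>
```

Only the loc line is required; lastmod and changefreq are optional hints to the search engines.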

Just like last year I used GSiteCrawler. The program had crashed several months ago and I finally got around to taking a look at it; it turns out I just had to download it again to get it working.

So the point is, why do I need this file? The thing is, the file is 291k bytes and just burns up bandwidth off my server.

As indicated in the 'New Engineering Pages' blog, a new page covering Feed-Through Capacitors was added [to interfacebus.com] at 7 PM last night. When I checked at 5 AM today, the page had already been indexed by Google. In fact doing a search for that term returns my page at 49 out of 473,000 results. Err, I didn't need a sitemap to get a new page indexed hours after I wrote it ....

Whatever; Google indicates that 1,607 URLs were submitted [via the XML file], of which 1,481 URLs [pages] are indexed in their results. Now this differs a bit from the copy it replaced, which showed 1,859 pages submitted and 1,328 pages indexed? My computer also indicates more than 1,600 pages.

Anyhow, this week I generated a new human-readable sitemap and a new machine-readable sitemap ~ that's a good thing.

I don't get it, but I would recommend you generate an XML sitemap for your site [if it's a tad large]; many of the generators are free to use or download. You don't even have to tell Google, just add a bit of text to your robots.txt file, which all the search engines read. My robots.txt file is blank right now, but only because I didn't want the other search engines downloading a year-old file. But the command is: Sitemap: http://www.YOU.com/sitemap.xml ~ a line of text showing the location of the xml file.
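So a minimal robots.txt might look like this [swap in your own domain; the empty Disallow line means nothing is blocked]:

```
User-agent: *
Disallow:

Sitemap: http://www.YOU.com/sitemap.xml
```

The Sitemap line stands on its own; every major search engine that reads robots.txt will pick it up.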

I need a few lines of SEO link building:
2N2904 Derating Curve. Transistor
2N2906 Derating Curve. Transistor
2N6760 Derating Curve. Field Effect Transistor

The graphic above is a test circuit for testing saturated switching time of a transistor.

Monday, December 07, 2009

How to get more external page links


This sort of follows yesterday's post on page views. This time the topic is external page links. An external link is a site other than interfacebus that links to a particular page.

A page with few external pages linking in will receive a low page rank, and so will receive fewer visits. So this is a list of a few pages with only one external page link, and [in most cases] a page rank of zero. In most if not all cases the only external link pointing to the page is the sitemap for this website.
Definition of Antenna Terms. ['P']
Definition of Radar Terms.
Engineering Acronyms. [He]
Dictionary of Terms.
Capacitor Terms.
VXI Board Manufacturers.
Data Highway.
And on and on it goes. I'm not sure why so many pages now have only one external link; I'll need to keep watching this issue.

So by listing a few of these pages here, they receive another incoming external link. Maybe someday they'll even receive a page rank. Of course if they do get a page rank, then they may pass some of it on to other pages that they point to.

The graphic is Visits from Denmark, so far this year [because of the Global Warming meeting].