Using GsiteCrawler, I'm generating a new site map. I have also updated the on-site site map, which is actually hosted on my Google Pages site to save bandwidth; that also gives me an external site pointing to my pages, which helps with page rank.
The local site map is there to help people find pages and to show the structure of the page links on the site. When this map is generated it comes with a large number of duplicate page listings. Tonight I removed a few duplicate listings and added links to the pages that have been created within the last few weeks.
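Removing duplicate listings by hand gets tedious; a small script can strip them automatically. Here's a minimal sketch that de-duplicates `<loc>` entries in a standard sitemaps.org XML file (the function name and workflow are my own, not part of GsiteCrawler):

```python
# Minimal sketch: remove duplicate <loc> entries from a sitemap XML string.
# Assumes the standard sitemaps.org namespace; keeps the first occurrence
# of each URL and drops later repeats.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def dedupe_sitemap(xml_text):
    ET.register_namespace("", NS)          # keep output namespace-prefix-free
    root = ET.fromstring(xml_text)
    seen = set()
    for url in list(root):                 # copy list: we mutate root below
        loc = url.find(f"{{{NS}}}loc").text.strip()
        if loc in seen:
            root.remove(url)               # duplicate listing: drop it
        else:
            seen.add(loc)
    return ET.tostring(root, encoding="unicode")
```

Run the generated sitemap through this before uploading and the duplicates are gone in one pass.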
The Google site map, on the other hand, is used to inform Google of the pages that exist on the site. For Google's version, duplication does not matter ~ all that matters is that the spider sees a list of all the pages on my server.
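For reference, a Google site map is just an XML list of URLs in the sitemaps.org format. A minimal example (the URLs here are placeholders, not my actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/page.html</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; tags like `<lastmod>` are optional hints to the spider.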
I think it worked. I'm uploading the site map now, then clicking the button to tell Google it has a new site map.
Recommendations: always have an on-site site map. This helps pages that are linked more than four levels down from the home page; with a site map linked off the home page, all pages are effectively two levels down regardless of where they really sit. A site map hosted off-site [like mine] also makes it appear that an external page points to all your pages, which helps with page rank. Being off my server also helps with download speed, since the site map, which is still over 200 KB, is on another server.
2 comments:
Looks like it worked; Google indicates 1570 URLs submitted,
indexed URLs 1223. Hmm