I can't remember if I've blogged about this or not, but I really regret converting my pages to XHTML. It increased the size of every page, increasing the download time and really hitting my bandwidth.
The so-called 'strict' coding approach is BS ~ all those extra spaces that got added. On many pages the download time is doubled if you're on a modem. Who came up with this idea?
Anyway, the page still indicates it complies:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
But as I edit a page, I'm deleting some of those spaces ~ 1 byte per space. Maybe that saves a few hundred bytes per page [not removing all spaces, just a certain type]. With 130,000 visits x 2 page views x 300 bytes, that's 78,000,000 bytes [78MB/month].
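For anyone curious, here's a rough sketch of the kind of cleanup I mean ~ a little Python script that strips trailing spaces from every page in a folder. The glob pattern and the "certain type" of space [trailing whitespace] are just illustrative, so adjust for your own site:

import pathlib
import re

# Strip trailing spaces/tabs from each line of every page ~ 1 byte saved per space
for page in pathlib.Path(".").glob("*.html"):
    text = page.read_text()
    slimmed = re.sub(r"[ \t]+$", "", text, flags=re.MULTILINE)
    if slimmed != text:
        page.write_text(slimmed)

Run it from the site's root folder; it only rewrites a file when something actually changed.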
No wonder the internet is slow ~ it was conceived to be so.
1 comment:
I've also been removing this meta tag:
<meta name="robots" content="index,follow" />
It's redundant ~ index,follow is what crawlers assume by default, and the robots.txt file controls the entire site anyway.
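For example ~ and this is just a sketch, your own robots.txt will differ ~ a site-wide file that lets every crawler in is only two lines:

User-agent: *
Disallow:

An empty Disallow means nothing is blocked, so the whole site stays crawlable, which is exactly what that per-page meta tag was saying anyway.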