GZip – Saving the Internet, One Kilobyte at a Time

Entropic Memes runs on WordPress (behind a Squid cache), with a lot of performance enhancements bolted on. One of the more overlooked of these is the lowly, humble Gzip compression provided through PHP – in my case, it’s enabled server-wide. It can reduce the actual amount of data transferred by a remarkable amount, saving not only bandwidth but time as well.
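For anyone wanting to do the same, here’s roughly what that looks like. This is a minimal sketch, assuming PHP’s zlib extension is available; the compression level of 6 is just a sensible middle-of-the-road choice:

; php.ini – compress everything PHP serves (requires the zlib extension)
zlib.output_compression = On
zlib.output_compression_level = 6

# or per-directory, in .htaccess, if your host allows php_flag overrides:
php_flag zlib.output_compression on

Handily, PHP only sends the compressed version to clients that advertise gzip support in their Accept-Encoding request header, so older browsers still get plain, uncompressed HTML.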

While looking at the server logs this afternoon, three successive requests for the same page caught my eye:

www.slugsite.com - - [11/Dec/2007:12:12:42 -0600] "GET /archives/656 HTTP/1.0" 200 4169 "http://blog.wired.com/defense/2007/12/most-awesomel-6.html" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11"

www.slugsite.com - - [11/Dec/2007:12:13:21 -0600] "GET /archives/656 HTTP/1.0" 200 14650 "http://blog.wired.com/defense/2007/12/most-awesomel-6.html" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; InfoPath.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)"

www.slugsite.com - - [11/Dec/2007:12:15:30 -0600] "GET /archives/656 HTTP/1.0" 200 4169 "http://blog.wired.com/defense/2007/12/most-awesomel-6.html" "Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US; rv:1.8.0.4) Gecko/20060508 Firefox/1.5.0.4"

Squid caches all the “static” elements of this website – the images, even the CSS – so the only thing that gets passed back to the Apache webserver is the request for the page itself, which is served up from the prerendered WordPress cache, thanks to WP-Cache2. (All the files and images in the content of posts are hosted on another server, elsewhere.) The first and third requests for this page were from people using Firefox; the second was from an Internet Explorer user, whose copy of the page was 14650 bytes, as opposed to just 4169 bytes for the Firefox users. The difference? The Firefox requests evidently advertised gzip support via their Accept-Encoding headers, so they got the compressed version; the IE request, apparently, did not. Saving ten kilobytes may not look impressive – but the gzipped HTML is just 28% of the size of the original.
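If you want to check how well your own pages compress, a couple of lines of PHP will do it. A minimal sketch, assuming the zlib extension is loaded and allow_url_fopen is enabled – the URL here is just this post’s, as an example:

<?php
// Fetch a page and compare raw vs. gzipped sizes (illustrative only).
$html    = file_get_contents('http://www.slugsite.com/archives/656');
$gzipped = gzencode($html, 6); // level 6 is a typical default
printf("original: %d bytes\n", strlen($html));
printf("gzipped:  %d bytes (%.0f%% of original)\n",
       strlen($gzipped), 100 * strlen($gzipped) / strlen($html));
?>

Plugging in the numbers from the logs above: 4169 / 14650 ≈ 0.28, hence the 28% figure.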

Obviously, the benefit is less dramatic once you factor in all the images and CSS that get downloaded as well. However, Gzip also compresses my syndication feeds – by a whopping 75%, on average – and when you’re talking about shaving 60KB or more off of something that gets requested several hundred times a day, those kilobytes begin to add up. Syndication is supposed to be the future of the internet – the whole point is to get the content, quickly and easily, without all the pesky other stuff like images and ads, after all. Not coincidentally, the text content of most blogs and websites compresses beautifully. Most feed readers, and most of the “big” crawlers and syndication services – including Technorati – support Gzip, but a handful still don’t, which I think is silly; supporting it takes almost no effort, as the sketch below shows.
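Here’s a hedged sketch of the relevant part of a compression-aware feed fetcher, using PHP’s curl extension – the feed URL is just an example:

<?php
// Fetch a feed with compression; libcurl handles the decompression.
$ch = curl_init('http://www.slugsite.com/feed/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, 'gzip'); // sends Accept-Encoding: gzip
$feed = curl_exec($ch);
curl_close($ch);
?>

One curl_setopt() call is all it takes, and tens of kilobytes per fetch simply evaporate.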

I mean, come on – you’re saving 21KB by not downloading the images on a page of this website – which get cached, so you only have to download them once a session – and then you throw away three times that amount by not supporting compression? I don’t begrudge them their unnecessary traffic, but, sheesh, get with the 21st century already, people. Can’t you feel the overpowering disapproval of the BPS? Every bit as bad are the people whose websites don’t support compression. What, you think bandwidth grows on trees? There are starving file-sharers in Tajikistan who’d love to make use of all those bytes you’re needlessly throwing away. Enable Gzip already, or ask your host to. Do it for me, or do it for America, or do it for the Blogosphere. Heck, be a selfless bastard and do it for the kids in Tajikistan. Like Nike says: just do it!

Published in: Geekiness, General, Meta | on December 11th, 2007