Running even a moderately successful website can hit your wallet pretty quickly: bandwidth is often a significant cost factor in the price of hosting a site. You can limit that bandwidth by using one of a few compression standards. I was looking for a way to compress my ASP.NET pages when I found Ben Lowery's module, which can be found here. However, his implementation never worked properly for me, neither in .NET 1.1 nor in .NET 2.0.
I downloaded his code and read through it; it was written entirely in .NET 1.1 style and still looked like it was built around #ziplib. So, I rewrote his version into something that works properly and also detects the path correctly. In my case, whenever I tried to use his library, the output of my pages was simply empty.
Using the code
Using the code is pretty easy. Once you've compiled the HttpCompress module, you can add a reference to it from any .NET 2.0 site. Next, set a few parameters in web.config and you're good to go. If you want to verify that the content is really being compressed, you will need something that logs the HTTP pipeline. I use Nikhil Kothari's WebDevHelper BHO for this; an alternative is Fiddler.
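As a sketch, registering the module in web.config might look like the snippet below. The type and assembly names here are my own assumptions for illustration, not taken from the article's download; adjust them to whatever your compiled HttpCompress build actually exposes.

```xml
<configuration>
  <system.web>
    <httpModules>
      <!-- Assumed type/assembly names; change to match your compiled module. -->
      <add name="CompressionModule"
           type="HttpCompress.CompressionModule, HttpCompress" />
    </httpModules>
  </system.web>
</configuration>
```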
The first things you can configure are the paths and MIME types that should be excluded from compression. I, for one, do not compress images or streaming video, because those formats are already heavily compressed and you won't gain anything significant by compressing them again. You are more likely to burn server resources for a negligible saving.
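A configuration section for those exclusions could look something like this. The element and attribute names are hypothetical, chosen for illustration; the actual schema depends on how the configuration handler in the download is written.

```xml
<!-- Hypothetical exclusion settings; element names are illustrative only. -->
<httpCompress preferredAlgorithm="gzip" compressionLevel="high">
  <excludedMimeTypes>
    <add type="image/jpeg" />
    <add type="video/mpeg" />
  </excludedMimeTypes>
  <excludedPaths>
    <add path="NoCompress.aspx" />
  </excludedPaths>
</httpCompress>
```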
In the module, you need to hook up an event handler to PostReleaseRequestState. I originally hooked into the BeginRequest event, but that fires far too early in the pipeline for the module to do its work. By the time PostReleaseRequestState fires, the whole page has executed and its response content has been generated.
public void Init(HttpApplication context)
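For context, here is a minimal self-contained sketch of how such a module might be wired up. The class and handler names (CompressionModule, OnPostReleaseRequestState) are my own illustrative choices rather than the actual names in the downloadable source; the idea is simply to wrap Response.Filter in a GZipStream when the client advertises gzip support.

```csharp
using System;
using System.IO.Compression;
using System.Web;

// Illustrative sketch only -- names are hypothetical, not the module's actual source.
public class CompressionModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // Hook up late in the pipeline, after the page has produced its output.
        // BeginRequest would fire too early for the filter to see any content.
        context.PostReleaseRequestState += new EventHandler(OnPostReleaseRequestState);
    }

    private void OnPostReleaseRequestState(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        string acceptEncoding = app.Request.Headers["Accept-Encoding"];

        // Only compress when the client explicitly supports gzip.
        if (acceptEncoding != null && acceptEncoding.Contains("gzip"))
        {
            app.Response.Filter =
                new GZipStream(app.Response.Filter, CompressionMode.Compress);
            app.Response.AppendHeader("Content-Encoding", "gzip");
        }
    }

    public void Dispose() { }
}
```

A real implementation would also check the exclusion lists from web.config before installing the filter, which is where the excluded paths and MIME types described above come into play.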
History
- 07/01/2006: v1.0 - Added the code to CodeProject and wrote this article.