
# Best Practice No. 4: Improve bandwidth performance of ASP.NET sites using IIS compression

By , 4 Mar 2014

Updated with links to .NET Best Practices Nos. 1, 2, 3, 5, and a video.


## Introduction

Bandwidth performance is one of the critical requirements for every website. Today, a major cost of running a website is not hard disk space but bandwidth. Transferring the maximum amount of data over the available bandwidth therefore becomes critical. In this article we will see how we can use IIS compression to improve bandwidth performance.

Please feel free to download my free 500 questions-and-answers videos, which cover Design Patterns, UML, Function Points, Enterprise Application Blocks, OOP, SDLC, .NET, ASP.NET, SQL Server, WCF, WPF, WWF, SharePoint, LINQ, Silverlight, and .NET Best Practices, at http://www.questpond.com/.

## How does IIS compression work?

Note: All examples shown in this article use IIS 6.0. The only reason we have used IIS 6 is that IIS 7.0 is still not that common.

Before we move ahead and talk about how IIS compression works, let’s try to understand how normally IIS works. Let’s say the user requests a ‘Home.html’ page which is 100 KB in size. IIS serves this request by passing the 100 KB HTML page over the wire to the end user browser.

When compression is enabled on IIS the sequence of events changes as follows:

• The user requests a page from the IIS server. While requesting the page, the browser also tells the server which compression types it supports. Below is a simple request sent by the browser which says it supports ‘gzip’ and ‘deflate’. We have used Fiddler (http://www.fiddler2.com/fiddler2/version.asp) to capture the request data.

      GET /questpond/index.html HTTP/1.1
      Accept: image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, application/x-shockwave-flash, application/vnd.ms-excel, */*
      Accept-Language: en-us
      Accept-Encoding: gzip, deflate
      User-Agent: Mozilla/4.0
      Host: www.questpond.com
      Connection: Keep-Alive

• Depending on the compression types the browser supports, IIS compresses the data and sends it over the wire to the browser.
• The browser then decompresses the data and displays it.
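The negotiation step above can be sketched in a few lines. This Python sketch (illustrative only, and deliberately ignoring HTTP q-values) simply picks the first scheme the server supports from the browser's Accept-Encoding header:

```python
def pick_encoding(accept_encoding, server_supported=("gzip", "deflate")):
    """Pick a compression scheme based on the browser's Accept-Encoding header."""
    # Split "gzip, deflate;q=0.5" into ["gzip", "deflate"], dropping q-values.
    offered = [token.split(";")[0].strip().lower()
               for token in accept_encoding.split(",")]
    for scheme in server_supported:
        if scheme in offered:
            return scheme          # compress the response with this scheme
    return None                    # no common scheme: send the response uncompressed

print(pick_encoding("gzip, deflate"))   # the browser from the Fiddler capture above
print(pick_encoding("identity"))        # a browser that accepts no compression
```

A real server also weighs the q-values the client sends, but the basic idea is the same: compress only with a scheme both sides understand.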

## Compression fundamentals: Gzip and deflate

IIS supports two kinds of compression: gzip and deflate. The two are closely related: gzip is essentially deflate with an extra header and checksum wrapped around it. Deflate is a compression algorithm which combines LZ77 and Huffman coding. In case you are interested to read more about LZ77 and Huffman coding, you can refer to http://www.zlib.net/feldspar.html.

Below are the header details which are added around the deflate payload to form a gzip stream. It starts with a 10-byte header containing a magic number, the compression method, and a timestamp, followed by optional fields such as the original file name. After the deflate-compressed payload comes an 8-byte trailer holding a CRC-32 checksum and the uncompressed size, which lets the receiver verify that no data was lost in transmission.

Google, Yahoo!, and Amazon use gzip, so we can safely assume that it’s supported by most browsers.
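This layout is easy to verify with a small sketch (shown in Python rather than C#, purely for illustration): compress some data with gzip, then check the magic bytes, the trailer checksum, and the raw deflate payload in between:

```python
import gzip
import struct
import zlib

data = b"Sending huge data<br>" * 1000

gz = gzip.compress(data)

# 10-byte header: magic bytes 0x1f 0x8b, then method 8 (deflate)
assert gz[:3] == b"\x1f\x8b\x08"

# 8-byte trailer: CRC-32 checksum and uncompressed size, little-endian
crc, size = struct.unpack("<II", gz[-8:])
assert crc == zlib.crc32(data)
assert size == len(data) % 2**32

# the bytes in between are the raw deflate payload
payload = zlib.decompressobj(wbits=-15).decompress(gz[10:-8])
assert payload == data
print("gzip = 10-byte header + deflate payload + 8-byte trailer")
```

So a deflate response is slightly smaller on the wire, while gzip buys an integrity check for 18 extra bytes.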

## Enabling IIS compression

So far we have covered enough theory to understand IIS compression. Let’s get our hands dirty and see how we can actually enable it.

#### Step 1: Enable compression

The first step is to enable compression on IIS. Right click on Websites -> Properties and click on the Service tab. To enable compression we need to tick two checkboxes on the Service tab of the IIS website properties. The below figure shows the location of both checkboxes.

#### Step 2: Enable metabase.xml edit

Metadata for IIS comes from ‘Metabase.xml’, which is located at “%windir%\system32\inetsrv\”. For compression to work properly we need to make some changes to this XML file, and to do that we must first direct IIS to give us edit rights. So right click on your IIS server root -> Properties and tick the ‘Enable Direct Metabase Edit’ checkbox as shown in the below figure.

#### Step 4: Set the compression level and extension types

The next step is to set the compression levels and extension types. The compression level can be set from 0 to 10, where 0 specifies mild compression and 10 specifies the highest level of compression. This value is specified using the HcDynamicCompressionLevel property. Because there are two compression schemes, ‘deflate’ and ‘gzip’, the property needs to be set on both scheme entries, as shown in the below figures.

We also need to specify which file types should be compressed; HcScriptFileExtensions lets us do that. For the current scenario we specified that ASPX output should be compressed before it is sent to the browser.
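If you prefer scripting the change over hand-editing Metabase.xml, the same metabase properties can be set with the adsutil.vbs admin script that ships with IIS 6 (normally found in %systemdrive%\Inetpub\AdminScripts). The commands below are a sketch; the level and extension list are only examples, not recommendations:

```shell
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcDynamicCompressionLevel 4
cscript adsutil.vbs set W3SVC/Filters/Compression/deflate/HcDynamicCompressionLevel 4
cscript adsutil.vbs set W3SVC/Filters/Compression/gzip/HcScriptFileExtensions "asp" "aspx"
cscript adsutil.vbs set W3SVC/Filters/Compression/deflate/HcScriptFileExtensions "asp" "aspx"
```

Run iisreset afterwards so the new values take effect.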

#### Step 5: Does it really work?

Once you are done with the above steps, it’s time to see if the compression really works. So we will create a simple C# ASP.NET page which will loop 10,000 times and send some output to the browser.

    protected void Page_Load(object sender, EventArgs e)
    {
        // Write a large, repetitive response so the effect of compression is easy to measure.
        for (int i = 0; i < 10000; i++)
        {
            Response.Write("Sending huge data" + "<br>");
        }
    }

In order to see the difference between before and after compression, we will run the Fiddler tool while we request our ASP.NET loop page. You can download Fiddler from http://www.fiddler2.com/fiddler2/version.asp.

The below screen shows data captured by fiddler without compression and with compression. Without compression data is “80501 bytes” and with compression it comes to “629 bytes”. I am sure that’s a great performance increase from a bandwidth point of view.

## 0, 1, 2, 3, 4…10 IIS compression levels

In our previous section we set HcDynamicCompressionLevel to a value of 4. The higher the compression level, the smaller the data that goes over the wire; the downside is that higher levels consume more CPU. One of the big challenges is to figure out the optimum compression level. This depends on many things: the type of data, the load, etc. In the following sections we will try to derive the best compression level for different scenarios.
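The trade-off is easy to demonstrate with any deflate implementation. The Python sketch below (zlib exposes deflate levels 0-9 rather than IIS's 0-10, so this only illustrates the principle) compresses the same payload at several levels and compares output size and time taken:

```python
import time
import zlib

# Repetitive payload, similar to the ASP.NET loop page used in this article.
data = b"Sending huge data<br>" * 10000

for level in (1, 4, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):6d} bytes, {elapsed * 1000:.2f} ms")

# For this payload the highest level is at least as small as the lowest,
# but the savings flatten out while CPU time keeps growing.
assert len(zlib.compress(data, 9)) <= len(zlib.compress(data, 1))
```

The exact numbers depend on the data, which is precisely why the sections below measure real file sizes and CPU load instead of assuming one level fits all.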

## Three point consideration for IIS compression optimization

Many developers just enable IIS compression with the default values shown below. But the defaults do not hold good for every environment; the right settings depend on many factors, such as the type of content your site serves. If your site has only static HTML pages, the compression levels will be different from a site serving mostly dynamic pages.

| Compression option | File type | Default configuration |
|--------------------|-----------|-----------------------|
| File types compressed | Static | .txt, .htm, and .html |
| File types compressed | Dynamic | |
| Compression schemes | Static | Both gzip and deflate |
| Compression schemes | Dynamic | Both gzip and deflate |
| Compression level | Static | 10 |
| Compression level | Dynamic | 0 |

If your site is only serving compressed data like ‘JPEG’ and ‘PDF’, it’s probably not advisable to enable compression at all as your CPU utilization increases considerably for small compression gains. On the other side we also need to balance compression with CPU utilization. The more we increase the compression levels the more CPU resources will be utilized.

Different data types need to be set to different IIS compression levels for optimization. In the following section we will take different data types, analyze them with different compression levels, and see how CPU utilization is affected. The below figure shows different data types with some examples of file types.

## Static data compression

Let’s start with the easiest one: static content types like HTML and HTM. If a user requests a static page from IIS with compression enabled, IIS compresses the file and puts it in the ‘%windir%\IIS Temporary Compressed Files’ directory.

Below is a simple screen which shows the compressed folder snapshot. Compression happens only for the first time. On subsequent calls, the same compressed data is picked from the compressed files directory.

Below are some sample readings we have taken for HTML files of size ranging from 100 KB to 2048 KB. We have set the compression level to ‘0’.

| Actual size (KB) | Compressed size (KB) |
|------------------|----------------------|
| 100 | 24 |
| 200 | 25 |
| 300 | 27 |
| 1024 | 32 |
| 2048 | 41 |

*Compression level set to ‘0’*

You can easily see that even with the lowest compression level, a 100 KB file shrinks to roughly a quarter of its size, and larger files compress even further.

As the compression happens only the first time, we can happily set the compression level to ‘10’. The first request will show high CPU utilization, but on subsequent calls the CPU cost is negligible compared to the compression gains.
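The compress-once behaviour can be sketched as a simple cache. This Python sketch is illustrative only (IIS's real temporary-file cache also handles invalidation and disk storage): the first request pays the compression cost, every later request is served from the cache:

```python
import zlib

class StaticCompressionCache:
    """Compress each static file once and serve the cached bytes afterwards."""

    def __init__(self, level=9):
        self.level = level
        self.cache = {}           # path -> compressed bytes
        self.compressions = 0     # how many times we actually compressed

    def get(self, path, read_file):
        if path not in self.cache:
            self.cache[path] = zlib.compress(read_file(path), self.level)
            self.compressions += 1
        return self.cache[path]

cache = StaticCompressionCache()
fake_disk = {"/home.html": b"<html>" + b"Sending huge data<br>" * 1000 + b"</html>"}

for _ in range(5):                # five requests for the same page
    body = cache.get("/home.html", fake_disk.__getitem__)

print("compressions performed:", cache.compressions)
```

Because the expensive step runs once per file, the highest compression level is affordable for static content, exactly as argued above.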

## Dynamic data compression

Dynamic data compression is a bit different from static compression. Dynamic compression happens every time a page is requested. We need a balance between CPU utilization and compression levels.

In order to find the optimal compression level, we did a small experiment as shown below. We took five files ranging from 100 KB to 2 MB, then changed the compression level from 0 to 10 for every file size to check how much the data was compressed. Below are the compressed data readings in bytes.

| Compression level | 100 KB | 200 KB | 300 KB | 1 MB | 2 MB |
|-------------------|--------|--------|--------|------|------|
| 0 | 32,774 | 35,496 | 37,699 | 52,787 | 109,382 |
| 1 | 30,224 | 32,300 | 34,104 | 46,328 | 92,813 |
| 2 | 29,160 | 31,004 | 32,673 | 43,887 | 87,033 |
| 3 | 28,234 | 29,944 | 31,628 | 42,229 | 83,831 |
| 4 | 26,404 | 27,655 | 29,044 | 34,632 | 44,155 |
| 5 | 25,727 | 26,993 | 28,488 | 33,678 | 42,395 |
| 6 | 25,372 | 26,620 | 28,488 | 33,448 | 41,726 |
| 7 | 25,340 | 26,571 | 28,242 | 33,432 | 41,678 |
| 8 | 25,326 | 26,557 | 28,235 | 33,434 | 41,489 |
| 9 | 24,826 | 26,557 | 28,235 | 33,426 | 41,490 |
| 10 | 24,552 | 25,764 | 27,397 | 32,711 | 42,610 |

*Compressed size in bytes for each file size and compression level*

The raw readings above do not show anything specific; it's a bit messy. So we plotted the below graph using the above data and hit the sweet spot: you can see that increasing the compression level beyond 4 has almost no effect on the compressed size. We repeated the experiment on two or three different environments and it always pointed to ‘4’ as the sweet spot.

The conclusion we draw from this is that a compression level of ‘4’ is the optimal setting for dynamic data pages.

## Compressed file and compression

Compressed files are files which are already compressed; for example, JPEG and PDF files. So we did a small test taking JPEG files, and below are our readings. After applying IIS compression, the files did not change much in size.

| Actual compressed file size (KB) | File size after IIS compression (KB) |
|----------------------------------|---------------------------------------|
| 100 | 102 |
| 220 | 210 |
| 300 | 250 |
| 1024 | 980 |
| 2048 | 1987 |

When we plot a graph you will see that the compression benefits are very small. We may end up utilizing more CPU processor resource and gaining nothing in terms of compression.

So the conclusion we can draw for compressed files is that we can disable compression for already compressed file types like JPEG and PDF.
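A quick sketch (Python, for illustration) shows why: already-compressed bytes look almost random to deflate, so recompressing them buys nothing and can even enlarge them slightly, while plain text shrinks dramatically:

```python
import os
import zlib

text = b"Sending huge data<br>" * 5000   # repetitive, like dynamic HTML output
binary = os.urandom(100_000)             # stand-in for already-compressed JPEG/PDF bytes

print("text:  ", len(text), "->", len(zlib.compress(text, 9)), "bytes")
print("binary:", len(binary), "->", len(zlib.compress(binary, 9)), "bytes")

# Text shrinks dramatically; the incompressible data does not shrink at all
# (deflate falls back to stored blocks, which add a little overhead).
assert len(zlib.compress(text, 9)) < len(text) // 10
assert len(zlib.compress(binary, 9)) >= len(binary)
```

So for JPEG, PDF, and similar content, compression burns CPU at the highest level for zero gain on the wire.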

## CPU usage, dynamic compression, and load testing

One of the important points to remember for dynamic data is to optimize between CPU utilization, compression levels, and the load on the server.

We used WCAT to do a stress test with 100 concurrent users. For file sizes ranging from 100 KB to 2 MB we recorded CPU utilization at every compression level. We recorded the processor time of the w3wp.exe process using a performance counter: add the Process -> % Processor Time counter and select w3wp from the instances.

| Compression level | 100 KB | 200 KB | 300 KB | 1 MB | 2 MB |
|-------------------|--------|--------|--------|------|------|
| 0 | 1.56 | 1.56 | 1.56 | 4.69 | 4.00 |
| 1 | 1.56 | 1.56 | 1.56 | 4.69 | 4.00 |
| 2 | 1.56 | 1.56 | 1.56 | 4.69 | 4.30 |
| 3 | 1.56 | 1.56 | 1.56 | 4.69 | 4.63 |
| 4 | 1.56 | 1.56 | 1.56 | 4.69 | 6.25 |
| 5 | 3.00 | 2.00 | 1.56 | 4.69 | 7.81 |
| 6 | 3.45 | 2.40 | 3.13 | 4.69 | 9.38 |
| 7 | 4.00 | 3.00 | 5.00 | 6.25 | 43.75 |
| 8 | 5.60 | 3.50 | 8.00 | 15.62 | 68.75 |
| 9 | 6.00 | 5.00 | 9.00 | 25.00 | 87.50 |
| 10 | 7.81 | 6.25 | 10.94 | 37.50 | 98.43 |

*CPU utilization of w3wp.exe for each file size and compression level*

If we plot a graph using the above data we hit a sweet spot of 6: up to IIS compression level 6, CPU utilization is not significantly affected.

## TTFB and Compression levels

TTFB, or time to first byte, is the number of milliseconds that pass before the first byte of the response is received. We performed a small experiment on 1 MB and 2 MB dynamic pages with different compression levels, measuring the TTFB for every combination of compression level and file size. WCAT was used to measure TTFB.

| Compression level | 1 MB | 2 MB |
|-------------------|------|------|
| 0 | 8.00 | 9.00 |
| 1 | 8.00 | 9.00 |
| 2 | 8.00 | 9.00 |
| 3 | 8.00 | 9.00 |
| 4 | 8.00 | 9.00 |
| 5 | 9.00 | 9.00 |
| 6 | 12.00 | 17.00 |
| 7 | 16.00 | 18.00 |
| 8 | 19.00 | 19.00 |
| 9 | 22.00 | 37.00 |
| 10 | 29.00 | 47.00 |

*TTFB in milliseconds for 1 MB and 2 MB pages at each compression level*

When we plot the above data, we get ‘5’ as the sweet spot: up to level 5, TTFB remains essentially constant.

Screenshot of the WCAT output for the TTFB measurement:

## IIS 7.0 and CPU roll off

All the above experiments and conclusions were drawn on IIS 6.0. IIS 7.0 adds a very important feature: CPU roll-off. CPU roll-off acts like a cut-off gateway so that compression cannot consume unlimited CPU resources.

When CPU usage goes beyond a certain level, IIS stops compressing pages, and when it drops below a different level, compression starts again. This is controlled using the staticCompressionDisableCpuUsage/staticCompressionEnableCpuUsage and dynamicCompressionDisableCpuUsage/dynamicCompressionEnableCpuUsage attributes. It’s like a safety valve so that CPU usage does not spike by surprise.
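In IIS 7.0 these thresholds live on the `<httpCompression>` element in applicationHost.config. A fragment along the following lines sets the roll-off points (the threshold values here are purely illustrative, not recommendations):

```xml
<system.webServer>
  <httpCompression
      staticCompressionEnableCpuUsage="50"
      staticCompressionDisableCpuUsage="90"
      dynamicCompressionEnableCpuUsage="50"
      dynamicCompressionDisableCpuUsage="90" />
</system.webServer>
```

With these example values, compression pauses when CPU climbs past 90% and resumes once it falls back below 50%.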

## Conclusion

• If the files are already compressed, do not enable compression on those files. We can safely disable compression for EXE, JPEG, PDF, etc.
• For static pages, the compression level can be set to 10, as the compression happens only once.
• For dynamic pages, the compression level can range from ‘4’ to ‘6’ depending on the server environment and configuration. The best way to judge which compression level suits best is to run the TTFB, CPU utilization, and compression tests explained in this article.

If you want to do a sanity check please refer to this article: http://weblogs.asp.net/owscott/archive/2009/02/22/iis-7-compression-good-bad-how-much.aspx. I agree my results do not match exactly with Scott's but I think we are very much on the same page.

## Some known issues on IIS compression

Below are some known issues of IIS compression:

## Thanks, Thanks, and Thanks

Architect, http://www.questpond.com, India

I am a Microsoft MVP for ASP/ASP.NET and currently the CEO of a small e-learning company in India. We are very active in making training videos, writing books, and delivering corporate training. Do visit my site for .NET, C#, design patterns, WCF, Silverlight, LINQ, ASP.NET, ADO.NET, SharePoint, UML, and SQL Server training.
