This article will discuss performance differences between a LINQ loop and a regular For loop, as a way to practice using Visual Studio 2008, LINQ, and unit tests for the first time.
LINQ (Language Integrated Query) is Microsoft's new addition to the .NET languages and allows formulating queries in an SQL-like syntax. It's particularly useful when traversing data sets, XML DOM trees and collections.
I first came across Visual Studio 2008, and LINQ in particular, during this year's Tech Ed at Orlando.
A nice developer from Microsoft described and demoed it for me (at that time, it was only available for VB.NET, but a C# version came out with beta 2). That developer insisted that LINQ is not just for data sets or complex objects, but can also be used for simple loops.
I decided to put his theory to the test, as a way to learn the new technology: I wrote a simple program that searches for odd numbers in an array, and compared the time it took a regular loop with the time it took a LINQ loop to come up with the right answers.
While this article was written several months ago, I waited for the RTM versions of VS 2008 and .NET 3.5 to arrive before publishing it.
Along the way, I got to learn LINQ more deeply and to use VS 2008's unit testing capabilities.
As the project grew, I've added a third loop (ForEach) to the mix. I then decided to output it all to a CSV file and analyze the results in Excel.
- The program allocates an array with n elements and fills it with numbers
- It then calls GetAverage on each of the 3 functions; GetAverage calls each function 1000 times (configurable)
- It measures the time it takes a function to go through n elements and calculates an average. Measurements are derived using Daniel Strigl's High Performance System Timer
- Averages for n elements are displayed (or written to a file) for each function
- The entire code runs 5 times, to ensure average consistency
Using the Code
This is a bare-bones application. It runs as a console application and has no UI.
The only configurable parts in it are:

- numElements - how many elements in the array
- numIterations - how many times each algorithm is called, to calculate an average
- Application output can go to a file or to the standard output (screen) - just comment out the right lines
As you can see, all it does is call the GetAverage function, passing the algorithm function as a parameter.
GetAverage looks like this:
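The original listing isn't shown here, so what follows is a minimal sketch of the routine as described. I use System.Diagnostics.Stopwatch as a stand-in for the article's high-performance timer, and a Func<int[], int> parameter for the algorithm; both are assumptions, not the article's exact code.

```csharp
using System;
using System.Diagnostics;

class Measure
{
    const int numIterations = 1000;  // configurable, as noted above

    // f is the search algorithm under test; data is the pre-filled array.
    static double GetAverage(Func<int[], int> f, int[] data)
    {
        // Stopwatch is a stand-in for the high-performance timer used in the article.
        Stopwatch timer = new Stopwatch();
        for (int i = 0; i < numIterations; i++)
        {
            timer.Start();
            f(data);          // run the algorithm once
            timer.Stop();     // Stopwatch accumulates elapsed time across Start/Stop pairs
        }
        return timer.Elapsed.TotalSeconds / numIterations;
    }

    static void Main()
    {
        int[] data = { 1, 2, 3, 4, 5 };
        // Any function matching the delegate can be passed by name (or as a lambda):
        Console.WriteLine("{0:E3} s", GetAverage(arr => arr.Length, data));
    }
}
```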
As you can see, it's not too complicated: it starts a timer, calls function f(), stops the timer and accumulates the elapsed time. It does so numIterations times and returns the average. I really liked the idea of passing a function name as a parameter, as it abstracted the design and will let me build on it in the future.
Essentially, all 3 functions use simple O(n) search algorithms. GetOdd is the most straightforward:
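The listing isn't shown, so here is a sketch of the straightforward for-loop version, together with the foreach variant mentioned earlier; the exact signatures are assumptions based on the description.

```csharp
class Search
{
    // Plain for loop: count the odd numbers in the array.
    static int GetOdd(int[] numbers)
    {
        int count = 0;
        for (int i = 0; i < numbers.Length; i++)
            if (numbers[i] % 2 != 0)
                count++;
        return count;
    }

    // The ForEach variant added later walks the same array with foreach.
    static int GetOddForEach(int[] numbers)
    {
        int count = 0;
        foreach (int n in numbers)
            if (n % 2 != 0)
                count++;
        return count;
    }

    static void Main()
    {
        System.Console.WriteLine(GetOdd(new[] { 1, 2, 3, 4 }));       // prints 2
        System.Console.WriteLine(GetOddForEach(new[] { 1, 2, 3, 4 })); // prints 2
    }
}
```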
and finally, GetOddLinq, using the new LINQ syntax:
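Again the listing isn't shown; a sketch of the LINQ version might look like this. The query shape is an assumption, and counting the results by enumerating them is a guess at the original approach (this is the part later revised following a reader's suggestion).

```csharp
using System;
using System.Linq;

class SearchLinq
{
    // LINQ version: the query expression selects the odd numbers;
    // enumerating the results forces the (deferred) query to actually run.
    static int GetOddLinq(int[] numbers)
    {
        var odds = from n in numbers
                   where n % 2 != 0
                   select n;

        int count = 0;
        foreach (int n in odds)   // walk the query results and count them
            count++;
        return count;
    }

    static void Main()
    {
        Console.WriteLine(GetOddLinq(new[] { 1, 2, 3, 4, 5 }));  // prints 3
    }
}
```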
You first notice the new query keywords. The clause order is different from SQL (the select comes at the end), but it's still readable.
For more on LINQ's syntax and samples, try the official LINQ project page.
I've run this program on several computers and VMs. I've tried it on Windows XP, Vista and 2008 RC1. I tried running it on a busy machine and on a completely idle machine. Finally, I've tested debug and release versions. The numbers may change, but the trend remains the same:
Measurements are in seconds. Column E shows the percentage of time added by LINQ compared to For: Ei = (Di - Bi)/Di.
Of course, once you have the raw data, you can analyze it however you want, such as generate a graph:
Note: as mentioned, results have been pretty consistent, with LINQ showing 75-85% overhead in almost every test. In debug builds, however, LINQ took even longer to complete the task, while For and ForEach remained essentially the same.
My only guess is that LINQ has some instrumentation built into it, to allow for easier debugging — thus it's slower in debug builds.
A huge chunk of the Tech Ed sessions was dedicated to testing and, in particular, to how easy it is to add unit tests in VS 2008. And indeed, it didn't take long. Right click anywhere in the source and select "Create Unit Tests...". A wizard will take you through selecting the functions you want to test in your project and will eventually create a test project and add it to the solution.
The test project comes ready with the right references and a set of accessors — allowing the unit test functions access to all members of the original class, even the private ones.
So how do you test this code? Here's the unit test for the function that creates the array:
[DeploymentItem("LinqTest.exe")]
Pretty simple, isn't it? Essentially, you are using the Program_Accessor to gain access to the LinqTest.exe assembly. Upon calling the FillArray function for n elements, you assert that the size of the array should now be n.
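Since the listing isn't shown, here is a sketch of what such a test might look like. Program_Accessor and FillArray come from the description above; the signature of FillArray and the array field name (numbers) are assumptions.

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ProgramTest
{
    [TestMethod]
    [DeploymentItem("LinqTest.exe")]
    public void FillArrayTest()
    {
        int n = 100;                    // any array size will do
        Program_Accessor.FillArray(n);  // accessor reaches the private method (signature assumed)

        // After filling, the array should hold exactly n elements.
        Assert.AreEqual(n, Program_Accessor.numbers.Length);  // field name assumed
    }
}
```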
Now, let's test one of the search functions (the tests for all are the same — a unit test does not care about the internal logic of the function, just about the results).
[DeploymentItem("LinqTest.exe")]
Here I cheated. Knowing that my array will be filled with consecutive numbers, I know that any 2 adjacent cells I pick will contain exactly 1 odd number. So, we build a 2-cell array, fill it, and compare the number of odd numbers returned from GetOddLinqTest with the expected result. In a variant of the program where the array is filled with random numbers, you'd have to change this function to get the right expected result.
Note: random numbers will not change the measurement results, as we always have to scan the entire array.
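A sketch of that test, under the same assumptions as before (the accessor exposing GetOddLinq, and a 2-cell array of consecutive numbers):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class SearchTest
{
    [TestMethod]
    [DeploymentItem("LinqTest.exe")]
    public void GetOddLinqTest()
    {
        // Any two adjacent consecutive numbers contain exactly one odd number.
        int[] data = { 10, 11 };
        int expected = 1;

        int actual = Program_Accessor.GetOddLinq(data);  // accessor name assumed
        Assert.AreEqual(expected, actual);
    }
}
```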
Now, run all the unit tests prior to building the solution (or press Ctrl+R, A) and, hopefully, you'll see all green:
Version 1.00 released on 12/5/2007
Version 1.01 released on 12/14/2007
Following the suggestions received in the comments, 2 corrections were implemented to improve measurement accuracy:
- Per Dennis Dollfus's suggestion, the LINQ function now looks like this:
- Per kckn4fun's suggestion, results are accumulated into a StringBuilder and only written to the file/console at the end. The new Main function looks like this:
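Neither revised listing is shown; here is a sketch of what both changes might look like. The exact query shape and the contents of Main are assumptions based on the two descriptions above.

```csharp
using System;
using System.Linq;
using System.Text;

class Program
{
    // Revised LINQ function: Count() enumerates the query directly,
    // with no intermediate counting loop.
    static int GetOddLinq(int[] numbers)
    {
        return (from n in numbers
                where n % 2 != 0
                select n).Count();
    }

    static void Main()
    {
        int[] numbers = Enumerable.Range(0, 1000).ToArray();

        // Results accumulate in a StringBuilder and are written out only
        // once at the end, so I/O no longer skews the timings.
        StringBuilder results = new StringBuilder();
        results.AppendLine("LINQ found " + GetOddLinq(numbers) + " odd numbers");
        Console.Write(results.ToString());
    }
}
```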
The new results look like this:
As you can see, performance is slightly better — but the trend remains.
This program is, by no means, a thorough analysis of LINQ's general performance. I'm sure its behavior in traversing complex data sets and XML DOM trees is much better. I never set out to prove anything, just play a little bit with the new environment.
Feel free to use the program and its results however you choose. The way I designed it, it's easier to plug in more complex logic and still get measurements.