
Performance comparisons: LinQ to SQL, ADO, C#

25 May 2008
A comparison of existing approaches to accessing data (arrays, tables, etc.) with the new ones released with C# 3.0 and LinQ.

Introduction

Most of you, like me, have been watching the release of VS 2008 since its beta versions. From the perspective of this article, the interesting additions are generics, the new query operators and LinQ support. I was always wondering why LinQ was added when the world has been comfortable with ADO programming, with its support extended to objects and XML too, and similarly with the use of generics, aggregation and so on in C# 3.0. With this came the purpose of this article: to compare the existing approaches with the new ones released with C# 3.0 and LinQ. As the scope of such a comparison is quite large, I will restrict the discussion to

· Performance of reading data from SQL Server 2005 using ADO.NET and LinQ with a stored procedure

· Performance of reading data from SQL Server 2005 using ADO.NET and LinQ by directly executing SQL statements

· Performance of reading and manipulating XML files using existing C# approaches vs. the LinQ approach

· Performance of accessing lists of objects (arrays in this article) using traditional programming, LinQ and C# 3.0

· Performance of filling a dataset using ADO and LinQ, then performing a filter operation and summing a column of the filtered results

Before we jump into this article, I want to point out that there are many ways to accomplish a task programmatically. An effort has been made to write the best code possible, and a few extra lines were introduced so as to be fair when writing the equivalent functions for LinQ and ADO. As the code is freely shared, you are welcome to improve and build on it. The basic unit of measurement is the tick count from System.Diagnostics.Stopwatch. Each piece of performance code has been run 500 times so as to get representative performance values. Wherever possible I have added code to dispose of objects or force garbage collection. All the code explained in this article is available for download and is commented wherever required. Finally, all the performance values are added to an Excel file, Graphs.xls, and shown visually in Excel graphs.

The logic for running each function 500 times is given below; a minimal sketch of the loop follows the list.

  1. Create a dataset
  2. Start the stopwatch
  3. Run the test function (the code under the ActualFunction #region)
  4. Add the elapsed ticks to a data row
  5. Stop the watch
  6. Output the dataset to XML and finally to Graphs.xls
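
A minimal sketch of this loop is given below, assuming nothing beyond the steps above; the method name TimeTestFunction and the table/column names are illustrative, not the actual identifiers from the download.

    // Requires System, System.Data and System.Diagnostics.
    static void TimeTestFunction(Action testFunction, string resultXmlPath)
    {
        // 1. Create a dataset with one row of tick counts per pass.
        DataSet results = new DataSet("PerformanceResults");
        DataTable table = results.Tables.Add("Passes");
        table.Columns.Add("Pass", typeof(int));
        table.Columns.Add("Ticks", typeof(long));

        for (int pass = 1; pass <= 500; pass++)
        {
            // 2. Start the stopwatch.
            Stopwatch watch = Stopwatch.StartNew();

            // 3. Run the test function (the code under the ActualFunction #region).
            testFunction();

            // 4. Add the ticks to a data row, then 5. stop the watch.
            table.Rows.Add(pass, watch.ElapsedTicks);
            watch.Stop();
        }

        // 6. Output the dataset to XML; the data is later imported into Graphs.xls.
        results.WriteXml(resultXmlPath);
    }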

The configuration used for getting the current performance data is given below

OS Name: Microsoft(R) Windows(R) Server 2003, Enterprise Edition, Version 5.2.3790 Service Pack 1 Build 3790

Processor: x86 Family, Intel, ~2992 MHz

Total Physical Memory: 2,038.04 MB

Available Physical Memory: 872.52 MB

Total Virtual Memory: 2.85 GB

Page File Space: 1.00 GB

Background

What you need to run the samples

  • VS 2008
  • SQL Server 2005 with the sample AdventureWorks database installed
  • Run the InsertSP.sql script on the AdventureWorks database; this creates a simple stored procedure for inserting values into the Sales.Customer table.
  • Running the two sample console applications produces an XML file containing the tick counts for all 500 passes of each comparison; the data from these XML files is then manually imported into the Graphs.xls file.

Before you begin, read the following

If you are not familiar with LinQ or the new features of C# 3.0, please read the following URLs

Using the code

1. Inserting a row …ADO vs. LinQ using a stored procedure.

The function used for ADO is ADOInserting() in RetrieveUsingADO.sln; the LinQ counterpart is LinQInserting() in RetriveUsingLinQ.sln.
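
The two calls take roughly the shape sketched below. The stored-procedure name InsertCustomer, its parameters, the connectionString variable and the AdventureWorksDataContext class are assumptions made for illustration; the real names come from InsertSP.sql and the LinQ to SQL designer file in the download.

    // ADO.NET: call the stored procedure through a SqlCommand
    // (requires System.Data and System.Data.SqlClient).
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("InsertCustomer", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@TerritoryID", 1);     // hypothetical parameters
        command.Parameters.AddWithValue("@CustomerType", "S");
        connection.Open();
        command.ExecuteNonQuery();
    }

    // LinQ to SQL: the stored procedure dragged onto the DBML designer becomes
    // a method on the generated DataContext.
    using (AdventureWorksDataContext db = new AdventureWorksDataContext(connectionString))
    {
        db.InsertCustomer(1, "S");
    }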

2. Inserting a row …ADO vs LinQ without using a stored procedure.

The function used for ADO is ADOInsertingDirect() in RetrieveUsingADO.sln; the corresponding LinQ function is LinQInsertingDirect() in RetriveUsingLinQ.sln.
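
A rough sketch of the two direct inserts is given below; the column list, the connectionString variable and the generated Customer/AdventureWorksDataContext types are again illustrative assumptions.

    // ADO.NET: issue the INSERT statement directly
    // (requires System.Data.SqlClient; the column list is illustrative).
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand(
        "INSERT INTO Sales.Customer (TerritoryID, CustomerType) " +
        "VALUES (@TerritoryID, @CustomerType)", connection))
    {
        command.Parameters.AddWithValue("@TerritoryID", 1);
        command.Parameters.AddWithValue("@CustomerType", "S");
        connection.Open();
        command.ExecuteNonQuery();
    }

    // LinQ to SQL: create the entity object and let SubmitChanges() generate the INSERT.
    using (AdventureWorksDataContext db = new AdventureWorksDataContext(connectionString))
    {
        Customer customer = new Customer { TerritoryID = 1, CustomerType = 'S' };
        db.Customers.InsertOnSubmit(customer);
        db.SubmitChanges();
    }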

3. ADO vs. LinQ Reading from a table.

The function used for ADO is ADOReading() in RetrieveUsingADO.sln; the corresponding LinQ function is LinQReading() in RetriveUsingLinQ.sln. To make the functions do a little more work, I have added code that sums all the values of the first column after reading.
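
The two read paths look roughly like the sketch below; the SELECT statement, connectionString and the generated types are illustrative, and the actual functions are in the download.

    // ADO.NET: stream the rows with a SqlDataReader and total the first column
    // (requires System.Data.SqlClient).
    long adoSum = 0;
    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("SELECT * FROM Sales.Customer", connection))
    {
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                adoSum += reader.GetInt32(0);   // first column (CustomerID)
            }
        }
    }

    // LinQ to SQL: enumerate the mapped table and total the same column.
    long linqSum = 0;
    using (AdventureWorksDataContext db = new AdventureWorksDataContext(connectionString))
    {
        foreach (Customer customer in db.Customers)
        {
            linqSum += customer.CustomerID;
        }
    }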

4. Reading xml files ADO vs. LinQ.

The function used for ADO is xmlReading() in RetrieveUsingADO.sln; the corresponding LinQ function is LinQXmlRead() in RetriveUsingLinQ.sln. These functions first read an XML file, then apply a filter (passing values > 250 in this case) and then add up all the values of the first column.
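
A plausible shape of the two functions is sketched below, assuming the traditional side loads the file into a DataSet and the LinQ side uses LinQ to XML; the file name and element layout are illustrative.

    // Traditional side: load the XML into a DataSet, then filter and sum in a loop
    // (requires System and System.Data).
    DataSet data = new DataSet();
    data.ReadXml("values.xml");                      // file name is illustrative
    int classicSum = 0;
    foreach (DataRow row in data.Tables[0].Rows)
    {
        int value = Convert.ToInt32(row[0]);         // first column
        if (value > 250)
            classicSum += value;
    }

    // LinQ to XML side: query the same document with XDocument
    // (requires System.Linq and System.Xml.Linq).
    XDocument doc = XDocument.Load("values.xml");
    int linqSum = (from record in doc.Root.Elements()
                   let value = (int)record.Elements().First()   // first "column"
                   where value > 250
                   select value).Sum();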

5. Accessing Objects (Arrays in this case) [Traditional vs. c#3.0 vs. LinQ]

The function used for the LinQ part is LinQObjects1(), the one for the traditional C# part is csharpObjects1(), and the one for the C# 3.0 part is LinQObjects2(), all in RetriveUsingLinQ.sln.

All the functions first create an array of integers, then create a second array containing only the squares of the even numbers. The last step in each function is to add up all the values in the resultant subset.
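
For reference, the traditional loop takes roughly the shape below (the array size is illustrative and the exact code of csharpObjects1() may differ); the two LinQ variants are quoted in the Points of Interest section.

    // Build the source array of numbers (requires System.Collections.Generic).
    double[] nums = new double[1000];
    for (int i = 0; i < nums.Length; i++)
        nums[i] = i;

    // Create the subset holding the squares of the even numbers only...
    List<double> squaresOfEvens = new List<double>();
    foreach (double n in nums)
    {
        if (n % 2 == 0)
            squaresOfEvens.Add(n * n);
    }

    // ...then add up all the values in the resultant subset.
    double sum = 0;
    foreach (double square in squaresOfEvens)
        sum += square;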

6. Filling Dataset using LinQ and ADO and then performing filter operations

The function used for the LinQ part is LinQQueryDataset(), and that for the ADO part is ADODataSetQuery(). Please note that such dataset operations are resource intensive: the minimum counter value here is about 15 × 10^6 ticks.
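
One plausible shape of the two functions is sketched below. The filter condition (TerritoryID > 5), connectionString and the generated types are illustrative assumptions; the row-by-row LoadDataRow copy on the LinQ side is the call quoted in the Points of Interest section.

    // ADO side: fill a DataTable with a SqlDataAdapter, then filter and sum
    // (requires System.Data and System.Data.SqlClient).
    DataTable adoTable = new DataTable();
    using (SqlDataAdapter adapter =
        new SqlDataAdapter("SELECT * FROM Sales.Customer", connectionString))
    {
        adapter.Fill(adoTable);
    }
    long adoSum = 0;
    foreach (DataRow row in adoTable.Select("TerritoryID > 5"))
    {
        adoSum += Convert.ToInt64(row["CustomerID"]);
    }

    // LinQ side: copy the query results into a DataTable row by row, then filter and
    // sum with LinQ to DataSet (requires System.Linq and a reference to
    // System.Data.DataSetExtensions).
    DataTable linqTable = adoTable.Clone();
    using (AdventureWorksDataContext db = new AdventureWorksDataContext(connectionString))
    {
        foreach (Customer tempRec in db.Customers)
        {
            linqTable.LoadDataRow(new object[] { tempRec.CustomerID, tempRec.TerritoryID,
                tempRec.AccountNumber, tempRec.CustomerType, tempRec.rowguid,
                tempRec.ModifiedDate }, true);
        }
    }
    long linqSum = linqTable.AsEnumerable()
                            .Where(r => r.Field<int?>("TerritoryID") > 5)
                            .Sum(r => (long)r.Field<int>("CustomerID"));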

Points of Interest

I have referenced the median value rather than the mean to help reduce the effect of outliers in the graphs: on a Windows operating system there are always other processes running, and spikes in the graph do not necessarily mean a fault in the code.

· Inserting a row …ADO vs. LinQ using a stored procedure.

The median tick count of the ADO insert is far higher than that of LinQ, so LinQ wins in this situation.

Image1.JPG

· Inserting a row …ADO vs. LinQ without using a stored procedure.

      • The median value of ADO is higher than that of LinQ, which suggests that LinQ is the winner in this situation as well.

Image2.JPG

· Reading from a table (ADO vs. LinQ).

      • There is a vast difference between the median values of LinQ and ADO as far as reading from a table is concerned. ADO wins here, but is that because of ADO.NET's maturity and its tight integration with SQL Server, or because (in my opinion) LinQ adds overhead by exposing an IEnumerable<T> and creating an object for each item drawn in the LinQ to SQL DBML designer? For further investigation you could try the same test with loosely typed datasets.

Image3.JPG

· Reading xml files ADO vs. LinQ.

      • Comparing the median values for C# and LinQ, we have a close performance competition: there is a difference of only 12,304 ticks, which is tiny compared with the number of Stopwatch ticks that elapse in a second (given by Stopwatch.Frequency). LinQ marginally wins here.

Image4.JPG

· Accessing Objects (Arrays in this case) [Traditional vs. c#3.0 vs. LinQ]

      • Let's start with LinQObjects2: here the whole requirement of adding the squares of only the even numbers is narrowed down to the single statement given below. The most interesting fact is that this function registered no measurable time at all (0 ticks), so the C# 3.0 syntax wins here.
        • double sum = nums.Aggregate(delegate(double Cursum, double curNum) { if (curNum % 2 == 0) { return (Cursum + (curNum * curNum)); } else { return (Cursum + 0); } });
      • Next comes LinQObjects1, which runs a LinQ-style query over the array, as given below. Note that instead of temp == newFunction(temp) we could also have written temp % 2 == 0; I just wanted to demonstrate calling a function in the where clause of the LinQ query.
        • var getSquaresLessthen500 = from temp in nums where temp == newFunction(temp) select temp*temp;
      • Last under this category is the traditional C#-style loop syntax, as given in csharpObjects1().

Image5.JPG

· Filling Dataset using LinQ and ADO and then performing filter operations.

There is a big difference between the mean values here. I think that in the LinQ implementation, the line where we build the DataRow values and load them into the table (shown below) is where the performance hit occurs. The ADO implementation wins here.

table.LoadDataRow(new Object[] { tempRec.CustomerID, tempRec.TerritoryID, tempRec.AccountNumber, tempRec.CustomerType, tempRec.rowguid, tempRec.ModifiedDate}, true);

Image6.JPG

Conclusion

So my conclusion is that LinQ is not the overall winner (as expected!). While insert operations are faster with LinQ, read operations are clearly faster with ADO. XML operations are by and large the same (not much of a difference), and object access depends largely on the type of use, though Aggregate was a good example. LinQ to DataSets is pretty costly; I suggest using the ADO version and only using LinQ to objects when we already have the data in memory and want to query it.

For further work we should try bulk insert operations and also reads of different kinds (integer reads, string reads, block data reads and substring reads). It is hard to draw firm conclusions from the limited number of scenarios I have covered, and there is scope for improvement in each scenario, but since all the operations have been timed, the numbers should give us better insight the next time we design or architect a solution.

Mean and Median Values (Stopwatch ticks)

Scenario              Mean        Median
ADO Insert            102445.4    97645
LinQInsert            40941.11    39247
ADOInsertDirect       101625.8    96991
LinQInsertDirect      42324.78    37634
ADORead               7162452     7097164
LinQRead              14000818    13825471
csharpXMLReading      108391.1    104622
LinQXmlRead           98842.27    92318
csharpObjects1        461.6407    402
LinQObjects1          2965.475    2340
LinQObjects2          0           0
ADO DataSet           19159168    19241882
LinQ DataSet          24760066    24597140

Download source code and sample application here

Download Source_File_and_Sample_Application.zip - 473.12 KB

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.
