Posted 1 Apr 2010

LINQ to SQL Performance Considerations

1 Apr 2010 · CPOL · 6 min read
Performance considerations when using LINQ to SQL


As developers, we have a wide array of tools at our disposal. One of the most time consuming, but most beneficial tasks we must undertake is to determine which of those tools to use for a given situation.

We must weigh very carefully the trade-offs of using many of these tools. Some tools provide us with an easy-to-use interface, but at a cost in overall performance.

Some tools give us lightning speed but at a cost in code maintainability.

Is any one tool the best solution for all situations?  If you think so, please keep on reading.

We should also not choose a tool simply because it is the current programming rage in the industry or the latest and greatest cool widget from XYZ Corporation.

Recently I have been able to do some coding using LINQ to SQL. Being an old-school stored procedure and SQL guy, I decided to roll up my sleeves and jump into the O/RM (Object-Relational Mapping) ocean, as it were.

What I found is quite interesting and has allowed me to add some interesting new tools to my programming tool box.

The Battle at Hand

Where I am currently consulting, I have been writing a manufacturing Traceability system in ASP.NET that calls a lot of stored procedures.

I have had to make some modifications to the application and decided to incorporate some LINQ to SQL into the code so I could evaluate several factors.

I wanted to see what impact writing LINQ to SQL had on the overall development and test cycle time, and to evaluate the impact the new code had on the application's performance and response time.

I decided to take one stored procedure and modify the code to try out various programming situations and compare the findings. I decided to compare the original code that calls the stored procedure with various LINQ to SQL solutions and then benchmark the findings.

Here is a scaled down version of the stored procedure I was originally using:

Create PROCEDURE [dbo].[CalcFailedTest]
	-- Add the parameters for the stored procedure here
	@StartDate datetime,
	@EndDate datetime,
	@Model varchar(2)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	Declare @StartDateTime as datetime
	Declare @EndStartDateTime as datetime
	Declare @EndDateTime as datetime
	Declare @StartTime varchar(12)
	Declare @EndTime varchar(12)
	Declare @Hours int
	Declare @TotalCnt int

	select @StartTime = StartTime from shifts
	where shiftname = 'Shift1'

	select @EndTime = StartTime,
		@Hours = Hours from shifts
	where shiftname = 'Shift2'

	-- set the begin date and end date; the datetime parameters are
	-- converted to 'yyyy-mm-dd' text before the shift time is appended
	set @StartDateTime = Convert(datetime,
		Convert(varchar(10), @StartDate, 20) + ' ' + @StartTime)
	set @EndStartDateTime = Convert(datetime,
		Convert(varchar(10), @EndDate, 20) + ' ' + @EndTime)
	set @EndDateTime = DateAdd(hh, @Hours, @EndStartDateTime)

	-- total tests run for the model, needed for the percentage
	select @TotalCnt = COUNT(PartNumber) from Inspection
	where TestTimeStamp > @StartDateTime and TestTimeStamp < @EndDateTime
		and Model = @Model

	select
		case when TestResults = 'F' then COUNT(PartNumber) else 0 end as FailedCount,
		case when TestResults = 'C' then COUNT(PartNumber) else 0 end as CancelCount,
		TestResults, FailStep,
		Convert(decimal(18,2), 1.0 * COUNT(PartNumber) / @TotalCnt * 100) as PercentFailed
	from Inspection
	where TestTimeStamp > @StartDateTime and TestTimeStamp < @EndDateTime
		and TestResults != 'P' and Model = @Model
	group by TestResults, FailStep
	order by FailedCount desc
END



Our production facility operates two shifts per day; our day runs from 03:00 AM on one day to 03:00 AM the next day. So to calculate production metrics properly, we first have to determine the start and end dates and times.
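The shift-window arithmetic is simple but easy to get wrong around the day boundary. Here is a minimal sketch, in Python for brevity since the logic is language-agnostic; the shift times and the function name are illustrative, not taken from the production schema:

```python
from datetime import datetime, timedelta

def production_window(day, shift1_start, shift2_start, shift2_hours):
    """Return (start, end) datetimes for one production day.

    The day begins at shift 1's start time and ends shift2_hours
    after shift 2's start time -- e.g. 03:00 one day to 03:00 the next.
    """
    start = datetime.strptime(day + " " + shift1_start, "%Y-%m-%d %H:%M")
    end_start = datetime.strptime(day + " " + shift2_start, "%Y-%m-%d %H:%M")
    end = end_start + timedelta(hours=shift2_hours)
    return start, end

# With illustrative shift times, the 2010-03-29 production day
# runs from 03:00 that morning to 03:00 the next morning.
start, end = production_window("2010-03-29", "03:00", "15:00", 12)
```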

The stored procedure returns all tests that were failed or cancelled, which test step the test was failed or cancelled at, and what the percentage failed was (meaning that in some cases a tester could fail or cancel a test, for the same reason, more than once a day).

In order to calculate Percentage Failed, we also have to know the total tests run for the given model. That is the reason for the @TotalCnt select statement.

The engineers want the failed and cancelled counts in different columns on the report so they can export to Excel and perform calculations; this is why we use the case statement.
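The grouping and conditional pivot that the CASE expressions perform can be sketched in a few lines of Python; the sample rows and total count below are made up for illustration:

```python
from collections import Counter

# Hypothetical non-passing test rows: (TestResults, FailStep)
rows = [("F", "Step2"), ("F", "Step2"), ("C", "Step5"), ("F", "Step7")]
total_count = 20  # total tests run for the model, passes included

# group by (TestResults, FailStep), as in the GROUP BY clause
groups = Counter(rows)

report = []
for (result, step), count in groups.items():
    report.append({
        "FailedCount": count if result == "F" else 0,  # CASE ... = 'F'
        "CancelCount": count if result == "C" else 0,  # CASE ... = 'C'
        "TestResults": result,
        "FailStep": step,
        "PercentFailed": round(100.0 * count / total_count, 2),
    })

# ORDER BY FailedCount DESC
report.sort(key=lambda r: r["FailedCount"], reverse=True)
```

Keeping failed and cancelled counts in separate columns means the export to Excel needs no further transformation.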

1st Scenario

In the first scenario, I decided to use LINQ to SQL query expressions to re-create the above results. I of course used the MSLinqToSQLGenerator (Designer) to create the object model used by the code below. Here is the code:

Program p = new Program();
SQLDataDataContext sq =
	new SQLDataDataContext(p.GetConnectionString("Production_Monitoring"));
sq.ObjectTrackingEnabled = false;

string s1StartTime = "";
string s2StartTime = "";
string sDate = "2010-03-29";

int tcount = 0;
int iHours = 0;

var stopwatch = new Stopwatch();
stopwatch.Start();

// Get Shift1 Start Time
var shift1 =
	from sft1 in sq.Shifts
	where sft1.ShiftName == "Shift1"
	select new { sft1.StartTime };

foreach (var sft in shift1)
	s1StartTime = sft.StartTime;

// Get Shift2 Start Time and Hours
var shift2 =
	from sft2 in sq.Shifts
	where sft2.ShiftName == "Shift2"
	select new { sft2.StartTime, sft2.Hours };

foreach (var sft in shift2)
{
	s2StartTime = sft.StartTime;
	iHours = Convert.ToInt32(sft.Hours);
}

// Calculate the start and end dates and times
DateTime dStartTime = Convert.ToDateTime(sDate + " " + s1StartTime);
DateTime dEndStartTime = Convert.ToDateTime(sDate + " " + s2StartTime);
DateTime dEndTime = dEndStartTime.AddHours(iHours);

// get the total model count
var icount =
	from insp in sq.Inspections
	where insp.TestTimeStamp > dStartTime &&
		insp.TestTimeStamp < dEndTime && insp.Model == "EP"
	group insp by insp.Model into grp
	select new { Count = grp.Count() };

foreach (var i in icount)
	tcount = i.Count;

// generate failed step results and calculate percentage failed
var unordered =
	from insp in sq.Inspections
	where insp.TestTimeStamp > dStartTime &&
		insp.TestTimeStamp < dEndTime && insp.Model == "EP" && insp.TestResults != "P"
	group insp by new { insp.TestResults, insp.FailStep } into grp
	select new
	{
		FailedCount = (grp.Key.TestResults == "F" ? grp.Count() : 0),
		CancelCount = (grp.Key.TestResults == "C" ? grp.Count() : 0),
		grp.Key.TestResults,
		grp.Key.FailStep,
		PercentFailed = Convert.ToDecimal(1.0 * grp.Count() / tcount * 100)
	};

var fStep =
	(from selection in unordered
	 orderby selection.FailedCount descending,
		 selection.CancelCount descending
	 select selection).ToList();  // force execution for the benchmark

Console.WriteLine("Linq - " + stopwatch.ElapsedMilliseconds);

Notice that I set sq.ObjectTrackingEnabled = false. Since the application is read only, we want to turn off the update, create and delete overhead that LINQ provides in the data context.

First I get the shift 1 start time, and then I get the shift 2 start time and hours for the shift.

I then calculate the day start and end date and times.

I then query the total count for the Model being tested.

I now can query the testers that produce failed and cancelled results and calculate the percentage failed.

2nd Scenario

In the second scenario, we will examine the call to the stored procedure using ADO.NET and my data layer class.

Please refer to my previous article for information on how I implement a data layer. My previous article is called "Optimize your Data Layer for quicker Code Development".

Here is the code:

sqlServer DB = new sqlServer();
SqlConnection DBConnect = null;
SqlDataReader ESDR = null;
string[] strParms = new string[3];


strParms[0] = "2010-03-29";
strParms[1] = "2010-03-29";
strParms[2] = "EP";

DB.Connect("Production_Monitoring", ref DBConnect);
DB.GetLookUpData("CalcFailedTest", "@StartDate,@EndDate,@Model",
ref ESDR, DBConnect, strParms);

Console.WriteLine("DataLayer = " + stopwatch.ElapsedMilliseconds);

As you can see, using ADO.NET and a streamlined data layer, there is not much code to implement. We create an instance of the data layer and the SqlConnection and SqlDataReader objects.

I set up the strParms array with the arguments to pass to the stored procedure, get the connection string from the app.config file, and call the stored procedure.

3rd Scenario

In this scenario, I switched back to LINQ in order to call the above stored procedure.

This way, I can compare both maintainability and performance comparisons between this and the other scenarios.

Here is the LINQ code to call a stored procedure:

Program p = new Program();
SQLDataDataContext sq = 
	new SQLDataDataContext(p.GetConnectionString("Production_Monitoring"));
sq.ObjectTrackingEnabled = false;

string sDate = "2010-03-29";
DateTime dStart = Convert.ToDateTime(sDate);
DateTime dEnd = Convert.ToDateTime(sDate);

var lsp = sq.CalcFailedTest(dStart, dEnd, "EP").ToList();


Console.WriteLine("Linq SP - " + stopwatch.ElapsedMilliseconds);

Here I create a new data context and set the ObjectTrackingEnabled property to false. I set up the date parameters and call the stored procedure.

Notice that I call the .ToList() method so all the results will be processed in order to get accurate benchmark times.
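Deferred execution is why the .ToList() call matters: a LINQ query runs only when its results are enumerated, so stopping the timer before enumeration would measure almost nothing. Python generators show the same effect, with a sleep standing in for per-row database cost:

```python
import time

def slow_query():
    # Stand-in for a deferred query: no work happens until
    # the results are actually consumed.
    for i in range(3):
        time.sleep(0.01)  # simulated per-row cost
        yield i

t0 = time.perf_counter()
deferred = slow_query()            # building the query: essentially free
build_time = time.perf_counter() - t0

t0 = time.perf_counter()
results = list(deferred)           # like .ToList(): forces execution
run_time = time.perf_counter() - t0

# build_time is on the order of microseconds; run_time is ~30 ms
```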

4th Scenario

In the fourth scenario, I will use LINQ to call a user-defined function. User-defined functions are more flexible than stored procedures as you can call user-defined functions from stored procedures as well as from LINQ query expressions.

Here is the code:

Program p = new Program();
SQLDataDataContext sq =
	new SQLDataDataContext(p.GetConnectionString("Production_Monitoring"));
sq.ObjectTrackingEnabled = false;

string sDate = "2010-03-29";
DateTime dStart = Convert.ToDateTime(sDate);
DateTime dEnd = Convert.ToDateTime(sDate);

var lfn = sq.fn_CalcFailedTest(dStart, dEnd, "EP").ToList();

Console.WriteLine("Linq FN - " + stopwatch.ElapsedMilliseconds);

As you can see, there is no difference between calling a stored procedure and a user-defined function. But, as indicated, user-defined functions can be used in a wider range of scenarios.


Now that you have viewed all of the scenarios, we can review the performance benchmarks:

Currently in our production database, the Inspection table contains 182,000 rows. The result of the above queries returns 14 rows.

I ran these queries from a Visual Studio C# console application and benchmarked the running queries using the Stopwatch class, which is located in the System.Diagnostics namespace.

In order to get a good sample size, I ran the console application 5 times.
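The measurement pattern itself, timing each scenario per run with a monotonic clock, carries over to any platform. Here is a minimal harness in Python with a placeholder workload standing in for one query scenario:

```python
import time

def benchmark(fn, runs=5):
    """Time fn over several runs, mirroring the Stopwatch approach."""
    timings_ms = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn()
        timings_ms.append((time.perf_counter() - t0) * 1000.0)
    return timings_ms

# Placeholder workload in place of a real database query
timings = benchmark(lambda: sum(i * i for i in range(50_000)))
```

Note that the first run is often slower (connection setup, plan compilation, JIT), which is visible in the LINQ and LINQ FN rows below.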

Here are the results:

Scenario      Run 1   Run 2   Run 3   Run 4   Run 5
LINQ            175     171     165     165     164
Data Layer       77      77      77      77      77
LINQ SP          90      88      89      88      89
LINQ FN         100      83      83      83     100

(all times in milliseconds)


I assumed that LINQ would be slower than the data layer, but what surprised me was the size of the difference.

Complex queries that make multiple round trips to the database and must translate the LINQ code to SQL for each query appear to cause some performance issues.

The performance of calling stored procedures and stored functions from LINQ is not that much different from the data layer query.

Using LINQ to SQL really shortens the coding process once you get past the learning curve. I am still working through that curve myself.

Code maintainability is manageable and is similar to using my data layer. But the O/RM object mappings and the type-safe, declarative approach do provide major benefits to the job of developing code.

So it makes a lot of sense to consider in which situations you want to use query expressions to query the database, and in which situations you want to use LINQ to call stored procedures or user-defined functions.

The moral of the story is to use some judgment and perform some benchmarking and profiling before you choose which tool to use from your tool box.

For me, this learning experience has added a new tool to my tool box.



This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Joe Pitz
Software Developer Objetek Systems
United States
I am currently consulting for Denso Wireless, writing manufacturing traceability systems in C#. Denso Wireless manufactures navigation systems for some of the major automotive manufacturers.

The manufacturing systems that I develop interface directly to assembly line hardware that collects manufacturing metrics and data in real time and stores this data in a SQL Server database instance.

The Traceability system also provides engineering and management real-time access to the gathered metrics and data from an internationally accessible web site that is developed in

Comments and Discussions

ronlease (6-Apr-10), Re: Pre-compiled LinqToSQL-Queries:
Thanks for the article. I agree that precompiled queries would run a bit faster and I would like to see the results compared to your other tests. However, I still think data operations should be kept as close to the database as possible in situations like this.