I have the following loop in my program:

dtPertdata is a DataTable.

for (int a = 0; a < dtPertdata.Rows.Count; a++)
{
	DataRow dr2 = dtPertdata.Rows[a];
	if (a >= 10)
	{
		DataRow dr1 = dtPertdata.Rows[a - 10];
		double rt2 = Convert.ToDouble(dr2["value"]);
		double rt1 = Convert.ToDouble(dr1["value"]);

		double shock10 = rt2 - rt1;
		double pertYield = shock10 + Convert.ToDouble(dr2["cur_yield"]);

		dr2["shock10"] = shock10.ToString();
		dr2["pertYield"] = pertYield.ToString();		
	}	
}
dtPertdata.AcceptChanges();



Please suggest how to do this operation without using a loop, e.g. using LINQ or some other method.
Posted
Updated 20-May-15 2:56am
Comments
Tomas Takac 20-May-15 9:12am
Why? What's wrong with the loop?
Code For You 20-May-15 9:21am
How many rows will there be in the dtPertdata table at any given point in time?
pparya27 20-May-15 11:03am
30 thousand rows at a time. :(

There is nothing to optimize here. LINQ is not inherently more performant than a "conventional" approach; it's just often more compact, faster to write and easier to read. Theoretically it would be possible to write your code in some extension-method syntax, but it would not be faster because it would just do the same loop "under the hood". And in this case (assigning new values to an existing object) it would not be easier to read and would be barely more compact. Your code is fine (if it works like it should).
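For illustration, here is roughly what such an extension-method version could look like (a sketch only, reusing the column names from the question); note that it still iterates every row internally, exactly like the original loop:

```csharp
// Sketch only: the same per-row work expressed in LINQ extension-method syntax.
// The column names ("value", "cur_yield", "shock10", "pertYield") come from the
// question. Internally this still loops over every row, so it is not faster.
using System;
using System.Data;
using System.Linq;

class LinqSketch
{
    public static void Apply(DataTable dtPertdata)
    {
        foreach (var pair in Enumerable.Range(10, Math.Max(0, dtPertdata.Rows.Count - 10))
                     .Select(i => new { Cur = dtPertdata.Rows[i], Prev = dtPertdata.Rows[i - 10] }))
        {
            double shock10 = Convert.ToDouble(pair.Cur["value"]) - Convert.ToDouble(pair.Prev["value"]);
            pair.Cur["shock10"] = shock10.ToString();
            pair.Cur["pertYield"] = (shock10 + Convert.ToDouble(pair.Cur["cur_yield"])).ToString();
        }
        dtPertdata.AcceptChanges();
    }
}
```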
   
Comments
pparya27 20-May-15 11:10am
Sascha Lefèvre, thanks for the reply, this code is working fine. Actually there will be some 30 thousand records in the dtPertdata DataTable, so this loop takes a few minutes to run. I'm now planning to move this logic into a stored procedure in Oracle; I think that will take less time... will give it a try!
Sascha Lefèvre 20-May-15 11:19am
You're welcome! :)

What you could do (but that's not related to LINQ) is to parallelize it: break it into batches of maybe 5000 rows each and execute this loop on those batches in parallel.
But the stored-procedure approach will probably also make it a lot faster.
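A sketch of that idea (assuming the column names from the question): since DataTable is not documented as thread-safe for concurrent writes, one safer variant snapshots the inputs into plain arrays, parallelizes the arithmetic with Parallel.For, and writes the results back on a single thread:

```csharp
// Illustrative sketch: parallelize the arithmetic, not the DataTable access.
// DataTable writes are not thread-safe, so inputs are copied out first and
// results are written back sequentially afterwards.
using System;
using System.Data;
using System.Threading.Tasks;

class ParallelSketch
{
    public static void Apply(DataTable dtPertdata)
    {
        int n = dtPertdata.Rows.Count;

        // Snapshot the inputs into plain arrays (single-threaded).
        var values = new double[n];
        var yields = new double[n];
        for (int i = 0; i < n; i++)
        {
            values[i] = Convert.ToDouble(dtPertdata.Rows[i]["value"]);
            yields[i] = Convert.ToDouble(dtPertdata.Rows[i]["cur_yield"]);
        }

        // Each iteration writes only its own slot, so this is safe to parallelize.
        var shock = new double[n];
        Parallel.For(10, n, i => shock[i] = values[i] - values[i - 10]);

        // Write the results back on a single thread.
        for (int i = 10; i < n; i++)
        {
            dtPertdata.Rows[i]["shock10"] = shock[i].ToString();
            dtPertdata.Rows[i]["pertYield"] = (shock[i] + yields[i]).ToString();
        }
        dtPertdata.AcceptChanges();
    }
}
```

For only 30 thousand rows of simple subtractions the parallel speedup may be modest; most of the original cost is likely in the per-row DataTable access and conversions rather than the arithmetic.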
pparya27 22-May-15 12:18pm
Hi Sascha Lefèvre, I actually managed to do this even without a stored procedure, with a simple MERGE statement in Oracle. I added an additional column "srno" to the table, indicating the row number. Here is the merge statement used:

merge into dtpertdata a using dtpertdata b
on ( a.srno >= 11 and b.srno = a.srno - 10 )
when matched then
update set a.shocks10 = (a.value - b.value), a.pertyield = a.cur_yield + (a.value - b.value)
;

and it runs in just a fraction of a second!
Sascha Lefèvre 22-May-15 12:40pm
   
Thank you for your feedback! Glad you found a good solution for this - cheers! :)
As already noted by others, using LINQ would not be faster; it only hides the loops that it uses internally.

However, there is one optimisation:
Just start your loop at index 10.
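In other words, the bounds check moves into the for header and the if disappears entirely (same logic as in the question, minus the ten dead iterations):

```csharp
// The loop from the question, started at index 10 so the "if (a >= 10)"
// check is no longer needed. Column names are taken from the question.
using System;
using System.Data;

class NoIfSketch
{
    public static void Apply(DataTable dtPertdata)
    {
        for (int a = 10; a < dtPertdata.Rows.Count; a++)
        {
            DataRow dr2 = dtPertdata.Rows[a];
            DataRow dr1 = dtPertdata.Rows[a - 10];
            double shock10 = Convert.ToDouble(dr2["value"]) - Convert.ToDouble(dr1["value"]);
            dr2["shock10"] = shock10.ToString();
            dr2["pertYield"] = (shock10 + Convert.ToDouble(dr2["cur_yield"])).ToString();
        }
        dtPertdata.AcceptChanges();
    }
}
```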
   
Comments
pparya27 20-May-15 11:04am
Yes, I've done that; actually I raised this question before doing that optimization.
