I have a stored procedure that loads data into a Handsontable grid. When there are around 7 lakh (700,000) records, it takes more than 40 seconds to load, and the client is asking to reduce the load time to 10 seconds.

I have optimized my stored procedure, but no luck.

Now my question: is there any technique to load part of the data, say 1 lakh (100,000) records, and let the user start working in the UI while I load the remaining records in the background?

For example, multithreading.
How can I achieve it?

What I have tried:

Still working on it; I tried AutoMapper, but no luck.
Updated 5-Mar-23 20:20pm
Member 15627495 6-Mar-23 0:46am    
A ThreadPool will handle the loading of the data immediately, without blocking your GUI.

Why don't you show your code? Maybe some rewriting is needed.
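A minimal sketch of that idea, assuming a WinForms/WPF-style caller (the FetchRateCards stand-in and LoadRateCardsAsync name are hypothetical, not the poster's actual code): run the slow fetch on a thread-pool thread via Task.Run, and only touch the grid after the await resumes on the UI context.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

class RateCardLoader
{
    // Hypothetical stand-in for the stored-procedure call that takes ~40 s.
    static List<int> FetchRateCards() => new List<int> { 1, 2, 3 };

    // Task.Run moves the blocking database call onto a thread-pool thread,
    // so the GUI message loop keeps running; after the await, execution
    // resumes on the captured (UI) context, where binding the grid is safe.
    public static async Task<List<int>> LoadRateCardsAsync()
    {
        List<int> ratecards = await Task.Run(() => FetchRateCards());
        return ratecards;
    }
}
```

Note this only keeps the UI responsive; the 40 seconds of server and transfer time still elapse before the data appears.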
murkalkiran 6-Mar-23 0:59am    
List<ratecarddata2> ratecards = new List<ratecarddata2>();

var parameters = new ObjectParameter[] {
    new ObjectParameter("pRateTableId", rateTableId),
    new ObjectParameter("RateCardColNo", tmpRateCardColNo),
    new ObjectParameter("CategoryId", tmpcategoryId)
};
var result1 = _dbcontext.ObjectContext
    .ExecuteFunction<sp_getratecard_data_result>("SP_GetRATECARD_DATA", parameters);

var config = new MapperConfiguration(cfg =>
    cfg.CreateMap<sp_getratecard_data_result, ratecarddata2>());
IMapper iMapper = config.CreateMapper();
ratecards = iMapper.Map<List<sp_getratecard_data_result>, IEnumerable<ratecarddata2>>(result1.ToList()).ToList();
return ratecards;

This RateCards call returns 7 lakh (700,000) records.
_Asif_ 6-Mar-23 1:02am    
Why don't you look at server-side pagination? 700K rows is a big number and may cause performance issues at every level. As per industry practice, pagination is the usual approach in such cases.
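A hedged sketch of server-side paging (the generic GetPage helper is illustrative, not the existing stored procedure): request one page at a time with Skip/Take. On an EF IQueryable the same pattern translates to SQL OFFSET ... FETCH, so only pageSize rows cross the wire per request.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Paging
{
    // Returns one 1-based page of results. Against an EF IQueryable,
    // Skip/Take compiles to OFFSET/FETCH on the server, so the client
    // only ever receives pageSize rows per round trip.
    public static List<T> GetPage<T>(IEnumerable<T> source, int pageNumber, int pageSize)
    {
        if (pageNumber < 1) throw new ArgumentOutOfRangeException(nameof(pageNumber));
        return source.Skip((pageNumber - 1) * pageSize).Take(pageSize).ToList();
    }
}
```

For a stored procedure, the equivalent is adding page-number and page-size parameters and an ORDER BY ... OFFSET/FETCH clause inside the SP, then calling it once per page as the user scrolls.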
Member 15627495 6-Mar-23 1:53am    
You're allocating a lot of variables - over-creating them: A into B, then B into C, then C into D...
It's "heavy".

Every time you create a variable just to fill it, you spend time.

The data in all those containers is the same data; it's useless to hold the same data twice.

The two relevant lines:
ratecards = iMapper.Map<List<sp_getratecard_data_result>, IEnumerable<ratecarddata2>>(result1.ToList()).ToList();
return ratecards;

// become:
return iMapper.Map<List<sp_getratecard_data_result>, IEnumerable<ratecarddata2>>(result1.ToList()).ToList();

Mainly, you're filling a list from the query result just to map it, then returning the result.
Try to rewrite your function with fewer variable creations.

You're using 'result1' as the database feedback, then converting it into a List again.
You can really make it shorter.

Sure, it works - but within 40 seconds.
- Try to use fewer variables; do your best.
- Look at all the types and containers used in your function, and eliminate at least two, maybe three of them.
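One way to read this advice as code (a sketch; the SpRow/RateCard types are simplified stand-ins for the poster's sp_getratecard_data_result and ratecarddata2, and the hand-written projection replaces AutoMapper): project each row straight to the DTO and materialize exactly once, instead of ToList() -> Map -> ToList() with a spare local in between.

```csharp
using System.Collections.Generic;
using System.Linq;

static class Shortened
{
    // Stand-ins for the poster's entity and DTO types.
    public record SpRow(int Id);
    public record RateCard(int Id);

    // Single pass: one Select, one ToList. No intermediate List of entities,
    // no extra 'ratecards' local holding the same data a second time.
    public static List<RateCard> MapOnce(IEnumerable<SpRow> result1) =>
        result1.Select(r => new RateCard(r.Id)).ToList();
}
```

Fair to note, though, that with 700,000 rows the dominant cost is the server and the transfer, so this trims allocations more than it trims the 40 seconds.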
murkalkiran 6-Mar-23 2:27am    
Thanks, will try.

1 solution

Threading won't help: the operation that takes time isn't part of "your code".

Stored procedures are - obviously - executed at the server, and the problem is that once you start the operation on the server, it takes time to get the result set back to you. That has three parts:
1) Start the SP on the server.
2) Execute the SP on the server to assemble the rows you want.
3) Return the data to you.

Parts 2 and, to an extent, 3 are where the time is taken - part 1 is quick, normally near-instantaneous - but 700,000 rows is a lot of data.
If each row averages out at 1,000 bytes, then you are looking at transferring the thick end of 3/4 of a gigabyte of data, which is going to take significant time all on its own - over a CAT5 LAN you are looking at 6-10 seconds just for that, and that won't even start until the SP is complete! If WiFi is involved, that time will increase, and the internet could slow the whole process down significantly.

And almost certainly, it's the SP that you need to optimise: if you assume the worst-case LAN transfer of 10 seconds, then 30 of those 40 seconds are the server processing the SP! So look there and start seeing what it does.

But ... I'd have to wonder what the heck you are doing with 700,000 rows of data that the user is even aware of it. You can't be showing it all to the user - that would be incredibly silly - so how is he noticing how long the query takes?
I'd start by looking at what you do with the data and working out whether you actually need it all in the first place, because in most cases you don't, and you are wasting significant time fetching data you don't need.
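A sketch of that "fetch only what you need" idea (the FullRow/GridRow types and column names are illustrative assumptions, not the poster's schema): project just the columns the grid actually displays before materializing. Against an EF IQueryable this becomes a narrower SELECT, cutting the bytes transferred per row.

```csharp
using System.Collections.Generic;
using System.Linq;

static class Projection
{
    // Illustrative wide entity vs. the two columns the grid shows.
    public record FullRow(int Id, string Name, string Description, byte[] Blob);
    public record GridRow(int Id, string Name);

    // Select only the displayed columns; on an EF IQueryable this
    // translates to SELECT Id, Name ... instead of pulling every column
    // (including large text/blob fields) for all 700,000 rows.
    public static List<GridRow> ForGrid(IEnumerable<FullRow> rows) =>
        rows.Select(r => new GridRow(r.Id, r.Name)).ToList();
}
```

Combined with server-side paging, this attacks both halves of the cost: fewer rows per request, and fewer bytes per row.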

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
