I have a few records to save into the database, approximately 12 KB each. They arrive in batches of 10 to 15 records. I have two approaches to choose from.

1. Iterating over the records in C# and sending them to the database one by one (10 to 15 requests to the DB, ~12 KB each).
2. Pushing all records to the database at once and letting a stored procedure do the job of iterating and saving each record (only one request to the DB, 120 KB to 180 KB).

This will be used in a job configured to run every day at midnight. Please suggest which approach is better and why.

What I have tried:

I have the code for both approaches ready; only the approach remains to be decided.
Posted
Updated 22-Mar-16 19:49pm
Comments
PIEBALDconsult 23-Mar-16 1:41am    
In your implementation 1, I trust you are using parameters?
In your implementation 2, just how are you implementing "Pushing all records at once in DB"?
And if you are using SQL Server, have you considered using a table-valued-parameter?
https://msdn.microsoft.com/en-us/library/bb510489.aspx
https://blogs.msdn.microsoft.com/sqlcat/2013/09/23/maximizing-throughput-with-tvp/
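The table-valued-parameter route those links describe can be sketched roughly as follows. All type, procedure, table, and column names here are made up for illustration; your schema will differ. On the SQL Server side, you define a table type once and a procedure that accepts the whole batch as one set-based insert:

```sql
-- Illustrative sketch: a table type matching the shape of one record,
-- and a procedure that takes the whole batch in a single call.
CREATE TYPE dbo.RecordTableType AS TABLE
(
    Id      INT           NOT NULL,
    Payload NVARCHAR(MAX) NOT NULL
);
GO

CREATE PROCEDURE dbo.SaveRecords
    @Records dbo.RecordTableType READONLY
AS
BEGIN
    -- Set-based insert: no row-by-row iteration needed.
    INSERT INTO dbo.Records (Id, Payload)
    SELECT Id, Payload FROM @Records;
END
```

On the C# side, the batch travels as a single structured parameter in one round trip:

```csharp
// Illustrative sketch: build a DataTable matching the table type
// and send the whole batch in one request.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Payload", typeof(string));
foreach (var r in records)               // 'records' is your in-memory batch
    table.Rows.Add(r.Id, r.Payload);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.SaveRecords", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@Records", table);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.RecordTableType";
    conn.Open();
    cmd.ExecuteNonQuery();               // one round trip for the whole batch
}
```

This gives you the single-request benefit of approach 2 without writing your own iteration logic in the stored procedure.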
CHill60 23-Mar-16 7:03am    
"Coming" from where?
If it's another database, then use a linked server (and do not iterate the records; RDBMSs are set-based).
If it's from a file (text, Excel, etc.), then use bulk-loading techniques to load the entire file.
You have not provided enough information for a complete answer.
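For the file-based case, the bulk-loading technique mentioned above could look roughly like this with `SqlBulkCopy`; the table name is a placeholder, and `table` stands for a `DataTable` you have already built from the parsed file:

```csharp
// Illustrative sketch: stream an already-parsed file into a table
// in one bulk operation instead of row-by-row inserts.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.Records";  // placeholder name
        bulk.WriteToServer(table);                  // DataTable built from the file
    }
}
```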
PIEBALDconsult 23-Mar-16 20:04pm    
Never had good luck with linked servers.
AshishVishwakarma 24-Nov-16 5:58am    
It's a WCF Service.

1 solution

"Which is better and why?" - there's a lot of information we don't have before we could answer that. In either case/approach, have you thought about transactions/scope and recovery (i.e. what happens if an insert can't happen)? Do the raw transactions get stored somewhere as a backup? What audit procedures do you have - e.g. (maybe) a daily report of what has been entered into the database, and/or an alert if there's any sort of error (e.g. can't connect to the database) so the transactions can be picked up later?

What is the context of the data? Do all the records have to be inserted for the run/batch to be complete, or does the failure of one insert in case 1 invalidate the entire process?
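If the batch does have to be all-or-nothing, approach 1 needs an explicit transaction around the loop; a minimal sketch (table and column names are placeholders, `records` is the in-memory batch):

```csharp
// Sketch: wrap the per-record inserts in one transaction so a single
// failed insert rolls back the whole nightly batch.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        try
        {
            foreach (var r in records)
            {
                using (var cmd = new SqlCommand(
                    "INSERT INTO dbo.Records (Id, Payload) VALUES (@id, @payload)",
                    conn, tx))
                {
                    cmd.Parameters.AddWithValue("@id", r.Id);
                    cmd.Parameters.AddWithValue("@payload", r.Payload);
                    cmd.ExecuteNonQuery();
                }
            }
            tx.Commit();     // all 10-15 records, or none
        }
        catch
        {
            tx.Rollback();
            throw;           // let the job's error handling alert/retry
        }
    }
}
```

In approach 2 the equivalent would be a `BEGIN TRANSACTION` / `COMMIT` / `ROLLBACK` inside the stored procedure.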

'Coming', you say? From where/what? Does the source system back up the data?

Sorry, I'm a picky old fart, but the minute you don't know where your data is or what state it's in, you're in trouble. Approach 1 or 2 may not differ much time-wise, but I would have thought only you can answer the questions around data integrity/safety, and those are likely more important for a seemingly small amount of data.

Last question ... what happens when your boss comes to you and says you're going from 10-15 records a night to 10,000-15,000 records? What will you do then?
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


