Hello,
I am having a performance problem when trying to insert multiple rows into a table in SQL Server. I am parsing lines from a txt file and inserting each line as a table row. The problem is that on average I have 7,000+ lines in each file, which makes my program very slow and not adequate. Any help?

Build a DataTable in memory and then do a bulk insert using SqlBulkCopy. Check out this example: http://blogs.msdn.com/b/nikhilsi/archive/2008/06/11/bulk-insert-into-sql-from-c-app.aspx[^]
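A minimal sketch of that approach, assuming a hypothetical target table dbo.FileLines with columns LineNumber and LineText (the table, column names, file path, and connection string here are all placeholders you would replace with your own):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void Main()
    {
        // Hypothetical table: dbo.FileLines(LineNumber INT, LineText NVARCHAR(MAX))
        var table = new DataTable();
        table.Columns.Add("LineNumber", typeof(int));
        table.Columns.Add("LineText", typeof(string));

        int n = 0;
        foreach (string line in File.ReadLines(@"C:\data\input.txt"))
        {
            table.Rows.Add(++n, line);
        }

        using (var conn = new SqlConnection("your connection string here"))
        using (var bulk = new SqlBulkCopy(conn))
        {
            conn.Open();
            bulk.DestinationTableName = "dbo.FileLines";
            bulk.BatchSize = 5000; // rows sent per round trip
            bulk.ColumnMappings.Add("LineNumber", "LineNumber");
            bulk.ColumnMappings.Add("LineText", "LineText");
            bulk.WriteToServer(table); // one bulk operation instead of 7k single inserts
        }
    }
}
```

The whole 7k-line file then goes to the server in a handful of batched round trips rather than one per row.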

Edit:

Sorry, I read your comment; you said that you are using a stored proc. Do you mean that you call a stored proc for every line you read? That's always going to be costly. See if you can move to the approach I gave above. Otherwise, make sure you are not opening and closing the database connection for every line read / every stored proc call...
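If you have to keep the stored proc, at least open the connection once and reuse a single parameterised SqlCommand for all the calls. A rough sketch, with the proc name, parameter, path, and connection string invented for illustration:

```csharp
using (var conn = new SqlConnection("your connection string here"))
using (var cmd = new SqlCommand("dbo.InsertLine", conn)) // hypothetical proc name
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.Add("@LineText", SqlDbType.NVarChar, -1);

    conn.Open(); // open once, not once per line
    foreach (string line in File.ReadLines(@"C:\data\input.txt"))
    {
        p.Value = line;
        cmd.ExecuteNonQuery(); // still one round trip per line, but no connect/disconnect cost
    }
}
```

This removes the per-row connection overhead, though SqlBulkCopy will still be much faster because it avoids the per-row round trips entirely.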
 
Comments
MalikRizwan 13-Oct-10 0:28am    
I wonder why this answer wasn't rated and the other one was?
One way could be to build all your queries together in a single string and then make a single call to the database. That will at least reduce the performance drag of connecting to your server every single time you perform an insert/update.
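For example (the table name, column, path, and connection string below are made up), you could concatenate a batch of INSERT statements and execute them in one command:

```csharp
using System.Data.SqlClient;
using System.IO;
using System.Text;

var sb = new StringBuilder();
foreach (string line in File.ReadLines(@"C:\data\input.txt"))
{
    // Escape single quotes; plain concatenation is only safe for trusted input
    sb.AppendFormat("INSERT INTO dbo.FileLines (LineText) VALUES (N'{0}');\n",
                    line.Replace("'", "''"));
}

using (var conn = new SqlConnection("your connection string here"))
using (var cmd = new SqlCommand(sb.ToString(), conn))
{
    conn.Open();
    cmd.ExecuteNonQuery(); // one round trip for the whole batch
}
```

Be careful with very large batches: you may want to flush the StringBuilder every few thousand statements rather than sending all 7k at once.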
 
Comments
SuperAdministrator 12-Oct-10 2:32am    
My queries are SPs, and building them in a string inside the code will cost me compile time, since they need to be compiled.
You can use bcp to insert data from the file into your DB:

http://msdn.microsoft.com/en-us/library/aa174646(SQL.80).aspx[^]
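A bcp invocation for a plain text file might look like this (the database, table, server, and file names are placeholders; -c means character data and -T uses Windows authentication):

```shell
bcp MyDatabase.dbo.FileLines in C:\data\input.txt -S myserver -T -c
```

This loads the whole file server-side in one bulk operation, so your C# program only has to launch the command.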
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
