Hi Guys

I would like to know the best way to handle imports of CSV files with large volumes of data (between 200,000 and 1,200,000 records).

Initially, when I built the module, the files only went up to around 20,000 records.

I am currently using a TextFieldParser and, while reading each record, adding the record to an object and adding the object to the database.
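
Roughly, the current code looks like this (a simplified sketch; InsertRecord stands in for my actual mapping and database call, and the comma delimiter is an assumption):

Imports Microsoft.VisualBasic.FileIO

Sub ImportCsv(path As String)
    Using parser As New TextFieldParser(path)
        parser.TextFieldType = FieldType.Delimited
        parser.SetDelimiters(",")
        While Not parser.EndOfData
            Dim fields As String() = parser.ReadFields()
            ' One object and one INSERT per record, so one database
            ' round trip for every row in the file.
            InsertRecord(fields)
        End While
    End Using
End Sub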

The issue is that with larger imports like these, an import now takes 15 to 20 minutes, which I need to optimize.

I am using an Oracle database.

What I have tried:

What I have now tried is reading the records into a DataTable and then doing a bulk insert of the DataTable. I would just like to know if there is a better way to handle these bulk imports.
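
The bulk attempt looks roughly like this (a sketch; it assumes the unmanaged ODP.NET provider, Oracle.DataAccess.Client, which exposes OracleBulkCopy, and the staging table name is a placeholder):

Imports System.Data
Imports Oracle.DataAccess.Client

Sub BulkInsert(table As DataTable, connectionString As String)
    Using conn As New OracleConnection(connectionString)
        conn.Open()
        Using bulk As New OracleBulkCopy(conn)
            bulk.DestinationTableName = "IMPORT_STAGING" ' placeholder table name
            bulk.BatchSize = 10000        ' send rows to the server in chunks
            bulk.BulkCopyTimeout = 600    ' seconds; large files need headroom
            bulk.WriteToServer(table)
        End Using
    End Using
End Sub
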
Comments
Mehdi Gholam 3-Oct-16 3:42am    
Show your code.
Sinisa Hajnal 3-Oct-16 4:26am    
You could try to split the file into several related files and run import threads in parallel?
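
For illustration, that suggestion might look something like this (a sketch only; it assumes the file has already been split into chunk files, reuses the ImportCsv routine sketched above, and picks an arbitrary degree of parallelism):

Imports System.Collections.Generic
Imports System.Threading.Tasks

Sub ImportChunksInParallel(chunkPaths As IEnumerable(Of String))
    ' Import each pre-split chunk file on its own worker.
    ' MaxDegreeOfParallelism = 4 is a starting point, not a tuned value.
    Parallel.ForEach(chunkPaths,
                     New ParallelOptions With {.MaxDegreeOfParallelism = 4},
                     Sub(path) ImportCsv(path))
End Sub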

1 solution

Look at my recent article. It should be quite simple to convert to VB.Net.

CSV File Parser[^]

You could also just put it into a DLL assembly and call it from your VB app.

As far as bulk inserts go, I would write a stored procedure and submit the records one at a time. That would be faster.
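
Roughly like this (a sketch only; IMPORT_RECORD and its parameters are made-up names to show the shape of the call, not a real procedure):

Imports System.Data
Imports Oracle.DataAccess.Client

Sub InsertViaProcedure(conn As OracleConnection, fields As String())
    ' IMPORT_RECORD is a hypothetical stored procedure; map one
    ' parameter per CSV column in the real schema.
    Using cmd As New OracleCommand("IMPORT_RECORD", conn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.Add("p_field1", OracleDbType.Varchar2).Value = fields(0)
        cmd.Parameters.Add("p_field2", OracleDbType.Varchar2).Value = fields(1)
        cmd.ExecuteNonQuery()
    End Using
End Sub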
 