A BackgroundWorker reads the txt files and writes them to MSSQL Server.
I want to use one BackgroundWorker multiple times.
There are lots of files and one BackgroundWorker is not enough to read them all; it takes a lot of time.

Do you have any idea?

What I have tried:

Nothing special, just some Google searching.
Comments
Mehdi Gholam 21-Aug-17 9:58am    
do...while
BillWoodruff 22-Aug-17 2:07am    
If all the files being read are written to one instance of MSSQL, I don't think a performance improvement is possible. Parallel queries are possible, however.

see: https://blogs.msdn.microsoft.com/sqlcat/2006/05/19/load-1tb-in-less-than-1-hour/

Take a look at this feature, it is really simple:

How to: Write a Simple Parallel.ForEach Loop | Microsoft Docs[^]

In this case you don't need any explicitly created worker thread at all!
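A minimal sketch of what that could look like here - the folder path and the ImportFile routine are placeholders for the poster's own read-and-insert code:

C#
using System.IO;
using System.Threading.Tasks;

string[] files = Directory.GetFiles(@"C:\Data", "*.txt");   // assumed input folder

// Parallel.ForEach partitions the file list and runs the body on
// thread-pool threads - no explicitly created worker thread needed.
Parallel.ForEach(files, file =>
{
    ImportFile(file);   // hypothetical: your "read the file, write it to SQL Server" routine
});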
 
Be careful when doing disk I/O with parallel programming, especially when using a hard disk.
Most of the time this will turn out slower than doing things sequentially!
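If you do take the Parallel.ForEach route, one way to keep a single disk from thrashing is to cap how many files are read at once - a sketch, with the limit of 2 picked arbitrarily:

C#
using System.Threading.Tasks;

// Limit how many files are processed concurrently so the disk is not
// hammered by too many simultaneous readers.
var options = new ParallelOptions { MaxDegreeOfParallelism = 2 };

Parallel.ForEach(files, options, file =>
{
    ImportFile(file);   // hypothetical: same read-and-insert routine as above
});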
 
 
A BackgroundWorker is essentially a thread - so you don't want to run one worker multiple times, you want to run multiple workers on the same code. That's easy to do: just create a new BackgroundWorker for each text file and pass each instance a different file to process. All you need for that is a loop for the files, and a copy of your existing code inside it:
C#
foreach (string file in files)
{
    BackgroundWorker work = new BackgroundWorker();
    ...   // wire up your existing DoWork / RunWorkerCompleted handlers here
    work.RunWorkerAsync(file);
}
You can then pick up the file from the e.Argument property inside the DoWork event handler.
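For example, a sketch of that handler (the name Work_DoWork is illustrative; hook it up with work.DoWork += Work_DoWork before calling RunWorkerAsync):

C#
// DoWorkEventArgs lives in System.ComponentModel
private void Work_DoWork(object sender, DoWorkEventArgs e)
{
    // RunWorkerAsync(file) delivers the file path through e.Argument.
    string file = (string)e.Argument;

    // read 'file' and write its contents to SQL Server here,
    // exactly as your current single-worker code does
}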
 
 
