It seems to me that the major problem is not in your code, but on the server side. From your question, I can probably assume that if you do the same operations sequentially, your client works fine. Please confirm it. If you have not explicitly tried it with your new code, please do. The simplest way to try it is this: make sure your makeRequestList has only one element. Better yet, simply call Execute in a loop directly, in the calling thread. If it works and gives you satisfactory results (but slowly), the server side is almost certainly the problem.
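For example, such a sequential baseline could look like this (makeRequestList and Execute are the names I assume from your code; substitute your own):

// Hypothetical sequential baseline: no threads, one request at a time.
foreach (var request in makeRequestList)
{
    Execute(request); // the same call your worker threads make
}
// If this completes correctly (just slowly), the client logic is sound
// and the timeouts most likely come from the server side.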
Now, let's see what your threading can possibly give you. Imagine you execute several threads in parallel, each working with a different remote host. In this case, the gain is apparent: the bottleneck is either in the network traffic somewhere in the middle of the network, or at the service. In either case, your CPU usage is not a bottleneck: the client system rarely uses the network this extensively elsewhere, so your threads spend considerable time in the wait state, waiting for more data in the network streams, and thus do not waste CPU resources in vain. Everything would be good.
What happens if you poll the very same service host? Well, it depends. If the service is a big Web farm with load balancing and other powerful things, your threading approach will work. But if the host is just a single computer with 2-4 cores, the Web service/site/application itself can become the bottleneck, so most of your threads would be wasted. Which case is yours? What's in the request list? If those are requests to the same host, running all your threads at once might not be a solution; such a "solution" can make things even slower, because of the overhead of the threading itself. You might need a sequential set of requests, or just a few active threads at a time, say 2-6. You can experiment to find the close-to-optimum number (when your functionality is fixed, of course); one possible way to limit the number of simultaneous requests is sketched below.
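Here is a minimal sketch of one way to cap concurrency; the names Request and Execute stand in for whatever you have in your code, and the limit of 4 is just a placeholder you would tune:

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledRequests
{
    // Runs Execute for every request, but never more than maxParallel at a time.
    public static void RunAll(IEnumerable<Request> requests, int maxParallel = 4)
    {
        using (var gate = new SemaphoreSlim(maxParallel))
        {
            var tasks = new List<Task>();
            foreach (var request in requests)
            {
                gate.Wait(); // block until one of the slots is free
                tasks.Add(Task.Run(() =>
                {
                    try { Execute(request); }   // your existing per-request work
                    finally { gate.Release(); } // free the slot even on failure
                }));
            }
            Task.WaitAll(tasks.ToArray());
        }
    }

    static void Execute(Request request) { /* your existing code */ }
    public class Request { } // placeholder for your request type
}

Note that Task.Run uses the thread pool, so you also avoid creating one thread per request.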
But your problem is more serious: timeouts. If the assumption I formulated in my first paragraph is correct, the server side is the problem. You overwhelm the host with too many requests, the responses take too much time for such settings, and you get many timeouts. There is not much you can do about this situation unless you address the problem on the service side. Even if you make your data downloads "more sequential", some other client may still overwhelm the server part. This is not right.
Now, about some problems of your code. First, creating all the threads as in the second fragment of code shown is not good. You would be better off using threads from the thread pool, or reusing some fixed set of threads, which I personally prefer to do. Such a thread sleeps at a blocking call (such as with the use of EventWaitHandle) and is awakened when a task is ready to start. One comprehensive approach is using System.Collections.Concurrent.BlockingCollection<T>. See also my article where I explain all the important details and provide interesting usage samples: Simple Blocking Queue for Thread Communication and Inter-thread Invocation[^].
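A minimal sketch of such a fixed set of reusable worker threads, using BlockingCollection<T> (the WorkItem type and the Process method are placeholders for your request data and per-request work):

using System.Collections.Concurrent;
using System.Threading;

class WorkerPoolSketch
{
    public class WorkItem { public string Url; } // hypothetical; hold whatever a request needs

    readonly BlockingCollection<WorkItem> queue = new BlockingCollection<WorkItem>();

    public void StartWorkers(int workerCount)
    {
        for (int index = 0; index < workerCount; ++index)
        {
            var worker = new Thread(() =>
            {
                // GetConsumingEnumerable blocks while the queue is empty,
                // so an idle worker sleeps instead of spinning.
                foreach (WorkItem item in queue.GetConsumingEnumerable())
                    Process(item); // your per-request work goes here
            }) { IsBackground = true };
            worker.Start();
        }
    }

    public void Enqueue(WorkItem item) { queue.Add(item); }
    public void Shutdown() { queue.CompleteAdding(); } // lets the workers drain the queue and exit

    void Process(WorkItem item) { /* ... */ }
}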
I already appreciated your way of passing multiple parameters to a thread. However, I know an even better way, with a thread wrapper. Please see my past answers: How to pass ref parameter to the thread[^], Change parameters of thread (producer) after it is started[^], MultiThreading in C#[^]; see also: Running exactly one job/thread/process in webservice that will never get terminated (asp.net)[^], Making Code Thread Safe[^]. A minimal sketch of the idea follows.
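The point of a thread wrapper is that the "parameters" become instance fields of a class and the thread's start method is an instance method, so no parameter-passing mechanism is needed at all. A sketch, with purely illustrative names:

using System.Threading;

class RequestWorker
{
    readonly string url;    // whatever the thread body needs to know
    readonly int timeoutMs;
    Thread thread;

    public RequestWorker(string url, int timeoutMs)
    {
        this.url = url;
        this.timeoutMs = timeoutMs;
    }

    public void Start()
    {
        thread = new Thread(Body); // no ParameterizedThreadStart, no casting from object
        thread.Start();
    }

    public void Join() { thread.Join(); }

    void Body()
    {
        // the instance method sees url and timeoutMs directly, with full type safety
        // ... perform the request here ...
    }
}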
I would also question the general code design, where you get data from files and put it into other files, many files… However, I don't know your problem well enough to be sure; maybe it makes sense.
The big problem would be the blocking of exception propagation using catch { }. There are only a few cases where it makes sense, mostly when you have to work around defects in third-party code which is not accessible for patching.
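For illustration only, instead of an empty catch, the top of a thread body can handle the specific exceptions it actually expects (Log, Execute and Request here are placeholders for your own code):

void ThreadBody(object state)
{
    try
    {
        Execute((Request)state); // the call that may time out
    }
    catch (System.Net.WebException e)
    {
        Log(e); // placeholder for whatever logging/reporting you use
        // react explicitly: retry, mark this request as failed, etc.
    }
    // any other exception type propagates, so real defects are not hidden
}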
And, finally, I hope you understand that hard-coding any immediate constants, like your 100, @"SOAP:Action", and so on, is not good; it's much better to use explicitly declared constants (and even resources, data files, etc., in other cases).
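For instance, with purely illustrative names taken from the literals you showed:

// declared once, instead of magic values scattered through the code:
const int MaxRequestCount = 100;               // was the literal 100
const string SoapActionHeader = "SOAP:Action"; // was @"SOAP:Action"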
—SA