
Thread Basics and the CLR's Thread Pool

8 Jan 2009
An article meant to describe threading basics and the logic behind a thread pool.

Introduction

One of the problems Microsoft engineers found with how the Windows operating system accounts for threads of execution is that, while microprocessor speeds have increased dramatically, the same default clock interrupt interval remained in use. This distorted the time accounting for threads and how much CPU they actually consume. The clock interrupt timer fires every 15 ms, and whichever thread happens to be running at that moment is charged for the entire 15 ms interval. In reality, one thread may have been awakened to do some work and sent back to sleep, a second thread awakened and put back to sleep, and a third thread may have only just begun to execute when the timer fired — yet that third thread is charged for the full 15 milliseconds. The kernel changes in Windows Vista SP1 seek to rectify this lack of accounting for threads that “fly under the radar”.

While the success of an application may depend on it being broken into multithreaded parts, creating and destroying a thread is expensive. Having lots of threads wastes memory, and hurts performance because the operating system has to schedule and context-switch between runnable threads. To improve this situation, the CLR contains code to manage its own thread pool. While many applications use multiple threads, those threads often spend much of their time sleeping, waiting for an event to occur. Other threads might enter a sleeping state and be awakened only periodically to poll for a change or update status information before going back to sleep. Thread pooling provides your application with a pool of worker threads that are managed by the system, which lets you concentrate on application tasks rather than thread management. In fact, if you have a number of short tasks that require more than one thread, using the ThreadPool class is the most effective way to use multiple threads. One thread monitors the status of several wait operations queued to the thread pool; when a wait operation completes, a worker thread from the pool executes the corresponding callback function.

You can also queue work items that are not related to a wait operation to the thread pool. To request that a work item be handled by a thread in the thread pool, call the QueueUserWorkItem method. This method takes as a parameter a reference to the method or delegate that will be called by the thread selected from the thread pool. There is no way to cancel a work item after it has been queued. Timer-queue timers and registered wait operations also use the thread pool; their callback functions are queued to the thread pool. This use of delegates and callbacks reflects a strength of the C# language, which inherits much of its design from C and C++. The following section will briefly touch on some basics before we put the ThreadPool class to work.
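The registered-wait mechanism mentioned above can be sketched in a few lines. This is a minimal example, not part of the original article: it registers a wait on an AutoResetEvent, and when the event is signaled, a thread-pool thread runs the callback.

```csharp
using System;
using System.Threading;

class WaitExample {
    static void Main() {
        // An event that some other part of the program will eventually signal.
        AutoResetEvent signal = new AutoResetEvent(false);

        // Ask the thread pool to watch the event; when it is signaled,
        // a pool thread runs the callback. The final 'true' means the
        // registration is used only once.
        RegisteredWaitHandle registration = ThreadPool.RegisterWaitForSingleObject(
            signal,
            (state, timedOut) => Console.WriteLine(
                timedOut ? "Timed out." : "Event signaled: " + state),
            "some state",          // passed to the callback as 'state'
            5000,                  // timeout in milliseconds
            true);                 // executeOnlyOnce

        signal.Set();              // simulate the event occurring
        Thread.Sleep(500);         // give the pool thread time to run
        registration.Unregister(null);
    }
}
```

Note that the callback runs on a thread chosen by the pool, not on the thread that called RegisterWaitForSingleObject.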

The Simplest Scenario: Creating a Thread

There are a number of steps to follow to create a new thread:

  1. Create a method that takes no arguments and returns no data.
  2. Create a new ThreadStart delegate, and specify the method in step 1.
  3. Create a new Thread object, specifying the ThreadStart object created in step 2.
  4. Call Start() to begin execution of the new thread.

using System;
using System.Threading;

class Test {
  static void Main() {
    ThreadStart operation = new ThreadStart(DoWork);
    Thread myThread = new Thread(operation);
    myThread.Start();
  }

  static void DoWork() {
    Console.WriteLine("Thread: {0}",
       Thread.CurrentThread.ManagedThreadId);
  }
}

When the Start method is called, the DoWork() method runs on a new thread, and the thread executes until the method returns. DoWork writes the word "Thread:" followed by the ManagedThreadId property of the executing thread. To create multiple threads, use a for loop (in this simple scenario): each iteration creates and starts a new thread:

using System;
using System.Threading;

class Test {
  static void Main() {
    ThreadStart operation = new ThreadStart(DoWork);
    for (int x = 1; x <= 5; ++x)
    {
      Thread myThread = new Thread(operation);
      myThread.Start();
    }
  }

  static void DoWork() {
    Console.WriteLine("Thread: {0}", 
       Thread.CurrentThread.ManagedThreadId);
  }
}
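The DoWork method above takes no arguments. To pass data to a new thread, the Thread constructor also accepts a ParameterizedThreadStart delegate, whose target method takes a single object parameter. A small sketch (the file name is just illustrative data):

```csharp
using System;
using System.Threading;

class ParamTest {
  static void Main() {
    // ParameterizedThreadStart lets the caller hand an object to the thread.
    Thread worker = new Thread(new ParameterizedThreadStart(DoWork));
    worker.Start("report.txt");   // the argument arrives as DoWork's parameter
    worker.Join();                // wait for the thread to finish
  }

  static void DoWork(object data) {
    Console.WriteLine("Processing: {0}", data);
  }
}
```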

The ThreadPool

The ThreadPool class is static; you never create an instance of it. Each process can have only one operating-system thread pool, and the CLR exposes it through the single ThreadPool class. The pool has a default limit of 25 threads per available processor, which a host can change using CorSetMaxThreads as defined in the mscoree.h file. Each thread uses the default stack size and runs at the default priority. When the CLR initializes, the thread pool has no threads in it. Internally, the thread pool maintains a queue of operation requests: when an application performs an asynchronous operation, you call some method that appends an entry to that queue. The pool's threads are created the first time you call ThreadPool.QueueUserWorkItem, or when a timer or registered wait operation queues a callback method. One thread monitors all tasks that have been queued to the thread pool; when a task has completed, a thread from the pool executes the corresponding callback method. There is no way to cancel a work item after it has been queued.

The number of operations that can be queued to the thread pool is limited only by the available memory; however, the thread pool will enforce a limit on the number of threads it allows to be active in the process simultaneously.

Each thread uses the default stack size, runs at the default priority, and is in the multithreaded apartment. If one of the threads becomes idle (as when waiting on an event) in managed code, the thread pool injects another worker thread to keep all the processors busy. If all thread pool threads are constantly busy, but there is pending work in the queue, the thread pool will, after some period of time, create another worker thread. However, the number of threads will never exceed the maximum value. The ThreadPool also switches to the correct AppDomain when executing thread pool callbacks.
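The pool's limits can be inspected from managed code. This short sketch (not from the original article) prints the maximum and currently available thread counts using ThreadPool.GetMaxThreads and ThreadPool.GetAvailableThreads:

```csharp
using System;
using System.Threading;

class PoolLimits {
  static void Main() {
    int maxWorkers, maxIo;
    // Maximum number of worker and asynchronous-I/O threads the pool allows.
    ThreadPool.GetMaxThreads(out maxWorkers, out maxIo);
    Console.WriteLine("Max worker threads: {0}, max I/O threads: {1}",
        maxWorkers, maxIo);

    int freeWorkers, freeIo;
    // How many of those threads are currently free to take new work.
    ThreadPool.GetAvailableThreads(out freeWorkers, out freeIo);
    Console.WriteLine("Available worker threads: {0}", freeWorkers);
  }
}
```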

There are several scenarios in which it is appropriate to create and manage your own threads instead of using the ThreadPool:

  • If you require a task to be prioritized.
  • If you have a task that might run a long time (and therefore block other tasks).
  • If you need to place threads into a single-threaded apartment (all ThreadPool threads are in the multithreaded apartment).
  • If you need to have a stable identity associated with the thread. For example, you might want to use a dedicated thread so that you can abort it, suspend it, or discover it by name.
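As an illustration of the points above, a dedicated thread can be given a name and a priority — things a pool thread cannot have — so it can later be found or waited on. A minimal sketch (the name "FileScanner" is hypothetical):

```csharp
using System;
using System.Threading;

class DedicatedThread {
  static void Main() {
    Thread worker = new Thread(LongRunningTask);
    worker.Name = "FileScanner";      // stable identity; pool threads cannot be named by you
    worker.IsBackground = false;      // foreground: keeps the process alive until it finishes
    worker.Priority = ThreadPriority.BelowNormal; // pool threads always run at default priority
    worker.Start();
    worker.Join();                    // the caller can wait on this specific thread
  }

  static void LongRunningTask() {
    Console.WriteLine("Running on dedicated thread: {0}",
        Thread.CurrentThread.Name);
  }
}
```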

This first example queues a very simple task, represented by the ThreadProc method, using QueueUserWorkItem:

using System;
using System.Threading;
public class Test {
    public static void Main() {
        // Queue the task.
        ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc));
        
        Console.WriteLine("Main thread does some work, then sleeps.");
        // If you comment out the Sleep, the main thread exits before
        // the thread pool task runs. The thread pool uses background
        // threads, which do not keep the application running. (This
        // is a simple example of a race condition.)
        Thread.Sleep(1000);

        Console.WriteLine("Main thread exits.");
    }

    // This thread procedure performs the task.
    static void ThreadProc(Object stateInfo) {
        // No state object was passed to QueueUserWorkItem, so 
        // stateInfo is null.
        Console.WriteLine("Hello from the thread pool.");
    }
}

One of the most common uses of the thread pool is to perform an asynchronous compute-bound operation. A compute-bound operation is one limited by computation rather than I/O — recalculating cells in a spreadsheet application, for instance, or spell-checking words in a word processor. Ideally, compute-bound operations will not perform any synchronous I/O operations, because a synchronous I/O operation “suspends” the calling thread while the underlying hardware does the work, and a thread that is suspended and not running still consumes system resources.
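A compute-bound operation can be queued with a state object, and the main thread can wait on an event rather than sleeping for a guessed interval as in the earlier example. A sketch (the summation is just stand-in compute-bound work):

```csharp
using System;
using System.Threading;

class ComputeBound {
  static void Main() {
    ManualResetEvent done = new ManualResetEvent(false);

    // Pass the upper bound as the state object; signal 'done' on completion.
    ThreadPool.QueueUserWorkItem(state => {
      int n = (int)state;
      long sum = 0;
      for (int i = 1; i <= n; i++) sum += i;   // purely compute-bound work
      Console.WriteLine("Sum of 1..{0} = {1}", n, sum);
      done.Set();
    }, 1000);

    done.WaitOne();   // deterministic wait instead of Thread.Sleep
    Console.WriteLine("Main thread exits.");
  }
}
```

Waiting on the event removes the race condition noted in the first QueueUserWorkItem example: the main thread cannot exit before the work item has finished.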

References

  • CLR via C# by Jeffrey Richter.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Author

logicchild
United States
I started electronics training at age 33. I began studying microprocessor technology in an RF communications oriented program. I am 43 years old now. I have studied C code and opcodes (mainly x86 and AT&T syntax) for around 3 years in order to learn how to recognize viral code and the use of procedural languages. I am currently learning C# and the other virtual runtime system languages. I guess I started with the egg rather than the chicken. My past work would indicate that my primary strength is in applied mathematics.
