I have A LOT of calculations to perform, so I wrote a quick program to do them for me.

For some reason I can't seem to use all of my CPU; it seems like the system is throttling it. I wrote a simple application to demonstrate this behavior. See the code below:

C#
Parallel.For(0, 10000, x =>
{
    int i = 0;
    while (true)
    {
        i++;
    }
});


Running the above code in a console app, it maxes out all of my processors and my CPU utilization is at 100% (as I would expect).

In my calculation I am creating several Lists and arrays, so I added that to the above code, and that is when I saw it behave just as my actual program does (using less than 100% of the CPU). See the example code below:

C#
Parallel.For(0, 10000, x =>
{
    int i = 0;
    while (true)
    {
        i++;
        List<int> vs = new List<int>();
    }
});


On my computer, which has 12 logical processors, the CPU utilization only reaches 70-80%. When running on a higher-performance PC I have, which has 32 logical processors, the behavior is even more dramatic - the highest the CPU utilization goes is 50%. And my real application that is performing the calculations only uses 20% of the total CPUs.

And as a follow-up, in Task Manager I see that it is using all logical cores more or less evenly - it isn't maxing out some while others sit idle.

I am curious why this is happening, and more importantly, how can I make my program use all of the processor?

What I have tried:

I tried monitoring garbage collection, but that didn't offer any clues (and it also wouldn't make sense, because I would expect garbage collection to show up as CPU usage).
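For reference, a minimal sketch of one way to watch collection counts while the work runs (RunCalculations() is a hypothetical stand-in for the real workload):

C#
using System;
using System.Threading;
using System.Threading.Tasks;

class GcWatch
{
    static void Main()
    {
        // Print per-generation collection counts once a second while the
        // calculation runs on the thread pool.
        Task worker = Task.Run(() => RunCalculations());
        while (!worker.IsCompleted)
        {
            Console.WriteLine(
                $"gen0={GC.CollectionCount(0)}  gen1={GC.CollectionCount(1)}  gen2={GC.CollectionCount(2)}");
            Thread.Sleep(1000);
        }
    }

    // Hypothetical stand-in for the real calculation loop.
    static void RunCalculations() { /* ... */ }
}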

I tried increasing the process priority from normal to "High" and even "Real-Time" but that didn't change anything.
Posted; updated 12-Nov-21 10:36am
Comments
[no name] 12-Nov-21 13:10pm    
Maybe it's a problem with the "performance monitor"; how is anyone supposed to know? Or care?
MrGlass3 12-Nov-21 13:15pm    
When I run it with the Visual Studio debugger attached, the CPU utilization matches what is shown in Task Manager.
And I care because I would much prefer to use near 100% of my CPU instead of 20%. It is literally the difference between the calculations taking 5 days vs over a month to complete.
PIEBALDconsult 12-Nov-21 13:13pm    
Allocating RAM is not CPU-intensive maybe?
Avoid the allocations if possible.
Use allocate-and-clear instead.
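For example, a minimal sketch of the demo loop with the list hoisted out and cleared instead of re-created each pass:

C#
Parallel.For(0, 10000, x =>
{
    // One list per task body, created once and reused;
    // Clear() keeps the backing array, so the hot loop
    // no longer allocates on every pass.
    List<int> vs = new List<int>();
    int i = 0;
    while (true)
    {
        i++;
        vs.Clear();
    }
});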
MrGlass3 12-Nov-21 13:21pm    
Something relating to memory allocation is all I could come up with (which seems wrong... I had always thought allocation was ignorably fast?). But I would have thought that would be a bottleneck for the system as a whole, if that were the case. Yet I can launch multiple instances of this application and it eventually begins to use 100% of the CPU. If allocation were a bottleneck, I would expect multiple instances to slow down the already-running processes, because they would have to wait their turn to allocate memory.
PIEBALDconsult 12-Nov-21 13:32pm    
It depends on several factors, of course.
Allocation is definitely not "ignorably fast".
And the larger the space required, the more effort is required to do it. Allocating, throwing it all away, and then allocating again is very wasteful.
What size Lists are we talking about? A List starts with no spare capacity and then doubles its allocation as required, so just knowing what capacity you need up front can save a lot of time and effort. But definitely try not to throw away a List (or similar, including StringBuilder) just to make another. Empty it and reuse it, as in the sketch below.
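A quick sketch of both points (expectedCapacity, passCount, and DoOnePass are hypothetical stand-ins):

C#
// Pre-size once to the capacity you expect, then empty and reuse
// instead of throwing the list away each pass.
List<int> vs = new List<int>(expectedCapacity);
for (int pass = 0; pass < passCount; pass++)
{
    vs.Clear();        // capacity is retained; no reallocation
    DoOnePass(vs);     // stand-in for the real per-pass work
}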

1 solution

OK, so you want to see your CPU beg for mercy using a Parallel.For?

Here you go:
C#
public void TestMethod()
{
    ParallelLoopResult result = Parallel.For(0, 50, x =>
    {
        // One Random per task; note that on .NET Framework, instances
        // created in quick succession can share a time-based seed.
        Random random = new Random();
        double sum = 1.0;   // renamed from "result", which would shadow the outer variable

        // Pure CPU work with no heap allocations, so it can saturate every core.
        for (int i = 1; i < 1000000000; i++)
        {
            sum += random.NextDouble();
        }

        Console.WriteLine(sum);
    });
}
 
Comments
MrGlass3 12-Nov-21 16:45pm    
I don't want to use 100% of the processor just to see it scream. I have millions of calculations to perform and want them to run as fast as possible. Apparently the CPU is sitting idle 80% of the time, and I want it to run at or close to 100% so the work finishes in days instead of weeks.

The example code I wrote was just for a very simplistic demonstration of what I am observing.
Dave Kreskowiak 12-Nov-21 16:49pm    
OK, so what you are seeing is the CPU time of YOUR CODE, not the time where execution has wandered into the .NET CLR or the Windows kernel. If you're doing a lot of memory alloc/free, that CPU time isn't going to show up in the CPU usage of your process.
MrGlass3 12-Nov-21 16:52pm    
Whether it is my code or some underlying Windows process related to memory allocation, wouldn't it show up in Task Manager's CPU usage? I'm not just looking at the CPU usage of my process; I'm looking at the CPU usage of the whole system.

And the other thing I can't make sense of: I can launch a second instance of the application and both run just as fast as if only one were running, so the second instance isn't slowing the first one down at all.
Dave Kreskowiak 12-Nov-21 16:59pm    
No. At the bottom of the Performance tab in the Task Manager window, click the Open Resource Monitor link.

Click on the CPU line in the window that pops up. On the right at the top, you'll see the CPU usage graph with two lines in it. The blue graph is user-mode CPU usage, while the green one is kernel.

Your code will run as fast as Windows will allow it to. Windows is a shared system, so your code will not be allowed to hog the entire CPU. Windows has thousands of threads running, and your code will only account for a few of them.


MrGlass3 12-Nov-21 17:53pm    
Interesting. Learning something new here!

Still confused, though: when I do that, the blue graph (Maximum Frequency) is at 100% (actually it reads 150%) when I'm not running the program. Then when I launch it, the Maximum Frequency drops to 135% and CPU usage goes to 20%. Then I launch a second instance, and the frequency drops to 115% while the CPU usage goes to 40%. Does that behavior make sense to you?

And clearly the machine is capable of doing more than it is doing (since I can launch a second instance without slowing the first down). So is there a way I can configure or run my application so that Windows allows it more CPU time/resources?
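One avenue worth checking in a case like this: on .NET Framework, the default workstation GC can serialize allocation-heavy parallel loops, while server GC gives each core its own allocation context and tends to scale much better. A minimal sketch to check which collector the process is actually using (server GC is opted into via <gcServer enabled="true"/> in app.config on .NET Framework, or <ServerGarbageCollection>true</ServerGarbageCollection> in the project file on .NET Core and later):

C#
using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // Reports which collector this process is using; with workstation GC
        // and heavy multi-threaded allocation, throttled CPU usage like the
        // above is a plausible symptom.
        Console.WriteLine($"Server GC:    {GCSettings.IsServerGC}");
        Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
    }
}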
