Re: Some thoughts on the .net CLR/CLI
kalberts, 15-May-19 3:30
You will never have true realtime. There are always unexpected delays, from cache misses to virtual memory paging operations. The systems that got closest to RT are the old ones. The first machine I laid my hands on was a 1976-design 16-bit mini (i.e. the class of the PDP-11): it had 16 hardware interrupt levels - that is, 16 complete sets of hardware registers - so when an interrupt occurred at a higher level than the one currently executing, the register bank was switched in and the first instruction executed 900 ns after the interrupt signal was received. That was certainly impressive in 1976, but the CPU was hard logic throughout, with no nasty pipelines to be emptied, no microcode loops that had to terminate, and usually no paging or virtual memory. That got you close to RT. You won't see anything that compares today, even without GC.

In those days, people got shivers from the thought of writing any system software in a high level language. Lots of people shook their heads in disbelief over Unix being written in K&R C: it could never work, could never give good enough performance. I have saved a printout of a long discussion from around 1995, when NetNews was The Social Media: one guy insisted, at great length, that high level languages were nothing but a fad that would soon go away; they would never give high enough performance. (In 1995 he didn't get much support from the community, but he never gave in to the pressure.)

Using a high level language takes you one step away from RT. Using a machine with virtual memory is another step (even if your program is fixed in memory - the OS may be busy managing memory for other processes, with interrupts disabled). Dependency on pipelines and cache hit rates is yet another step. If you require really hard RT, you must stay away from such facilities - probably from any modern CISC CPU.

You probably do not have hard RT requirements; you can live with a cache miss, or with the OS updating page tables for other processes. You should design your code to handle random delays that are as large as possible. Networking people know the techniques, like window mechanisms and elastic buffers. If you do things the right way, I am not willing to believe that a modern multi-GHz six-core CPU's ability to run a VST plugin sufficiently close to RT is ruined by the CLR GC!
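
For example, an elastic buffer between the producer and the consumer lets an occasional long delay on either side be absorbed by the slack instead of causing a dropout. A minimal single-producer/single-consumer sketch in C# (all names are illustrative, not taken from any particular audio API):

using System;
using System.Threading;

// Preallocated ring of audio blocks; the producer tries to stay a few
// blocks ahead of the consumer, so a late producer does not immediately
// cause an underrun on the consumer side.
sealed class ElasticBuffer
{
    private readonly float[][] _blocks;   // allocated once, no GC pressure in the hot path
    private long _writeCount;             // total blocks written (only the producer updates this)
    private long _readCount;              // total blocks read (only the consumer updates this)

    public ElasticBuffer(int blockCount, int blockSize)
    {
        _blocks = new float[blockCount][];
        for (int i = 0; i < blockCount; i++)
            _blocks[i] = new float[blockSize];
    }

    public bool TryWrite(ReadOnlySpan<float> samples)
    {
        long write = Interlocked.Read(ref _writeCount);
        long read = Interlocked.Read(ref _readCount);
        if (write - read >= _blocks.Length)
            return false;                                  // full: caller decides whether to wait or drop
        samples.CopyTo(_blocks[(int)(write % _blocks.Length)]);
        Interlocked.Increment(ref _writeCount);
        return true;
    }

    public bool TryRead(Span<float> destination)
    {
        long read = Interlocked.Read(ref _readCount);
        long write = Interlocked.Read(ref _writeCount);
        if (read >= write)
            return false;                                  // underrun: caller plays silence for one block
        _blocks[(int)(read % _blocks.Length)].AsSpan().CopyTo(destination);
        Interlocked.Increment(ref _readCount);
        return true;
    }
}

The deeper the ring, the larger the delay it can hide - at the price of latency, which is exactly the window/elastic-buffer trade-off the networking people make.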

I grew up with high level languages, but knowing that RT "had to" be done in assembly. Soon, though, I learned how smart optimizing compilers can be, and gave up my belief in assembly. For much longer I thought that I would have to manage the heap myself if performance was essential. Then I switched to C#, where that isn't an option, and came across a fairly good description of the CLR GC (in "CLR via C#"). Again and again I said to myself: Hey, that's a neat trick! Or: I never thought of that in my own heap management! ... Twenty years earlier I had concluded that a compiler is far smarter than I am at generating code; now I realized that the CLR is far smarter than I am at managing memory. And I have never had any performance problems in C#.

You sure can manage memory yourself, even in C#. As a graduate student I was a TA helping freshmen through their first programming course. The "hard" engineering disciplines still used Fortran, the rest had switched to Pascal. Exercises were common to both languages, except for one: the Pascal version was a linked list problem. One EE student was offended: why couldn't she learn about this pointer stuff, like all the others? I gave her an introduction to pointers ... and the next week she returned with a solution to the linked list problem, written in Fortran, with the heap as a big 2D array and pointers being integer indexes of the next object (i.e. array row) in the list.

You can do something similar: at program startup, allocate a large byte or integer array; cast any slice of the array into a class when referencing an object, and cast from an arbitrary object back to an array slice when allocating or modifying one. It is as if you were writing your own memory manager in assembly language (except that there you wouldn't have to do all that casting). I am sure you could find a lot of smart strategies in the description of the CLR memory management :)
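
A tamer C# variant of the same idea - a pool allocated once at startup, with integer indexes playing the role of pointers, so the GC never deals with individual nodes - might look roughly like this (type and member names are made up for the example):

using System;

// One preallocated array of nodes; "pointers" are just indexes into it,
// and a free list is threaded through the same array - essentially the
// Fortran student's solution, dressed up in C#.
struct Node
{
    public double Value;
    public int Next;            // index of the next node, or -1 for "null"
}

sealed class NodePool
{
    private readonly Node[] _nodes;
    private int _freeHead;      // head of the free list

    public NodePool(int capacity)
    {
        _nodes = new Node[capacity];
        for (int i = 0; i < capacity - 1; i++)
            _nodes[i].Next = i + 1;             // chain every slot onto the free list
        _nodes[capacity - 1].Next = -1;
        _freeHead = 0;
    }

    public int Allocate(double value)
    {
        if (_freeHead == -1)
            throw new InvalidOperationException("pool exhausted");
        int index = _freeHead;
        _freeHead = _nodes[index].Next;         // pop a slot from the free list
        _nodes[index].Value = value;
        _nodes[index].Next = -1;
        return index;                           // this int is the "pointer"
    }

    public void Free(int index)
    {
        _nodes[index].Next = _freeHead;         // push the slot back onto the free list
        _freeHead = index;
    }

    public ref Node this[int index] => ref _nodes[index];
}

Building a list is then just a matter of chaining Next indexes, and "freeing" a node costs two assignments and never wakes up the GC.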

To be a little more serious: I would very much like to see a C++ solution and a C# solution side by side, everything being identical with the exception of the heap management, and then a demonstration that the C# solution has intolerable hiccups every time the GC makes a round of cleanup. I doubt that I will ever have that experience.
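
Lacking that side-by-side comparison, a crude way to at least look for the hiccups on the C# side is to time every iteration of a dummy processing loop while deliberately producing garbage, and record the worst case. A rough sketch, not a rigorous benchmark; the sizes and counts are arbitrary:

using System;
using System.Diagnostics;

class GcHiccupProbe
{
    static void Main()
    {
        var block = new float[512];            // stand-in for one audio block
        double worstMs = 0;
        var sw = new Stopwatch();

        for (int i = 0; i < 100_000; i++)
        {
            sw.Restart();

            // The "work": cheap and allocation-free per block.
            for (int j = 0; j < block.Length; j++)
                block[j] = block[j] * 0.5f + 0.1f;

            // Deliberate garbage, to provoke collections during the run.
            var junk = new byte[1024];
            junk[0] = (byte)i;
            GC.KeepAlive(junk);

            sw.Stop();
            if (sw.Elapsed.TotalMilliseconds > worstMs)
                worstMs = sw.Elapsed.TotalMilliseconds;
        }

        Console.WriteLine($"Worst iteration: {worstMs:F3} ms, gen0 collections: {GC.CollectionCount(0)}");
    }
}

If the worst iteration stays well under the length of one audio buffer, the GC is not the thing ruining the plugin.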

What would probably happen is that in the rewriting from C# to C++, experience from the C# work lets you write better, more efficient code in C++ - improvements that could have been carried back to the C# version, but aren't. So there are far more differences between the two alternatives than just the heap management.

One example showing this effect: when IBM developed its first RISC chip, the 801, they presented a conference paper on the speed increase made possible by the 801's large register file, which required new compiler optimization techniques. Then, more or less as a side remark, the presenter mentioned that they had carried those optimization methods back to the compiler for the 360/370 architecture and gained a 30% speed increase ... without any extended register file. But that was only an informal experiment, so for the rest of the presentation all performance gains on the RISC were ascribed to the RISC architecture, including the 30% that could be obtained on a quite different CISC architecture.

So if a C++ implementation with "handwritten" heap management runs more smoothly than a C# one with automatic heap management, I would suspect that it has more to do with "new optimization techniques" until I have compared the two, line by line, as well as the compiler options.