|
How global variables were handled, and the scope those lived in.
|
With respect to the errors you can make by not freeing memory, they are the same, which was the part of the language we were discussing.
|
Super Lloyd wrote: But I am under the impression that the memory problem in C++ and C are related to the same problem...
That is correct. Both allow the programmer to control the scope in which an allocation lives, so if the programmer fails to control that, an actual leak is produced.
What's more, both also give programmers explicit write control over memory. That makes more serious problems possible; those are the ones that generally lead to security problems, whereas memory leaks (in the strict definition) most often just lead to business functionality problems.
Other languages like C# and Java can also have memory leaks (in the strict definition), but they are less likely to occur.
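For the strict definition, a minimal C++ sketch of such a leak (the function and the loop are hypothetical, just to make the pile-up visible):

#include <cstring>

// A leak in the strict sense: the only pointer to the allocation goes out
// of scope, so the memory can never be freed for the rest of the process.
void copy_and_forget(const char* text)
{
    char* copy = new char[std::strlen(text) + 1];
    std::strcpy(copy, text);
    // ... use copy ...
}   // 'copy' is lost here; nobody can ever call delete[] on it

int main()
{
    for (int i = 0; i < 1000; ++i)
        copy_and_forget("hello");   // leaks a few bytes per call, forever
}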
|
With C# and .NET I have seen way too many cases of idiotic memory leaks, just because the developers had no concept of an object lifecycle at all. I have also seen an entire team of Java guys sitting around a server that needed a reset every day because their opus hogged memory. If anything at all, the Java Hobbits are even more religious about not managing your memory. And what could all those senior devs, architects and project leads do about the memory leak? They beat the garbage collection into submission and fixed the symptom - for a while.
Garbage collection and all sorts of automated memory management can't replace a proper architecture and object lifecycles. At best it makes routine work a little easier and offers a safety net. At the same time it makes the developers ignorant and careless. The Java guys have made a religion out of it.
As long as I use C++, I simply put debug assertions into the code that monitor what is going on on the heap and fire when there seems to be a leak. After that it's only a little detective work to identify the type of objects that are piling up and then take a good look at the lifecycle of these objects.
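A minimal sketch of what such heap monitoring could look like, assuming a single-threaded debug build (the counter and the assert_no_leaks() helper are hypothetical names, not a specific library):

#include <cassert>
#include <cstdlib>
#include <new>

// Count every allocation and deallocation so a debug assertion can fire
// when the balance drifts. Not thread-safe; a debug-build sketch only.
static long g_liveAllocations = 0;

void* operator new(std::size_t size)
{
    ++g_liveAllocations;
    if (void* p = std::malloc(size)) return p;
    throw std::bad_alloc();
}

void operator delete(void* p) noexcept
{
    if (p) { --g_liveAllocations; std::free(p); }
}

// Call at a point where all transient objects should have been released,
// e.g. at the end of a request or a frame.
void assert_no_leaks(long expectedLive)
{
    assert(g_liveAllocations == expectedLive && "possible heap leak");
}

int main()
{
    long baseline = g_liveAllocations;
    { int* p = new int(42); delete p; }
    assert_no_leaks(baseline);        // balanced: the assertion stays quiet
    new int(42);                      // deliberately leaked
    // assert_no_leaks(baseline);     // this would now fire in a debug build
}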
It's actually not so hard and you don't need any fancy features. The only thing you need is to invest a little more thought and give up your 'right' to write careless code. Then again, a little more thinking, discipline and a little less cowboy coding would also solve many other problems.
|
CodeWraith wrote: a little more thinking, discipline and a little less cowboy coding
and we wouldn't have needed C++!
Properly written C is easy as hell to read and incredibly efficient regarding performance and memory usage. There is a reason the Windows kernel is written in it.
|
Then again, very disciplined C code can already look very object oriented. Think of structs and a number of functions to construct, destruct and change them. The functions are not members of the struct, though that could be approximated with function pointers in the struct. The 'constructor' and 'destructor' would not be called automatically, and there would be no proper encapsulation, inheritance or polymorphism.
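A minimal sketch of that style, with hypothetical counter_* names; it compiles as C++, but it is really just disciplined C:

#include <cstdio>
#include <cstdlib>

// C-style 'object': data plus function pointers acting as methods.
// Construction and destruction are explicit calls; nothing is automatic.
struct Counter {
    int value;
    void (*increment)(struct Counter* self);
    void (*print)(const struct Counter* self);
};

static void counter_increment(struct Counter* self) { ++self->value; }
static void counter_print(const struct Counter* self) { std::printf("%d\n", self->value); }

// The 'constructor': wires up the function pointers by hand.
struct Counter* counter_create(void)
{
    struct Counter* c = (struct Counter*)std::malloc(sizeof *c);
    c->value = 0;
    c->increment = counter_increment;
    c->print = counter_print;
    return c;
}

// The 'destructor': must be called explicitly, or the object leaks.
void counter_destroy(struct Counter* c) { std::free(c); }

int main()
{
    struct Counter* c = counter_create();
    c->increment(c);   // 'this' must be passed by hand
    c->print(c);       // prints 1
    counter_destroy(c);
}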
Still, that's essentially how C became C++. With this transition, the scope of the programs changed as well. On a small, limited system I prefer to use C because of the greater degree of control over the spartan resources. On a traditional 8 bit computer even that may still be too much: an 8 bit bus, only a few KB of memory and C are not a match made in heaven.
C++ allows me to go in the other direction. Object orientation is very much a divide and conquer strategy that allows you to tackle huge tasks, and resources, like memory, are not so limited. Still, C++ retains much of C's control over these things if you need it.
|
Indeed.
C++ though, because of its complexity, allows bad programmers, and there are plenty, as we know, to write some truly appalling code.
It also allows good programmers to show off by writing unnecessarily complicated code. Re the 'my manager has asked us to dumb down our code' thread a few weeks back. IMO that is a good thing to do. KISS is always my motto.
|
Trust me, there was no shortage of bad programmers before C++... and even before C. In my experience, bad programmers dominated the industry back when I started in the mid-70s, and still do. I thank them all. They make me look so much better by comparison.
The only real change I've seen is that more capable tools/languages increase the illusion of competency for bad programmers. Way back when, they (mostly) used to know they were bad. They would hide in their cubicles, hoping no one noticed.
Regrettably, now the illusion has gotten so good that they actually think they're competent. Since they "know everything", they become arrogant and refuse to learn even basic skills. Now, they opine, at length, during meetings and write books sharing their "knowledge" of the "right way" to do things...yuck!
I had one contractor argue that iterating a list was faster than a hash table until you reached millions of elements. At the time, I was his manager. I patiently explained that the crossover was generally quite a bit lower. He remained unconvinced. Another teammate, independently, attempted a similar explanation. No success for either of us.
He was not dissuaded even after I wrote a simple program demonstrating how wrong he was...for his specific use case. I'm certain, elsewhere, he continues to hold this same belief today.
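Not the original program, but a rough sketch of that kind of demonstration in C++ (the container sizes and the time_us() helper are my own choices):

#include <chrono>
#include <cstdio>
#include <unordered_set>
#include <vector>

// Time a callable in microseconds.
template <typename F>
static long long time_us(F f)
{
    auto t0 = std::chrono::steady_clock::now();
    f();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
}

int main()
{
    // Linear search in a vector vs. lookup in a hash set. The crossover
    // shows up at small element counts, nowhere near millions.
    for (std::size_t n : {10, 100, 1000, 10000}) {
        std::vector<int> vec;
        std::unordered_set<int> set;
        for (std::size_t i = 0; i < n; ++i) { vec.push_back((int)i); set.insert((int)i); }

        volatile std::size_t hits = 0;   // volatile so the loops aren't optimized away
        long long linear = time_us([&] {
            for (std::size_t i = 0; i < n; ++i)
                for (int v : vec) if (v == (int)i) { hits = hits + 1; break; }
        });
        long long hashed = time_us([&] {
            for (std::size_t i = 0; i < n; ++i)
                if (set.count((int)i)) hits = hits + 1;
        });
        std::printf("n=%5zu  linear=%8lld us  hash=%8lld us\n", n, linear, hashed);
    }
}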
I miss the days when bad programmers were self-aware and hid. They did so much less damage back then, between the time they were hired and finally discovered.
|
When I read your message, I get the feeling that you have no idea what the causes of memory leaks in C# are... For one thing, it has little to do with "object lifecycle".
FYI, unless you do interop, which is less and less common these days, the only reason for a leak in C# is if your object is referenced by a static variable or (though that is the same thing in disguise) a static event. Or maybe too many live threads which don't die (and all the memory they hold); that could typically come from a wait handle hanging forever...
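To translate that into C++ terms for this thread: the same shape exists there as a static container that keeps references alive for the life of the process. A minimal sketch (the Subscriber type and g_handlers list are hypothetical):

#include <functional>
#include <memory>
#include <vector>

// An object with a noticeable payload, so the pile-up is visible.
struct Subscriber {
    std::vector<char> buffer = std::vector<char>(1024 * 1024);  // 1 MB payload
};

// The analog of a static field / static event: handlers registered here
// are never removed, so they live until the process exits.
static std::vector<std::function<void()>> g_handlers;

void subscribe(std::shared_ptr<Subscriber> s)
{
    // The lambda captures the shared_ptr, so the Subscriber can never be
    // freed as long as the handler list holds it.
    g_handlers.push_back([s] { /* handle event */ });
}

int main()
{
    for (int i = 0; i < 100; ++i)
        subscribe(std::make_shared<Subscriber>());   // ~100 MB pinned until exit
}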
|
Super Lloyd wrote: For one thing, it has little to do with "object lifecycle" How could it possibly? Just fire and forget; do anything with an object as you like. Why waste a thought on where an object comes from or what becomes of it? The garbage collector takes care of everything. Until it does not.
Super Lloyd wrote: the only reason for a leak in C# is if your object is referenced by a static variable or (though that is the same thing in disguise) a static event. If only that were true. Real hacks are excellent at combining things that are not problematic by themselves in a way that leads to maximum chaos.
Would you, for example, see anything wrong with storing controls that have been added dynamically to an ASP.NET page in the session instead of simply constructing new ones when Page.Load() is called? The irony is that it is the misguided attempt to take control of the controls' lifecycles that leads to the memory leak, not fire and forget.
|
Super Lloyd wrote: For one thing, it has little to do with "object lifecycle"
Perhaps we have a different definition of that term. I have seen problems in both Java and C# where the developers failed to properly account for how an object 'ended'. They had no problem with when the object 'started' but they never considered how to specifically control how it 'ended'.
Matter of fact I suspect that I have seen code that might have that sort of problem just in the past week.
Super Lloyd wrote: FYI, unless you do interop, which is less and less common these days, the only reason for a leak in C# is if your object is referenced by a static variable or (though that is the same thing in disguise) a static event. Or maybe too many live threads which don't die (and all the memory they hold); that could typically come from a wait handle hanging forever...
Probably been at least 15 years since I worked in a system where at least some threads did not die. Often quite a few of them. At least in Java the server frameworks build that in extensively. Most problems I have seen have been because the developers did not even seem to be aware that they were working in a threaded environment.
Java and C# both have complex thread pool support and the concept has existed since before C# existed. Might have existed before Java but I wasn't introduced to it until after I first used Java. For context I started with Java 1.1.4 (well before 1.4.)
|
ThreadPools have been around since long before Java's inventor was a gleam in his grandfather's eye.
|
CodeWraith wrote: I have also seen an entire team of Java guys sitting around a server that needed a reset every day because their opus hogged memory.
I was told by a number of Unix admins that when C/C++ servers were the norm, that was the standard operating procedure for any large enterprise. It was something that was planned for when putting the process together for a new server application.
And it wasn't just for memory. File descriptors and sockets also needed to be cleaned up.
|
I seem to remember that this was a standard procedure for the first Windows servers (not daily, but about weekly). That was not because of the applications/services, but because Windows itself was the problem at that time.
|
That's SOP for any data center. Mostly because of the OS, not the applications.
|
I reckon @CodeWraith answered it quite well, to rephrase it another way:
C, C++, C#, <name your favorite language> does not have memory leaks,
poorly written applications (and libraries) do.
So, do the latest features of C++ solve the issues?
No. ...please re-read the statement in bold above.
This internet thing is amazing! Letting people use it: worst idea ever!
|
To be honest, I like the "feature" that I as the programmer am responsible for each and every resource and therefore need to pay attention to it. This is a clear, straightforward situation, and if I'm not able to take care of the resource management, then I'd better not program.
Versus C#, where there are hidden traps... I need to check, e.g., whether something is IDisposable or not, and so on. Case by case, not straightforward.
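For contrast, a minimal RAII sketch of the C++ side, where cleanup is tied to scope instead of to remembering which types need disposing (the File wrapper and file name are hypothetical):

#include <cstdio>

// The resource is released in the destructor, so cleanup is deterministic
// and cannot be forgotten at the call site.
class File {
public:
    explicit File(const char* path) : f_(std::fopen(path, "r")) {}
    ~File() { if (f_) std::fclose(f_); }        // always runs at scope exit
    File(const File&) = delete;                 // no accidental double-close
    File& operator=(const File&) = delete;
    bool ok() const { return f_ != nullptr; }
private:
    std::FILE* f_;
};

int main()
{
    File f("data.txt");    // hypothetical file name
    if (f.ok()) { /* read from it */ }
}   // fclose happens here, whether we return early or not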
Only my 2 cents.
|
Yep, the combo of std::unique_ptr, std::shared_ptr and std::weak_ptr is very good. I much prefer to use these over the IDisposable pattern, but I spend 80% of my time doing C++ and 20% doing C#.
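A minimal sketch of how the three play together, including the weak_ptr trick for breaking ownership cycles (the Node type is hypothetical):

#include <memory>

struct Node {
    std::weak_ptr<Node> parent;        // weak: observing, does not own
    std::shared_ptr<Node> child;       // shared: owning
};

int main()
{
    // unique_ptr: single owner, freed automatically at scope exit.
    auto buffer = std::make_unique<int[]>(1024);

    // shared_ptr + weak_ptr: the weak back-pointer breaks what would
    // otherwise be a reference cycle (parent <-> child never freed).
    auto parent = std::make_shared<Node>();
    parent->child = std::make_shared<Node>();
    parent->child->parent = parent;

    if (auto p = parent->child->parent.lock())   // promote to shared_ptr to use it
    { /* parent is still alive here */ }
}   // both nodes are destroyed here; with two shared_ptrs they would leak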
|
I was reading this article about how terrible JavaScript is:
The JavaScript phenomenon is a mass psychosis – Hacker Noon[^]
It got me thinking that the only reason it stays around, to say nothing of its utter ubiquity nowadays, is that everyone uses it, and so everyone continues to use it because it would be too much of a pain to stop using it - which seems to me to be the definition of "technical debt".
|
Over time, I'm curious to see what sort of adoption rate this gets. With Node.js, utilization of JavaScript seemed to be spreading instead of dying out (like it should).
The trend lately seems to be towards non-type-safe languages (like JavaScript and Python).
I'm hoping for a reversal in trend. Maybe WASM will help.
|
In 1995, Netscape Communications recruited Brendan Eich with the goal of embedding the Scheme programming language into its Netscape Navigator. Before he could get started, Netscape Communications collaborated with Sun Microsystems to include in Netscape Navigator Sun's more static programming language Java, in order to compete with Microsoft for user adoption of Web technologies and platforms. Netscape Communications then decided that the scripting language they wanted to create would complement Java and should have a similar syntax, which excluded adopting other languages such as Perl, Python, TCL, or Scheme. To defend the idea of JavaScript against competing proposals, the company needed a prototype. Eich wrote one in 10 days, in May 1995.
Caveat Emptor.
"Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
|
The article complains about JS, then mentions Java in the list of better languages... Not better by much, if at all. But even though TS is a temporary band-aid over the language in my opinion, framework hell isn't going anywhere.
|