|
Sander Rossel wrote: The problem here is often that the author of the code has no clue about licenses or copyright either.
They'll create a repository on GitHub and assume it's now open source.
That however doesn't mean it isn't your problem. As I noted, if you get it in writing that the author says you can use it, then that would be sufficient.
Sander Rossel wrote: But how would they know that in the first place?
Disgruntled employee. Random overheard comment. A partner company notices and reports it. You sell a product and someone finds the code in it. Due diligence for a sale, where the buyer requires that it be settled before the sale can proceed. There are probably others.
Sander Rossel wrote: And even if there is an audit, are they going to check all of our repositories, all with thousands of dependencies?
I know they have automated tools that can check for problems. I suspect they do manual audits as well. This is similar to security audits, and since your company is paying for it, it doesn't matter to them what it costs.
Sander Rossel wrote: And what happens ...
Yes, those are problematic scenarios. And they have come up in the past, where a widely used library had to pull or replace functionality because it was doing something incorrectly before.
|
I should say that I had "early retired" back when auto_ptr was the new k3wl thing. I understand that since then, other features have been added that address issues auto_ptr did not completely fix. I wonder if the state of the language is such that there are no more issues with memory leaks - which would indicate to me that no one need bother calling delete anymore, basically making C++ like its managed cousin C#.
Or am I missing something?
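(For what it's worth, a minimal sketch of what replaced auto_ptr - nothing here beyond the standard library; unique_ptr is C++11, make_unique is C++14. There is indeed no delete left to write, though C++ still isn't garbage collected:)

```cpp
#include <cstdio>
#include <memory>

struct Widget {
    ~Widget() { std::puts("Widget destroyed"); }
};

int main() {
    // Sole ownership: the Widget is deleted automatically at scope exit.
    auto w = std::make_unique<Widget>();

    // Shared ownership: deleted when the last shared_ptr is gone.
    auto s1 = std::make_shared<Widget>();
    auto s2 = s1;

    // No explicit delete anywhere. But unlike C#, leaks are still possible,
    // e.g. a cycle of shared_ptrs is never reclaimed.
}
```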
|
It is funny, actually, because it is very simple to avoid memory leaks in C. You just free the memory.
Along comes an *improvement* calling itself C++, which then proceeds to dig itself into a hole, many times, adding layers of complexity, even more tortuous code, static classes which look like namespaces, unnamed namespaces where static functions used to rule, the magic bullet of exception handling!!!!, ah, but now we are leaking memory, ahaha, smart pointers!!!!!, oh, that didn't work out quite well, kill auto_ptr!!!!!!
And what's the latest attempt to put a bandage on the bandage on the bandage?
There is a reason a screwdriver looks like a screwdriver, and always has done, for centuries. It does the job!
Can we leave our programming language tools alone? No chance. Too many nerds thinking they can be *improved*.
|
Munchies_Matt wrote: Can we leave our programming language tools alone? No chance. Too many nerds thinking they can be *improved*.
No nerd is going to touch my hex keyboard and the processor's machine code.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
Too late, you just touched it yourself.
|
Did you call me a nerd? You .... nerd!
Say hello to old Elf[^].
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
And the Star Wars helmet....
Why do so many nerds collect Star Wars toys? There is a guy at work, has to be 40 years old, and his desk is covered in them.
|
I don't collect Star Wars stuff. Some action figures from 1977 have survived, even after I practically sold my childhood for Elf's 4k memory board. It still works, so that was a good investment.
Actually, George Lucas lost me along the way. He kept getting younger and I kept getting older. It was over for me when he came up with killer teddy bears. At the moment it's far more entertaining to watch the real nerds pull out their hair and protest against the junk Disney throws at them. If anything, I was always more of a Trekkie, at least until Star Trek Discovery. Here, too, the nerds are pulling out their hair and protesting, for much the same reasons as the other nerds.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
Granted, I haven't done much C++ (though a long while ago I did tons of C).
But I am under the impression that the memory problems in C++ and C are related to the same problem...
It is just that C programs often do simpler things than the more complicated C++ ones. To be more specific, the most difficult numerical algorithms might very well be implemented in C, but the programs with the most objects or bits of memory moving around are likely to be in C++.....
So it's premature to boast about C's superiority!
|
Super Lloyd wrote: the memory problems in C++ and C are related to the same problem.
Of course they can be, they are the same language, but the memory problems introduced by exception handling, which were fixed by auto pointers, which in turn introduced their own problems, are exactly the kind of 'the fix causes bigger problems that need a bigger fix' problem that C++ has.
Super Lloyd wrote: the programs with the most objects or bits of memory moving around are likely to be in C++
You should see some of the drivers I have had to write.... Seriously, if you want to see data traffic at an elevated rate and heavy memory use, drivers do it. But they almost always don't touch the data, they don't care what it is, and you can't do floating point math in the kernel, so no, no fancy algorithms there.
I like C++ when there are distinct objects to manage, but when it is used to control a process I find it cumbersome compared to C. (Drivers are pretty much 100% process.)
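(To make the exception-handling point concrete, a minimal sketch with hypothetical function names: with a raw pointer, a throw between the allocation and the delete leaks; RAII closes that hole:)

```cpp
#include <memory>
#include <stdexcept>

struct Buffer { char data[1024]; };

void leaky(bool fail) {
    Buffer* b = new Buffer;
    if (fail) throw std::runtime_error("oops"); // the delete below is skipped
    delete b;
}

void safe(bool fail) {
    auto b = std::make_unique<Buffer>();
    if (fail) throw std::runtime_error("oops"); // b is still destroyed during unwind
}

int main() {
    try { leaky(true); } catch (...) {} // Buffer leaked
    try { safe(true); }  catch (...) {} // Buffer reclaimed
}
```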
|
Munchies_Matt wrote: Of course they can be, they are the same language,
They are no longer the same language. Haven't been the same for quite some time. And actually the first ANSI C++ spec made at least one change that made it different from C.
|
jschell wrote: made at least one change that made it different from C.
Which was?
#SupportHeForShe
Government can give you nothing but what it takes from somebody else. A government big enough to give you everything you want is big enough to take everything you've got, including your freedom.-Ezra Taft Benson
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering!-Wernher von Braun
|
Added plus plus to the name, to start with.
|
ha ha
not
#SupportHeForShe
Government can give you nothing but what it takes from somebody else. A government big enough to give you everything you want is big enough to take everything you've got, including your freedom.-Ezra Taft Benson
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering!-Wernher von Braun
|
How global variables were handled - the scope of those.
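(Assuming this refers to the linkage of file-scope const objects - one well-known early C/C++ divergence - a sketch of the difference:)

```cpp
// In C, a file-scope const object has external linkage by default;
// C++ gave it internal linkage, as if it were declared static.
const int kLimit = 10;          // C: visible to other translation units
                                // C++: private to this translation unit
extern const int kShared = 20;  // C++ needs an explicit extern to export it

int main() { return kLimit + kShared - 30; }
```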
|
In respect of the errors you can make by not freeing memory, they are, and that was the part of the language we were discussing.
|
Super Lloyd wrote: But I am under the impression that the memory problems in C++ and C are related to the same problem...
That is correct. Both allow the programmer to control the scope in which an allocation lives. So if the programmer fails to control that, then an actual leak is produced.
Moreover, both also allow programmers explicit write control over memory. That makes more serious problems possible. Those are the ones that generally lead to security problems, whereas memory leaks (in the strict definition) most often just lead to business functionality problems.
Other languages like C# and Java can both have memory leaks (strict definition), but these are less likely to occur.
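(A tiny sketch of the two failure classes being contrasted - both deliberately broken:)

```cpp
#include <cstring>

// Strict-definition leak: a business/functionality problem. The memory is
// never reclaimed, but nothing is corrupted; the process just grows.
void business_problem() {
    int* data = new int[100];
    (void)data; // used, then forgotten - no delete[]
}

// Explicit write control gone wrong: a security problem. A long input
// writes past the end of buf and corrupts adjacent memory.
void security_problem(const char* input) {
    char buf[16];
    std::strcpy(buf, input); // no bounds check - classic overflow
}

int main() {
    business_problem();
    security_problem("short"); // only 'safe' because the input is short
}
```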
|
With C# and .Net I have seen way too many cases of idiotic memory leaks, just because the developers had no concept of an object lifecycle at all. I have also seen an entire team of Java guys sitting around a server that needed a reset every day because their opus hogged memory. If anything, the Java Hobbits are even more religious about not managing your memory. And what could all those senior devs, architects and project leads do about the memory leak? They beat the garbage collection into submission and fixed the symptom - for a while.
Garbage collection and all sorts of automated memory management can't replace a proper architecture and proper object lifecycles. It makes routine work a little easier and at best offers a safety net. At the same time it makes developers ignorant and careless. The Java guys have made a religion out of it.
As long as I use C++, I simply put debug assertions into the code that monitor what is going on on the heap and fire when there seems to be a leak. After that it's only a little detective work to identify the type of objects that are piling up and then take a good look at the lifecycle of those objects.
It's actually not so hard and you don't need any fancy features. The only thing you need is to invest a little more thought and give up your 'right' to write careless code. Then again, a little more thinking, discipline and a little less cowboy coding would also solve many other problems.
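(A minimal sketch of that kind of debug bookkeeping - names are hypothetical, and a real project would hook the allocator or use CRT debug facilities rather than a base class:)

```cpp
#include <atomic>
#include <cassert>
#include <cstdio>

static std::atomic<long> g_live{0};  // count of live watched objects

struct Tracked {                     // base class for the object types being watched
    Tracked() { ++g_live; }
    virtual ~Tracked() { --g_live; }
};

struct Widget : Tracked {};

// Fires when objects seem to be piling up; the 'detective work' starts here.
void checkpoint(long expected) {
    std::printf("live objects: %ld\n", g_live.load());
    assert(g_live == expected && "possible leak: live count too high");
}

int main() {
    auto* w = new Widget;
    checkpoint(1); // one live object, as expected
    delete w;
    checkpoint(0); // everything accounted for
}
```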
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
CodeWraith wrote: a little more thinking, discipline and a little less cowboy coding
and we wouldn't have needed C++!
Properly written C is easy as hell to read and incredibly efficient regarding performance and memory usage. There is a reason the Windows kernel is written in it.
|
Then again, very disciplined C code can already look very object oriented. Think of structs and a number of functions to construct, destruct and change them. They are not members of the struct yet, which could be accomplished with function pointers in the struct. The 'constructor' and 'destructor' would not be called automatically. There also would be no proper encapsulation, inheritance or polymorphism.
Still, that's essentially how C became C++. With this transition also the scope of the programs changed. On a small, limited system I prefer to use C because of the greater degree of control over the spartan resources. On a traditional 8 bit computer even that may still be too much. An 8 bit bus, only a few k memory and C are not a match made in heaven.
C++ allows me to go into the other direction. Object orientation is very much a divide and conquer strategy that allows you to tackle huge tasks. Resources, like memory, are not so limited. Still, C++ retains much of C's control over these things if you need it.
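(A sketch of that struct-plus-functions style - written so it compiles as C++ for consistency with the rest of the thread, but it is plain C in spirit; the names are illustrative:)

```cpp
#include <cstdio>
#include <cstdlib>

struct Point {
    int x, y;
    void (*print)(const struct Point*); // 'method' as a function pointer
};

static void Point_print(const struct Point* p) {
    std::printf("(%d, %d)\n", p->x, p->y);
}

// 'Constructor' and 'destructor' must be called by hand.
struct Point* Point_create(int x, int y) {
    struct Point* p = (struct Point*)std::malloc(sizeof *p);
    p->x = x; p->y = y; p->print = Point_print;
    return p;
}

void Point_destroy(struct Point* p) { std::free(p); }

int main() {
    struct Point* p = Point_create(1, 2);
    p->print(p);      // manual dispatch through the function pointer
    Point_destroy(p); // no automatic destructor call
}
```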
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
Indeed.
C++ though, because of its complexity, allows bad programmers, and there are plenty, as we know, to write some truly appalling code.
It also allows good programmers to show off by writing unnecessarily complicated code. Re the 'my manager has asked us to dumb down our code' thread a few weeks back: IMO that is a good thing to do. KISS is always my motto.
|
Trust me, there was no shortage of bad programmers before C++... and even before C. In my experience, bad programmers dominated the industry back when I started in the mid-70s, and they still do. I thank them all. They make me look so much better by comparison.
The only real change I've seen is that more capable tools/languages increase the illusion of competency for bad programmers. Way back when, they (mostly) used to know they were bad. They would hide in their cubicles, hoping no one noticed.
Regrettably, now the illusion has gotten so good that they actually think they're competent. Since they "know everything", they become arrogant and refuse to learn even basic skills. Now, they opine, at length, during meetings and write books sharing their "knowledge" of the "right way" to do things...yuck!
I had one contractor argue that iterating a list was faster than a hash table until you reached millions of elements. At the time, I was his manager. I patiently explained that the crossover was generally quite a bit lower. He remained unconvinced. Another teammate, independently, attempted a similar explanation. No success for either of us.
He was not dissuaded even after I wrote a simple program demonstrating how wrong he was...for his specific use case. I'm certain, elsewhere, he continues to hold this same belief today.
I miss the days when bad programmers were self-aware and hid. They did so much less damage back then, between the time they were hired and finally discovered.
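(For the curious, a sketch of the kind of demonstration program mentioned above - the sizes and iteration counts are arbitrary, and the exact crossover depends on element type, hashing cost and hardware, but it is typically at tens of elements, not millions:)

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <unordered_set>
#include <vector>

int main() {
    for (std::size_t n : {8, 64, 512, 4096}) {
        std::vector<int> vec;
        std::unordered_set<int> set;
        for (int i = 0; i < (int)n; ++i) { vec.push_back(i); set.insert(i); }

        const int needle = (int)n - 1; // worst case for the linear scan
        volatile bool sink = false;    // keep the optimizer honest
        using clk = std::chrono::steady_clock;

        auto t0 = clk::now();
        for (int i = 0; i < 10000; ++i)
            sink = std::find(vec.begin(), vec.end(), needle) != vec.end();
        auto t1 = clk::now();
        for (int i = 0; i < 10000; ++i)
            sink = set.count(needle) != 0;
        auto t2 = clk::now();

        auto ns = [](auto d) {
            return (long long)std::chrono::duration_cast<std::chrono::nanoseconds>(d).count();
        };
        std::printf("n=%5zu  scan: %7lld ns/op  hash: %7lld ns/op\n",
                    n, ns(t1 - t0) / 10000, ns(t2 - t1) / 10000);
        (void)sink;
    }
}
```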
|
When I read your message I get the feeling that you have no idea what the causes of memory leaks in C# are.... For one thing, they have little to do with 'object lifecycle'.
FYI, unless you do interop, which is less and less common these days, the only reason for a leak in C# is if you're referenced by a static variable or (though it's the same thing in disguise) a static event. Or maybe too many live threads which don't die (and all the memory they hold on to), which could typically come from a wait handle hanging forever...
|
Super Lloyd wrote: For one thing, they have little to do with 'object lifecycle'.
How could they possibly? Just fire and forget, do anything with an object as you like. Why waste a thought on where an object comes from or what becomes of it? The garbage collector takes care of everything. Until it does not.
Super Lloyd wrote: the only reason for a leak in C# is if you're referenced by a static variable or (though it's the same thing in disguise) a static event.
If only that were true. Real hacks are excellent at combining things that are not problematic by themselves in a way that leads to maximum chaos.
Would you, for example, see anything wrong with storing controls that have been added dynamically to an ASP.Net page in the session instead of simply constructing new ones when Page.Load() is called? The irony is that it is the misguided attempt to take control of the controls' lifecycles that leads to the memory leak, and not fire and forget.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
Super Lloyd wrote: For one thing, they have little to do with 'object lifecycle'.
Perhaps we have different definitions of that term. I have seen problems in both Java and C# where the developers failed to properly account for how an object 'ended'. They had no problem with when the object 'started', but they never considered how to specifically control how it 'ended'.
Matter of fact I suspect that I have seen code that might have that sort of problem just in the past week.
Super Lloyd wrote: FYI, unless you do interop, which is less and less common these days, the only reason for a leak in C# is if you're referenced by a static variable or (though it's the same thing in disguise) a static event. Or maybe too many live threads which don't die (and all the memory they hold on to), which could typically come from a wait handle hanging forever...
It has probably been at least 15 years since I worked in a system that did not have at least some threads that never die - often quite a few of them. At least in Java, the server frameworks build that in extensively. Most problems I have seen arose because the developers did not even seem to be aware that they were working in a threaded environment.
Java and C# both have complex thread pool support, and the concept has existed since before C# existed. It might have existed before Java, but I wasn't introduced to it until after I first used Java. For context, I started with Java 1.1.4 (well before 1.4).
|