Better is subjective, but the thing is that being better is not really enough anyway. You have to have a corpus of available and experienced developers who know the language, and others who don't but who are willing to invest significant parts of their career development in it, which might ultimately be of no value to them relative to learning other things. Chicken and egg and all that.
Some folks would argue that Rust is a better language. From my semi-gross level scan I don't agree, at least as a very general purpose language, but some people obviously do think so. But will it ever be more than a niche language? The odds are against it, even though it probably has more advantages than most new languages by far (being backed by a large organization that's not seen as having greedy or insular motivations).
And how many developers out there right now are experts at Rust if you wanted to hire up and start a big project? It seems to me that Rust could remain caught between Java/C# on one side and C++ on the other, without there ever being a big enough incentive for large numbers of people from either camp to move to the middle.
These days it seems that there's almost no friction when it comes to introducing yet another UI framework or module manager or web app framework, but huge friction on the big ticket items.
Dean Roddey wrote:
These days it seems that there's almost no friction when it comes to introducing yet another UI framework or module manager or web app framework, but huge friction on the big ticket items.
I guess I'm part of the problem - C/C++ is all I got, and I probably use/understand a tenth of it. But it's got C, and that, to my feeble mind, keeps it close to the bits that sometimes get overclocked. I like that.
Can't presume to add anything to the 'what we need' discussion, save to say less might be more.
I tend to agree with you on your assessments, though I am not all that familiar with the C++ language in depth.
Many years ago, when dinosaurs were still considered big pets, I met a senior C++ engineer and had a very nice discussion with him. He had been coding in C++ for over 25 years, and he told me that the majority of issues with C/C++ development come from the fact that the majority of developers using the language really do not understand it in depth; hence the many issues with C++ applications.
He went on to say that to do quality C/C++ development you really have to spend a lot of time understanding how the internals work...
That can be true. Of course you don't have to use any of the 'modern' bits if you don't want, at least you don't if you are willing to roll your own. If you use third party libraries you are forced to use whatever hodgepodge of features that the libraries you use choose to implement, to some degree anyway.
But C++ had all the capabilities you really NEEDED a decade plus ago. There have been some useful things that are not burdensome added since then, and I make use of those, but it's not like you couldn't do really high quality, large scale code with C++ in the 2000s.
To me, things that increase compile-time safety (where that doesn't mean over-templating) are all useful. override, defaulted and deleted methods, and [[nodiscard]] are simple to use, don't introduce overhead or complexity, and allow the compiler to watch your back day after day.
Lambdas, if not abused, can get rid of a whole class of complexities because of their ability to capture. But you can't pass capturing lambdas to function pointer parameters. So you are forced to use generic templated parameters, which gets you into the massive silliness that is so much a part of modern C++: the 'one error generates a million barely comprehensible errors' thing.
Stuff like RAII and smart pointers (which I call janitors because the concept really goes way beyond RAII) has always been around. One thing that amuses me is how 'modern C++' people somehow think putting everything in a smart pointer is somehow magically making their code safe. Often it is just moving the dangers to other realms, which are just as hard to see (maybe more so sometimes), and just as silently deadly (maybe even more so sometimes.)
I agree. At my last job where many people so embraced the "Modern" paradigm that they would rewrite stuff to use the latest language features, things went from code reviews requiring clear, readable code to the standard being "just cut and paste this templated blob and don't worry about how it works".
Someone literally spent a couple of weeks turning a three line function call, including a lambda callback, into a two line templated mess, and was pleased with themselves that it was shorter now.
and leaving C++ in a situation where even now you can't write even a modest practical application without third party libraries.
Not quite sure what you mean by "modest practical" but I haven't written any application in over 25 years (at least) in any language without relying on third party libraries.
Wouldn't even agree to that unless whoever was paying the bills agreed to, at a minimum, a much larger project timeline. And I would use that extra time to re-engineer existing libraries, very likely replicating those third party library API definitions.
Certainly 25 years ago I can remember creating my own logging library and implementing a testing framework as two examples of libraries that I consider essential now. I see no point in re-implementing those, especially given that I know the difficulties I had getting just those right then.
I write huge applications without third party libraries. If I could do it by myself, then clearly the collective C++ community could have (in all these years) managed to get standard and portable and reasonable subsets of at least a large core set of commonly required functionality into the language itself, in order to be more competitive with newer languages like Java and C#.
collective C++ community could have (in all these years) managed to get standard and portable and reasonable subsets of at least a large core set of commonly required functionality into the language itself
Rather certain that the C++ community specifically does not want to do that.
There was a magazine called the 'C++ Users Journal' which had at least one columnist and perhaps two that were active participants (and perhaps chairs) on the ANSI C++ committee. From what they wrote over years, as I recall, there were specific attempts to move additional libraries into the core language and those were rejected basically unanimously. Even getting templates in there and the template libraries was a fight.
Yes and no. The biggest problem by far is C++'s compatibility with C, and modern C++'s compatibility with ancient C++. Meaning if you look for tutorials or ask in forums, you may, and very likely will, come across information from the days of old, when C++ had all the disadvantages of low-level C and high-level languages combined, without any of the advantages. Well, that sentence is somewhat exaggerated, but the point stands: there's too much outdated reading material on C++ and too many C++ programmers stuck in the past. To take advantage of modern C++, you need to recognize when you're facing old C++ and avoid it.
That said, modern C++ itself isn't quite as easy to use as Python, as you still have the static typing system. But once you learn to use it properly, it's (a) actually darn easy to use (and you can kill a whole lot of difficulties by typing everything as auto), and (b) the compiler catches tons of errors thanks to said static typing and the overall more static nature of the language.
Short: It's more complicated to quickly prototype in but the investment pays back huge when you build complex software that needs to bloody hell run.
Still, the overhead of avoiding all the legacy crap is rather substantial. I dearly wish the C++ committee came up with a modern mode. Let's say, unless a code file contains a #pragma(IAmStuckInThePast), every non-modern construct for which there's a modern replacement is a compiler error.
I think you've hit the nail on the head - the vast baggage that comes along with C++'s attempts to retain (at least superficially) compatibility with the past iterations makes it both incredibly difficult to ensure you are up to date and using the 'correct' constructs, and also means - unless you can avoid using/calling legacy code etc - that there are so many possible ways of doing things that it has become 'too difficult' to use unless you are immersed.
I gave up developing in C++ for the most part when I realised it was taking me more time and effort to understand and use correctly the various constructs that made the language most effective than it was to solve the generally non-time critical problems I was working on. Obviously others will have different experiences and hence viewpoints, I'm not saying mine is the only one.
To my mind C++ has effectively evolved into a new language, so much so that someone coming to it from new is probably in a much better place than someone like me who started with assembly language and has moved through C, C++ etc over the years. I think it is past time really for the latest iteration of C++ to drop all the 'legacy/compatibility' stuff and stride out as a new language without all the baggage.
This assumes that you believe that all of the modern stuff is actually better, which plenty of folks don't. Some of it is clearly useful, but some is very much a matter of opinion. And of course you have to distinguish between the language and the library. A lot of the stuff that most anyone writing new code wouldn't want to use is the old library stuff, while a lot of the new language stuff is much more debatable as to whether it's better or just different, or whether any advantage it does have is outweighed by the different problems it introduces.
Let's use an example then. Is it really up for debate whether a static_cast<T> is better than its counterpart, the classic cast? The cast where the compiler can check whether the cast makes sense, and tell me if it doesn't, really isn't better than a simple "shut up and shove the memory", is it? Of course a static_cast isn't always applicable, but then there's dynamic_cast<T> and the rest of the family, and while one member of that family isn't any better than the cast of olden days, it doesn't have to be used in every case where you need a cast. With the old system, there's only one option: the rather dangerous one.

Another example: MMA, manual memory allocation. I simply don't believe that there are advantages to it when a smart pointer or a library container does the job as well. Sure, when talking to a memory-mapped device, static memory is all there is, and laying a library container on top will only screw things up because I no longer control the memory layout. But that's an edge case; when it comes to business logic, letting the compiler/library do the job is in most cases superior. Yet the oldtimers insist on doing things the old way no matter whether it's a better idea or not.
The issue I am talking about is that oldtimers don't bother to distinguish between cases; they go with old-and-tried (and difficult to use and dangerous) for its own sake, not because it's a better idea. And really, unless you write low-level code (which, let me just put that as a claim, Python programmers usually don't do), the new ways of more abstraction are superior.
But so much memory allocation is within a class already. There's nothing really gained by wrapping those things in a smart pointer. It's just more moving parts and syntax and generated code. The memory is already owned and managed by the object that allocated it, and of course deleting those things in the destructor allows you to catch any errors and log them for in-the-field diagnosis, which you can't do if you are just letting a destructing member delete the memory.
If you are passing allocated objects around, then of course it makes a lot more sense, but that's nothing new. It's been going on for a long time and isn't modern per se.
I certainly agree that stuff like override, member delete, member default, static_cast and such are significant improvements and they don't bring lots of baggage with them that you don't want, unlike a lot of other new stuff. Things like that, which improve type safety and the ability to express semantics are always a good thing in my opinion. I can't believe that we still don't have the option for explicit parameter direction indication.
But that's the problem: you didn't say explicitly what it should be. If the right side gets accidentally changed, nothing is going to complain, as long as what it got changed to supports the same interface (not terribly uncommon in the modern world, with lots and lots of operator driven stuff.)
If you explicitly say what it's supposed to be, then two things have to get simultaneously broken in the same way. If you don't, then only one has to get broken for potential silent errors.
If that were the case I wouldn't use auto. The situation you described is actually a feature of the auto keyword. It's pretty useful to only change the initializer, without having to change the type declaration, during refactoring.