Back in the day, VB6 was great for building quick desktop apps, but when .NET was released, those developers should have jumped ship and started migrating all that legacy code while there was still an upgrade wizard that could move 90% of it to .NET 1.0.
But they didn't, or weren't allowed to by their companies; in the latter case they should have left. I saw the writing on the wall in the beta previews, and got busy learning it.
Those still furiously hanging on to VB6 probably deserve the mockery; it's 20 years out of date and should never be touched again. It would be like people still developing in FoxDB, VBA, or J++ — it's a dead platform.
Oddly enough I don't feel the same about COBOL; technically it still has an active platform, and much of our infrastructure is built on it. The same goes for C — it's going to stay active in the embedded industry for the long term.
I'd argue that C, unlike COBOL, is still useful for writing *new* code. For such an old language it has weathered the test of time — impressive for anything computer-based. C was just designed well. Granted, C++ is more advanced, but if you're coding in C++ the way it was designed to be used, the binaries will almost always be much larger than C binaries due to templates/generic programming, meaning C is still the order of the day for small machines and probably will be for the foreseeable future. As such I don't think it's exactly comparable to COBOL.
You sure can't beat C in the embedded area; lots of those chips are still 8-bit with 1 KB of memory or less. On the bigger 32-bit RISC chips, sure, use C++ if it helps keep you organized. I've found that the fewer abstraction layers between you and the I/O, the closer to real time you get. There have been a few new languages nipping at C's toes, like Rust or D, and they've ventured into the embedded arena a bit; should be interesting to watch.
My only reason to mention COBOL is that IBM is still producing those mainframes largely to keep those systems running. The COVID-19 crisis shows there is still a need for COBOL developers to extend, modify, and maintain these legacy systems. Now, I wouldn't suggest a freshly graduated CS student take that direction as a career, since eventually those systems will be phased out — sometime in my lifetime, I would guess/hope.
The fact that you can use C++ with so many different paradigms (OOP, generic programming, procedural, even functional programming these days) leads to a lot of misuse, but its power in that regard is amazing. You can use it to do domain-specific-language style coding. In terms of this flexibility, it's unmatched.
The problem is, C++ isn't taught well, so it often isn't used well. It's not primarily OO. Its power comes primarily from templates, so generic programming is the order of the day. A $20 book called Accelerated C++ is better at teaching C++ than all the courses one took to get that shiny lil CS degree.
I've seen so much OO C++ code in my time it's just silly. MFC comes to mind.
F...P...C...B...P...A... first letters of all capitalized words...hmm...???
I searched for the hidden code that had the sarcastic remark that would reveal
that you actually meant that
we __must, of course, keep making fun of most programming languages__
but I couldn't find it.
Can you please point out the anagram, backronym, encoded message, or enigma so I can partake of your sarcasm?
Cuz, if you're a _REAL PROGRAMMER_ then there must be hidden sarcasm there somewhere.