|
|
|
|
Depending on the day, either Sledgehammer (Peter Gabriel) or King of Pain (The Police)
|
|
|
|
|
"I've always been crazy, but it's kept me from going insane."
|
|
|
|
|
Waylon-F*ucking-Jennings FTW
(And to all the normals here, yes - that's the right way to say his name.)
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
|
It's a toss-up between "Oh Lord It's Hard to Be Humble" (Mac Davis), and "Take This Job and Shove It" (Johnny Paycheck)...
|
|
|
|
|
|
Ticking away the moments that make up a dull day
Fritter and waste the hours in an offhand way.
|
|
|
|
|
I been havin' a think on this one.
It's gotta be Springsteen - Dancin' in the Dark.
|
|
|
|
|
|
As an incredibly enthusiastic user of C++ from its very first release, I've expressed this (personal) opinion a number of times over recent years: C++ has lost its way - it now takes more time, learning, effort and skill to make good, efficient use of C++ than it takes to solve the problems one is using it for.
A caveat: I am thinking specifically of business tasks and related domains - I accept that for the most cutting-edge stuff near to the metal it still offers one of the best overall effort/performance ratios.
In part, the effort to maintain backwards compatibility at almost any cost (despite the fiasco of i(o)streams and manipulators between versions 1, 2 and 3), whilst adding ever more features, adds huge amounts of technical debt that then has to be fought against in other ways.
This article, I think, demonstrates this nicely: Speeding Up C++ Build Times | Figma Blog[^]
Discuss! (ducks for cover...)
|
|
|
|
|
Quote: However, there are other common solutions that we also employed to reduce build times, including local caching, remote caching, and precompiled headers.
From the rest of the article, it seems they haven't implemented precompiled headers correctly, because all their effort to eliminate redundant headers amounts to very little saving if the precompiled header pulls in each header once and then ignores additional re-inclusions. At least, that is the way I understand precompiled headers, and they have saved me a bunch of time in the past. I wish they had talked about that aspect more, as my understanding could be deepened.
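For reference, a correctly wired precompiled header means the expensive standard headers are parsed once, not once per translation unit. A minimal GCC/Clang sketch of that wiring (file names are illustrative; recipe lines start with a tab):

```make
# pch.h pulls in the heavy, rarely-changing headers (<vector>, <string>, ...).
# Compile it once to a binary form; GCC and Clang then pick up pch.h.gch
# automatically whenever a .cpp does #include "pch.h", reusing that single
# parse instead of re-parsing the standard headers from scratch.
pch.h.gch: pch.h
	g++ -std=c++17 -x c++-header pch.h -o pch.h.gch

main.o: main.cpp pch.h.gch
	g++ -std=c++17 -c main.cpp -o main.o
```

One caveat worth knowing: the precompiled header is invalidated if the compiler flags change between the header build and the translation-unit build, which is one of the ways PCH setups silently stop saving time.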
PS - Mike Winiberg wrote: In part, the effort to maintain backwards compatibility at almost any cost (despite the fiasco of i(o)streams and manipulators between versions 1, 2 and 3) whilst adding ever more features adds huge amounts of technical debt that then has to be fought against in other ways. This article I think demonstrates this nicely Speeding Up C++ Build Times | Figma Blog[^]
I don't see that article supporting your assertion in any meaningful way. All it seems to be saying is that their coders haven't kept their headers clean, and often included unneeded headers that they had to take out. That doesn't speak to the difficulties inherent in modern C++.
|
|
|
|
|
"I don't see that article supporting your assertion..."
Well, YMMV indeed, but it demonstrates quite nicely, I think, how - in an attempt to maintain backward compatibility - the C++ environment has required ever more esoteric procedures to keep it usable, both in compiling the code and in learning the language, its libraries, etc. That one should need precompiled headers, or to write add-on tooling (which even Google had to do), to make compilation times acceptable speaks very well to the increasing complexity of the whole ecosystem, in my view.
Don't get me wrong, I'm not saying it isn't a good powerful dev environment - my team developed a whole airline/shipping/freight system in C++, and financial applications using parallel C++ for market-maker using a Transputer farm. But I reached a point where I realised it was taking far longer (for us anyway) to learn how to make good use of the ever increasing new features than it was to solve the problems we were facing.
Having made extensive use of one feature (manipulators on streams) to control printing, only to have that broken - and hence needing a rewrite - in version 2, followed by a partial regression in version 3 (another rewrite!), then a complete collapse of our system after a third-party database library we were using was updated and broke references (implementing them by copying, FFS), we came to the conclusion we were spending more time fighting the language environment than writing software.
We switched to Java, then later to Python, and I have not used C++ in a meaningful manner since.
If there is one thing I have learnt in my altogether too-long time in software dev, it is that - with a few domain specific exceptions - the language you develop with is largely irrelevant, so the less it gets in the way of the task you want to accomplish, the more productive you can be.
|
|
|
|
|
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
I agree with you that C++ is in danger of becoming a modern version of PL/1, containing something for everyone. I also agree that learning the entire language (and standard libraries) is becoming more and more difficult. However, C++ shares with C the philosophy that "if you don't use it, you don't pay for it" (or words to that effect). There is no reason that you can't use C++ as "C with classes", or at any other level between that and C++20/23/xx.
The language features I most use are RAII (around since the ARM), templates & exceptions (C++98), threads & atomic variables (C++11), smart pointers (C++11), and a few more advanced features (various, up to C++17). These have changed slightly over the years, but the changes are manageable.
Obviously, I use the standard library as well, but most changes to that have not been breaking changes.
Mike Winiberg wrote: Speeding Up C++ Build Times | Figma Blog[^]
Lastly, a decent developer spends most of his/her time on designing, writing code, and thinking about the code (debugging). The compilation time should be a small fraction of the total development time, and even that (as the article points out) may be optimized with a good design of your system.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Damn, that article is making the rounds !!
One of the problems with C++ is that it's an old language that needs to compete with more modern languages.
It can only be incrementally improved.
We underestimate how much C++ code is actually in production.
You can't just break backward compatibility (as much as I would like them to do it)
If that happens, many, many large organisations will never upgrade their toolsets.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
Mike Winiberg wrote: As an incredibly enthusiastic user of C++ from its very first release,
45 years. And still enthusiastic?
Not denigrating C++ - rather, it is difficult to keep the same enthusiasm for anything for that long.
Mike Winiberg wrote: technical debt that then has to be fought against in other ways.
Get a very old set of libraries, say before templates, update them to work with your compiler and then just don't use anything new.
Mike Winiberg wrote: This article I think demonstrates this nicely
Far as I can tell that doesn't enumerate the size of the code base. But it certainly seems to suggest that the author is working with a large one.
And a legacy one at that.
So I don't really see anything suggesting a problem with C++, but rather a process problem that has been allowed to grow without bound. And so they want to use technology to fix it, rather than decoupling the code base to reduce the dependent sections.
|
|
|
|
|
A few vaguely related further comments: TL;DR - just my musings 8)
"If you don't use it you don't pay for it" is - IMHO - a good philosophy, and why C++ remains such a good dev environment for cutting edge stuff. It is indeed perfectly possible to use C++ as an 'improved' C and only adopt specific features - but surely in that case you might as well go for a dev environment that is more closely matched to the domain you are working in (if one exists of course)? Attempting to stick with C++ when it isn't a good fit can lead to the kind of thing you see today with web app dev: I regularly see devs posting about the 'stack' they use, sometimes comprising 20 or more different frameworks all bolted together using another framework, simply to create simple web pages that could be easily produced with just HTML and CSS (itself a bit of a dog's breakfast IMHO!).
---
Never has KISS been more appropriate than in some of today's dev processes. The interdependencies of such 'stacks' make the whole thing so fragile that one defect or change in the hosting environment can bring the whole thing tumbling down. Pile on the inappropriate adoption of 'Agile' processes and TDD and you end up with the kind of mess you see in the UK on so many GOV.UK web pages: The pages pass all their tests and hence are deemed 'correct' and yet don't perform as the users have a right to expect: I have one classic example that I have reported repeatedly but which has still not been fixed after many years: paying your vehicle excise duty:
You are sent a document with a 16-digit number in the same format as a credit card: four groups of four digits. If you type this into the online form including the spaces, and don't notice that the input box only accepts 16 characters (i.e. you lose the last three digits), then complete the rest of the page, the page fails when you hit submit - but the error message is displayed next to the box itself, which by now is off the top of the screen. You have to scroll back up to see it! There is no warning in advance that the spaces must be omitted. Either accepting and ignoring the spaces, or silently dropping them, is a very simple thing to implement, but it hasn't been done - a typical noob web dev mistake.
It gets better though: a bit later you have to enter your card details, which comprise a 16-digit number in four groups of four. On this page you can enter the number with or without spaces and it is accepted quite happily. By chance I met a dev working for GOV.UK on this stuff: he couldn't see the problem, the page passed all its tests...
---
With the exceptions of a very early device that interacted with live TV and the Transputer (I even helped develop an assembler for the Transputer as it didn't have one), I have never been involved with anything that needed the maximum possible performance from the underlying hardware; instead reliability and stability have always been the most important features (even for financial market-making stuff).
---
"old libraries": I much prefer, if at all possible, to work with open source tools, even though most of my work in recent years has been for Windows and SQL. At least that way you have the possibility of fixing things if the provider of the library goes silent (or even becomes malicious - see node.js passim), although the costs of doing so may be more than the project is worth! Sometimes - in a specific domain - that may simply not be possible.
IMHO the problem with C++ is not in the language itself per se; it has simply 'grown' so many features to maximise performance for all sorts of different use cases that the skillset and knowledge one needs to acquire to make the best use of it has become greater than that needed to solve many of the problems you might want to address - in other words it has become the domain of technical specialists rather than jobbing programmers (like me!).
---
As for enthusiasm, I have never lost that incredible excitement I first felt in my teens when I got a computer to do something I wanted by writing software for it - that's why, despite being well past retirement age, I am still working - albeit part-time.
|
|
|
|
|
C, in all its variants (and that includes Java and Rust), is an abomination.
It should never have survived as long as it has.
It was only ever intended to be a 'bridging' language between the horror of Fortran and the "New" languages that were just around the corner.
|
|
|
|
|
I'm not sure I entirely agree with that view 8) but its utility and closeness to the metal gave it wings, especially in the early days of desktop computing when efficiency was very important, and once such systems become ubiquitous then one tends to get stuck with them.
See COBOL and indeed FORTRAN (never mind MS Word which is as unlike a word processor as it is possible to get whilst still handling text) for further examples...
|
|
|
|
|
I thought that might get a reaction.
But, my opinion still stands.
In the bad old days of 8086 through to at least 80386, you were much better off learning assembler.
Once Windows became ubiquitous, things got more difficult, but by then we **should** have had better languages available than C and its derivatives.
The few that were viable seem to have been killed off by the C people to protect their pet little ecosystem.
(But that does sound a bit like a conspiracy theory)
|
|
|
|
|
|
Member 14564709 wrote: C, in all its variants (and that includes Java and Rust), is an abomination.
It should never have survived as long as it has.
PHP - wasn't intended to be a programming language in the first place.
Using the TIOBE index, excluding that, and then removing all of the other languages that originated with C, one gets down to Basic, SQL and Fortran.
Member 14564709 wrote: between the horror of Fortran and the "New" languages that were just around the corner.
But it doesn't exist. People create a lot of new programming languages every year, and very few are ever generally accepted. So what you are suggesting will not, or cannot, exist.
|
|
|
|
|