|
Quote: I'm not trying to be disrespectful
Message received the first time; no need, but thanks.
Quote: C++ itself will be supplanted eventually, even in embedded, where the main target now is Rust.
Call me an Old Soul, traditional or nostalgic, but somehow I always find that older things, built by much older men and women, work better, are durable, and yet are so simple. I haven't looked at Rust since the beginning of time, but I have heard good things about it from good folks. Still, I sincerely doubt anything can take over C++ in its domain.
Quote: Let me get you up to speed where we are in terms of app development. Tool selection is narrow, you either prefer performance or code readability (think Java/Android vs C++/NDK, or C# vs C++ on .Net). Backwards compatibility, legacy, dependency management, all of that is now native for every platform.
I understand your point. To clarify, I am an Android/Java developer, though probably not nearly as experienced as you are. Anyhow, sometimes I wonder about the possibility of an application platform with C++ as the core language and an XML/XAML-like language for design. It shouldn't take much more than it did with Android/Java/XML to make that happen. Unfortunately, I am too young and inexperienced (at least for now) to build something like that myself. I'm mainly running away from Java because of its memory management: I just want to be in control of keeping and releasing my memory. Otherwise, Java is a beautiful language. If I ever did manage to bring C++ into mobile/desktop development, I'd probably take away multiple inheritance; it's a curse and a blessing.
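For what it's worth, modern C++ already gives you that control without raw new/delete. A minimal sketch (standard C++14, nothing tied to any mobile platform) of the deterministic release that Java's GC doesn't promise:

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// A resource whose lifetime we want to control by hand.
struct Buffer {
    std::vector<char> data;
    explicit Buffer(std::size_t n) : data(n) {
        std::printf("acquired %zu bytes\n", data.size());
    }
    ~Buffer() {
        std::printf("released %zu bytes\n", data.size());
    }
};

int main() {
    {
        // unique_ptr frees the Buffer the instant this scope ends;
        // no garbage collector deciding when (or whether) to run.
        auto buf = std::make_unique<Buffer>(std::size_t(1) << 20);
        buf->data[0] = 'x';  // ... use the buffer ...
    }  // destructor runs right here, deterministically
    std::printf("scope exited, memory already returned\n");
}
```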
|
|
|
|
|
Quote: Call me an Old Soul
My fellow old soul, I agree with you in principle, but in reality I'm looking more and more favorably at embedded Rust. The D language was a late disappointment, and I don't think it'll ever take off. The Rust people are now VERY ACTIVE in getting Rust to work with microcontrollers, resulting in performance in the same range as C++ (i.e., near perfect for almost everything) with all the safety checks and enforcement of Rust. Check out some of the stuff they're making with dependencies and abstracting the micro's hardware; it's amazing.
Quote: Anyhow, sometimes I wonder about the possibility of an application platform with C++ as the core language and an XML/XAML-like language for design.
Open up Visual Studio, new project, C++ UWP with .NET Core.
Welcome to UWP heaven
|
|
|
|
|
Also, most of my friends and colleagues insist that with anything less than 32 GB of RAM, your computer will be slow as molasses. I open the Resource Monitor and point at the bar showing that, with lots of software running or waiting, only 5 of their 32 GB is in actual use, and most likely 80% of even the "in use" part hasn't been addressed for at least half an hour.
And I point to the CPU display: half of the cores are not active at all, and the rest are 1-5% loaded, except in very short bursts. When I generate a digital video, or convert from one format to another, I manage to load my 6 cores (with HT, making them appear as 12) to 95-100% for minutes at a time, but that is the only group of tasks where I would not be well served by a CPU with 1/10 the processing capacity.
|
|
|
|
|
And I thought Android Studio was a beast; it isn't practical to run with less than 8 GB of RAM and a good SSD.
|
|
|
|
|
I once developed an app that had to discover and communicate with potentially thousands of computers concurrently. I used the .NET Task Parallel Library for the parallel processing of the TCP connections to those computers.
With networks of 5,000 computers, the concurrent communication ate up a lot of processor time. Aside from that, my observations are similar to yours: a lot of RAM and cores go unused, except for me when developing and testing cross-platform apps. Emulators can eat up a fair amount of RAM and processor time.
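For the curious, here's a rough sketch of that fan-out pattern in C++ with std::async (the original used .NET's Task Parallel Library; probe_host is a hypothetical stand-in for the real TCP connect and query):

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <future>
#include <string>
#include <thread>
#include <vector>

// Hypothetical stand-in for the real TCP connect/query; it just sleeps.
bool probe_host(const std::string& host) {
    std::this_thread::sleep_for(std::chrono::milliseconds(10));
    return !host.empty();
}

int main() {
    std::vector<std::string> hosts;
    for (int i = 0; i < 5000; ++i)
        hosts.push_back("10.0." + std::to_string(i / 256) + '.' + std::to_string(i % 256));

    // Probe in batches so we don't spawn thousands of threads at once;
    // even so, the coordination itself costs real CPU time.
    constexpr std::size_t kBatch = 64;
    std::size_t alive = 0;
    for (std::size_t i = 0; i < hosts.size(); i += kBatch) {
        std::vector<std::future<bool>> batch;
        const std::size_t end = std::min(i + kBatch, hosts.size());
        for (std::size_t j = i; j < end; ++j)
            batch.push_back(std::async(std::launch::async, probe_host, hosts[j]));
        for (auto& f : batch)
            if (f.get()) ++alive;
    }
    std::printf("%zu of %zu hosts answered\n", alive, hosts.size());
}
```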
|
|
|
|
|
If I give someone 1 hour to do a 10-minute task, they will take 1 hour.
This says something about how we fill available time, and it applies to space as well. And no, it does not apply to all people, but it does to a large number.
The same goes for computer resources: I think we find ways to fill them. More RAM and CPU? Oh, I can get a 4K monitor now. Make that two 4K monitors; the computer can handle it. The OS loads a bunch of stuff. Lots of RAM? I'll leave 2 GB of recently used files cached, where maybe 400 MB gets used every 10 minutes and the rest was touched once in the last 3 hours.
I find the same amazement in how much game devs got out of the SNES and other '90s game consoles. See GameHut on YouTube.
|
|
|
|
|
maze3 wrote: If I give someone 1 hour to do a 10-minute task, they will take 1 hour.
If you're lucky.
More likely, they'll spend that hour procrastinating, thinking "I've got plenty of time", and then ask for an extension on the deadline.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
We essentially went to the Moon with slide rules; the astronauts carried a 6" aluminum Pickett. Man hasn't been back since the appearance of the PC. Now we just float about in LEO.
|
|
|
|
|
Randor wrote: but I somehow feel that something is wrong with the way we write software today. Programmers have gotten lazy. Once upon a time, efficiency and accuracy were more important than time-to-market. The proliferation of programming tools and frameworks is meant to make the programmer's life easier and faster, but the developer is actually the least important stakeholder: making an application behave easily, predictably, accurately, and efficiently for the END USER should be the primary goal!
|
|
|
|
|
Another trend I have noticed when comparing new versus established languages is the complaint that the established language 'is boring'.
|
|
|
|
|
For those interested in the history of Apollo and programming, look into the work of Margaret Hamilton, who received the Presidential Medal of Freedom for her work for NASA and for the US military (the SAGE early-warning system). A trailblazing woman, for sure.
Kevin
|
|
|
|
|
Randor wrote: The hardware has become so much better... but I somehow feel that something is wrong with the way we write software today.
No. It's still the same old game.
I still have my first computer, with a whopping 4K of memory, a 256-byte ROM, and a small, risky CPU. You would not believe what we got running on that limited hardware. But it comes at a price: you must cut every corner you can. Forget everything you ever heard about clean, maintainable code.
GOTOs (or branching instructions) are now your best friend. Avoid subroutines, with their overhead for passing parameters, saving registers, and returning results. Write highly specialized spaghetti code instead. It's unmaintainable by today's standards; only the limited scope of your code makes it a reasonable idea. And it will be compact, even performant, once you get it working.
Unfortunately, there is no compromise: make it compact and it will not be very maintainable, or the other way around. Memory vs. performance is a similar tradeoff, but there you have more room for compromise.
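A toy illustration of that tradeoff, in C-flavored C++ rather than any particular assembler (the checksum job is made up; the point is the shape of the code):

```cpp
#include <cstdint>
#include <cstdio>

// The clean way: a general subroutine. On a 4K machine, the call,
// the parameters, and the saved registers all cost bytes and cycles.
uint8_t checksum(const uint8_t* buf, int len) {
    uint8_t sum = 0;
    for (int i = 0; i < len; ++i) sum += buf[i];
    return sum;
}

// The tiny-machine way: straight-line code specialized to one fixed
// buffer and length, with a backward branch and the result in a global.
uint8_t g_buf[16] = {1, 2, 3, 4};
uint8_t g_sum;

void checksum_inline() {
    int i = 0;
    g_sum = 0;
loop:                       // GOTO as your best friend
    g_sum += g_buf[i];
    if (++i < 16) goto loop;
}

int main() {
    checksum_inline();
    std::printf("specialized: %d  general: %d\n", g_sum, checksum(g_buf, 16));
}
```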
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
Randor wrote: They were working with the 16-bit Apollo Guidance Computer[^], with no floating-point instructions, and that was all they needed to get to the moon.
Of course. Contrary to some beliefs, the moon isn't floating, so why would you need floating point? Besides, pure integer arithmetic has no roundoff error.
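The AGC actually worked in scaled fixed-point fractions. A loose sketch of the idea in C++ (a 16.16 format here, just the general technique, not the AGC's actual format):

```cpp
#include <cstdint>
#include <cstdio>

// 16.16 fixed point: value = raw / 65536. All arithmetic stays integer.
using fix = int32_t;
constexpr fix    from_double(double d) { return static_cast<fix>(d * 65536.0); }
constexpr double to_double(fix f)      { return f / 65536.0; }

fix mul(fix a, fix b) {
    // Widen to 64 bits so the product can't overflow, then rescale.
    return static_cast<fix>((static_cast<int64_t>(a) * b) >> 16);
}

int main() {
    fix half = from_double(0.5);
    fix x    = from_double(1.125);
    std::printf("0.5 * 1.125 = %f\n", to_double(mul(half, x)));  // 0.562500
}
```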
CQ de W5ALT
Walt Fair, Jr., P. E.
Comport Computing
Specializing in Technical Engineering Software
|
|
|
|
|
Throwback Thursday: Hey mister, got the time? | Computerworld reminded me vividly of when I first faced this issue, when it truly mattered, because I needed to resolve conflicting database updates that were recorded on two machines running in different time zones. Fast forward a decade, and I faced another version of the problem, when a time was recorded while DST was in effect and reported when it was not, and vice versa. Worse yet was when someone started a course at 2:15 AM CDT and finished at 1:45 AM CST. How long did he work on it?
Soon afterwards, I converted every time stamp in the database table to UTC, retaining the original local times for auditing, and changed the reports to use the UTC time stamps. Since then, I have always recorded UTC times in database records.
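That 2:15 CDT to 1:45 CST puzzle resolves cleanly once both times are expressed in UTC. A minimal sketch with plain std::chrono arithmetic, hard-coding the offsets (CDT = UTC-5, CST = UTC-6) rather than pulling in a time-zone database:

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using namespace std::chrono;

    // Wall-clock times since local midnight, shifted to UTC by hand:
    auto start_utc = hours(2) + minutes(15) + hours(5);  // 2:15 CDT -> 07:15 UTC
    auto end_utc   = hours(1) + minutes(45) + hours(6);  // 1:45 CST -> 07:45 UTC

    auto worked = duration_cast<minutes>(end_utc - start_utc);
    std::printf("he worked %lld minutes\n", (long long)worked.count());  // 30
}
```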
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
UTC - There can be only one!
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
Forogar wrote: UTC - There can be only one!
Indeed; moreover, it's unambiguous.
Forogar wrote: I would love to change the world, but they won’t give me the source code.
So would I. Even if you got it, it's written in the undocumented assembler for an undocumented hyper-hyperthreaded processor that's also undocumented, and it's devoid of comments. Oh, yeah, it's also massively multithreaded.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Quote: it's written in the undocumented assembler
The way it works, you would think it was written in VB6!
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
David A. Gray wrote: Forogar wrote: I would love to change the world, but they won’t give me the source code.
So would I. Even if you got it, it's written in the undocumented assembler for an undocumented hyper-hyperthreaded processor that's also undocumented, and it's devoid of comments. Oh, yeah, it's also massively multithreaded.
Ahem: Off to Be the Wizard by Scott Meyer[^]
"God doesn't play dice" - Albert Einstein
"God not only plays dice, He sometimes throws the dices where they cannot be seen" - Niels Bohr
|
|
|
|
|
That book looks interesting, and hinges on a concept that has been the subject of several of my thought experiments.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Now, add this to the mix.
http: [^]
I shudder to think of the havoc this will play with critical systems throughout the Sunshine State. Hopefully, if they do pass it, they will wait at least a year to implement it. (Will we need a new rule for DST-adjusted (or in this case non-DST-adjusted) display when state != 'FL'?)
"Go forth into the source" - Neal Morse
|
|
|
|
|
I think he's been trying to make that happen for about as long as he's been in Congress. IMO, it needs to go the other way, and Daylight Saving Time needs to join the ranks of extinct species, along with the dodo bird. From my perspective, the only thing Daylight Saving Time does is add unnecessary complexity to timekeeping.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
I think we should eliminate time zones altogether, and just go by UTC.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
|
|
|
|
What is Internet Time?[^]
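(For the link-shy: Swatch's Internet Time throws away zones and DST, and divides the day at UTC+1 ("Biel Mean Time") into 1000 ".beats" of 86.4 seconds each. A quick sketch of the conversion, assuming system_clock counts from the Unix epoch in UTC:)

```cpp
#include <chrono>
#include <cstdio>

int main() {
    using namespace std::chrono;
    // Shift UTC to UTC+1, then take the seconds elapsed since midnight.
    auto now = system_clock::now() + hours(1);
    long long secs = duration_cast<seconds>(now.time_since_epoch()).count() % 86400;
    std::printf("@%06.2f .beats\n", secs / 86.4);  // 1 .beat = 86.4 s
}
```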
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Quote: UTC - There can be only one!
I agree; people would quickly get used to whatever time they do things wherever they are.
An 8 o'clock breakfast in New York would be at 13:00 (we should also switch to universal 24-hour clocks), and so on. People would get used to it very quickly and wonder why the old, stupid system ever existed.
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
That would be OK with me. Certain groups, e.g., airline pilots and air traffic controllers, already do so for all practical purposes, as does the US military.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|