|
Thanks, André; you said what I was thinking.
Software Zen: delete this;
|
|
|
|
|
André Pereira wrote: assembly...completely unreadable and unmaintainable code. Only to those who are unfamiliar with the language; in that respect it's not unlike any other language. Well-written assembly and good comments are important, as is the hardware-specific knowledge of the person looking at the code.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
Don't get me wrong, I think assembler is amazing in one way: if it doesn't work IT'S YOUR FAULT. I loved making simple games in assembler.
But even the embedded guys stay away from such a beast, reserving it only for high-throughput applications like bit banging. Otherwise it's not worth the effort, especially since most modern compilers do 99% of the job well. Long gone are the days of multi-page functions just to calculate an output. Bless you, assembler.
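(For readers who haven't met the term: bit banging means driving a serial protocol purely in software by toggling an output pin at the right moments. A minimal sketch of the idea; the `Pin` type here is hypothetical, and real code would write a memory-mapped GPIO register, with the timing being the hard part.)

```cpp
#include <cstdint>
#include <vector>

// Hypothetical "pin": real code would poke a memory-mapped GPIO register.
// Here we just record each level driven, so the idea runs on a PC.
struct Pin {
    std::vector<int> trace;
    void write(int level) { trace.push_back(level); }
};

// Bit-bang one byte, MSB first, like a bare-bones software shift-out.
// On a micro, a delay between writes would set the bit rate.
void shift_out(Pin& pin, std::uint8_t byte) {
    for (int i = 7; i >= 0; --i) {
        pin.write((byte >> i) & 1);
        // delay_cycles(BIT_TIME);  // the timing-critical part on real hardware
    }
}
```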
|
|
|
|
|
André Pereira wrote: if it doesn't work IT'S YOUR FAULT Truth!
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
Quote: This is where you piss me off. You claim the best managed languages out there with literally millions of apps (all Android and UWP apps are in Java and C#, respectively), aren't needed.
I just considered Java/C#/VB and others alongside C++ and thought they could all eventually be replaced with C++. And why not? It's not like you can't have C++ libraries. From what I understand, the reason Python has become so powerful and is being adopted so actively nowadays is that it uses native libraries written in C++.
Quote: 4,5,6,7. Then you spout a bunch of random web jargon words and claim these are way more important than kernel or app languages.
I didn't actually, but I can see why you thought that. I took into account the user base of different languages and presented a list. I also took into account the general likes and dislikes toward these languages; that's the reason I said Java/C# may or may not be excluded. I am a Java developer, and the last thing I want is to kill Java, because otherwise I'd need to learn either C# or C++, since other languages drive me nuts. While making the list I realized that there are just too many languages out there that I haven't personally used, so I kept my list to the most used/popular ones.
Also, I decided to include different sets of languages for front-end and back-end development, since most of the devs I know are happier and prefer it this way. My humblest apologies if you thought I was advocating against native code. I am in fact the exact opposite.
So in the end I came up with a simple list that describes the different levels at which these languages may be needed. That's also the reason I left out number 8.
Quote: 8. Only Tiobe and managers consider SQL a programming language.
I have no idea what Tiobe means, but I get your gist. In my defense, I only reluctantly added it, knowing full well it is not a language; it still requires a different set of developers, though.
P.S. SQL point was actually 7th.
|
|
|
|
|
First of all, thanks for being a good sport, and let me reiterate: Quote: I'm not trying to be disrespectful
Quote: I just considered Java/C#/VB and others alongside C++ and thought they could all eventually be replaced with C++. And why not? It's not like you can't have C++ libraries. From what I understand, the reason Python has become so powerful and is being adopted so actively nowadays is that it uses native libraries written in C++.
Nope, C++ itself will be supplanted eventually, even in embedded, where the main contender now is Rust. Java and C# in particular are just about perfect for app making, falling back to C++ (messier code) modules for performance-critical sections, just like you said. As for Python, I don't think the huge performance loss compensates for the (arguably) cleaner code, or for the development overhead of finding/making native C++ libraries.
Quote: While making the list I realized that there are just too many languages out there that I haven't personally used, so I kept my list to the most used/popular ones.
I get your message, and the fact is, yes, sometimes we do seem to have too many tools. Let me get you up to speed on where we are in terms of app development. Tool selection is narrow: you either prefer performance or code readability (think Java/Android vs C++/NDK, or C# vs C++ on .NET). Backwards compatibility, legacy code, dependency management: all of that is now native to every platform.
So there's no need to continuously reinvent the wheel, as happened in native development for 20 years (you had a menu of frameworks to choose from, all very bad in some way), and as is still happening in the web world.
Besides, I used "web jargon" in jest, because that's what it looks like to outsiders :P
Quote: Also I decided to include different sets of languages for front-end and back-end development, since most of the devs I know are happier and prefer it this way. My humblest apologies if you thought I am a non-native advocate. I am in fact the exact opposite.
No need to apologize, I was the one ranting. But for me, just the fact that you even have to think about sets of languages for front-end and back-end makes me uncomfortable.
|
|
|
|
|
Quote: I'm not trying to be disrespectful
Message received the first time; no need, but thanks.
Quote: C++ itself will be supplanted eventually, even in embedded, where the main target now is Rust.
Call me an Old Soul, traditional or nostalgic, but somehow I always find that older things, built by much older men and women, work better, last longer, and yet are so simple. I haven't looked at Rust since its early days, but I have heard good things about it from good folks. Still, I sincerely doubt anything can take over C++ in its domain.
Quote: Let me get you up to speed where we are in terms of app development. Tool selection is narrow, you either prefer performance or code readability (think Java/Android vs C++/NDK, or C# vs C++ on .Net). Backwards compatibility, legacy, dependency management, all of that is now native for every platform.
I understand your point; to clarify, I am an Android/Java developer, though probably not nearly as experienced as you are. Anyhow, sometimes I wonder about the possibility of an application platform with C++ as the core language and an XML/XAML-like language for design. It shouldn't take much more effort than Android/Java/XML did to make that happen. Unfortunately, I am too young and inexperienced (at least for now) to make something like that myself. I'm mainly running away from Java because of its memory management; I just want to be in control of keeping and releasing my own memory. Otherwise, Java is a beautiful language. If I ever did manage to bring C++ into mobile/desktop development, I'd probably take away multiple inheritance; it's a curse and a blessing.
|
|
|
|
|
Quote: Call me an Old Soul
My fellow old soul, I agree with you in principle, but in reality I'm looking more and more favorably at Rust for embedded. The D language was a late disappointment, and I don't think it'll ever take off. The Rust people are now VERY ACTIVE in getting Rust to work on microcontrollers, with performance in the same range as C++ (i.e., near perfect for almost everything) plus all the safety checks and enforcement of Rust. Check out some of the stuff they're doing with dependencies and with abstracting the micro's hardware; it's amazing.
Quote: Any how, sometimes I wonder about the possibility of an application platform with C++ as the core language and an XML/XAML-like language for design.
Open up Visual Studio, new project, C++ UWP with .NET Core.
Welcome to UWP heaven
|
|
|
|
|
Also, most of my friends and colleagues insist that with anything less than 32 GB of RAM, your computer will be slow as molasses. I open the Resource Monitor and point at the bar showing that, even with lots of software running or waiting, only 5 of 32 GB is in actual use, and most likely 80% of even the "in use" part hasn't been addressed for at least half an hour.
And I point to the CPU display: half of the cores are not active at all, and the rest are 1-5% loaded, except in very short bursts. When I generate digital video, or convert from one format to another, I manage to load my 6 cores (with HT, appearing as 12) to 95-100% for minutes at a time, but that is the only group of tasks where I would not be well served by a CPU with 1/10 the processing capacity.
|
|
|
|
|
And I thought Android Studio was a beast; it isn't practical to run with less than 8 GB of RAM and a good SSD.
|
|
|
|
|
I developed an app once that had to discover and communicate with potentially thousands of computers concurrently. I used the .NET Task Library for the parallel processing of the TCP connections to those computers.
With networks of 5,000 computers, the concurrent communication ate up a lot of processor time. Aside from that, my observations are similar to yours: a lot of RAM and cores go unused, except, in my case, when developing and testing cross-platform apps. Emulators can eat up a fair amount of RAM and processor time.
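For anyone curious what that fan-out pattern looks like, here's a rough sketch, in C++ with std::async rather than the .NET Task Library I actually used; `probe_host` is a placeholder for the real TCP connect and handshake:

```cpp
#include <future>
#include <string>
#include <vector>

// Placeholder for the real work: opening a TCP connection to the host
// and handshaking. Returns true if the host "answered".
bool probe_host(const std::string& host) {
    return !host.empty();  // stand-in; real code would connect() here
}

// Launch one task per host and collect results as they complete.
// With thousands of hosts, a real version would cap concurrency,
// since std::async may spawn a thread per call on some implementations.
std::vector<bool> probe_all(const std::vector<std::string>& hosts) {
    std::vector<std::future<bool>> futures;
    futures.reserve(hosts.size());
    for (const auto& h : hosts)
        futures.push_back(std::async(std::launch::async, probe_host, h));

    std::vector<bool> results;
    results.reserve(hosts.size());
    for (auto& f : futures)
        results.push_back(f.get());
    return results;
}
```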
|
|
|
|
|
I give someone 1 hour to do a 10-minute task, and they will take 1 hour.
This says something about how we fill our time, and the same goes for space. And no, it does not apply to all people, but it does to a large number.
With computer resources, I think we find ways to fill them. More RAM and CPU? Ooh, I can get a 4K monitor now. Now make that two 4K monitors; the computer can handle it. The OS loads a bunch of stuff. Lots of RAM? I'll leave 2 GB of recently used files cached, where maybe only 400 MB is actually used every 10 minutes and the rest was touched just once in the last 3 hours.
I find the same amazement at how much game devs got out of the SNES and other '90s game consoles. See Game Hut on YouTube.
|
|
|
|
|
maze3 wrote: I give someone 1 hour to do a 10 minute task, they will take 1 hour.
If you're lucky.
More likely, they'll spend that hour procrastinating, thinking "I've got plenty of time", and then ask for an extension on the deadline.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
We essentially went to the moon with slide rules; the astronauts carried a 6-inch aluminum Pickett. Man hasn't been back since the appearance of the PC. Now we just float about in LEO.
|
|
|
|
|
Randor wrote: but I somehow feel that something is wrong with the way we write software today. Programmers have gotten lazy. Once upon a time, efficiency and accuracy were more important than time-to-market. The proliferation of programming tools and frameworks is meant to make the programmer's life easier and faster, but the developer is actually the least important stakeholder: making an application behave easily, predictably, accurately, and efficiently for the END USER should be the primary goal!
|
|
|
|
|
Another trend I have noticed when comparing new versus established languages is the complaint that the established language 'is boring'.
|
|
|
|
|
For those interested in the history of Apollo and programming, look into the work of Margaret Hamilton, who received the Presidential Medal of Freedom for her work for NASA and for the US military (the SAGE early-warning system). A trailblazing woman, for sure.
Kevin
|
|
|
|
|
Randor wrote: The hardware has become so much better... but I somehow feel that something is wrong with the way we write software today. No. It's still the same old game.
I still have my first computer, with a whopping 4K of memory, a 256-byte ROM, and a small RISC-y CPU. You would not believe what we got running on that limited hardware. But it comes at a price: you must cut every corner you can. Forget everything you ever heard about clean, maintainable code.
GOTOs (or branch instructions) are now your best friends. Avoid subroutines, with their overhead for passing parameters, saving registers, and returning results; write highly specialized spaghetti code instead. By today's standards it's unmaintainable, and only the limited scope of your code makes this a reasonable idea. But it will be compact, even performant. Once you get it working.
Unfortunately there is no compromise: make it compact and it will not be very maintainable, or the other way around. Memory vs. performance is a similar tradeoff, but there you have more room for compromise.
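A toy illustration of the style (C++ here, with gotos standing in for branch instructions; both functions compute the same thing):

```cpp
// The "clean" version: a subroutine with a parameter and a return value.
int sum_to_clean(int n) {
    int total = 0;
    for (int i = 1; i <= n; ++i) total += i;
    return total;
}

// The old-school version: state kept in "registers" (here plain variables),
// flow driven entirely by labels and branches, no structured loop at all.
// Compact on a tiny machine, painful to maintain as it grows.
int sum_to_spaghetti(int n) {
    int total = 0, i = 1;
loop:
    if (i > n) goto done;
    total += i;
    ++i;
    goto loop;
done:
    return total;
}
```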
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
Randor wrote: They were working with the 16 bit Apollo Guidance Computer[^] with no floating point instructions and that was all they needed to get to the moon.
Of course. Contrary to some beliefs, the moon isn't floating, so why would you need floating point? Besides, pure integer arithmetic has no roundoff error.
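A quick illustration of the roundoff point: binary floating point can't represent 0.1 exactly, while scaled integers (fixed point, which is effectively what the AGC used) stay exact as long as results fit in range:

```cpp
#include <cstdint>

// Fixed point: represent values as integer multiples of 1/1000.
// One tenth becomes 100 thousandths -- exact, no rounding.
std::int64_t tenths_to_milli(int tenths) { return tenths * 100LL; }

// In double, 0.1 + 0.2 != 0.3, because each literal is a rounded
// binary fraction and the errors don't cancel.
bool float_is_exact() { return (0.1 + 0.2) == 0.3; }

// In scaled integers, 100 + 200 == 300, always.
bool fixed_is_exact() {
    return tenths_to_milli(1) + tenths_to_milli(2) == tenths_to_milli(3);
}
```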
CQ de W5ALT
Walt Fair, Jr., P. E.
Comport Computing
Specializing in Technical Engineering Software
|
|
|
|
|
Throwback Thursday: Hey mister, got the time? | Computerworld reminded me vividly of when I first faced this issue, when it truly mattered, because I needed to resolve conflicting database updates that were recorded on two machines running in different time zones. Fast forward a decade, and I faced another version of the problem, when times were recorded while DST was in effect and reported when it was not, and vice versa. Worse yet was when someone started a course at 2:15 AM CDT and finished at 1:45 AM CST. How long did he work on it?
Soon afterwards, I converted every timestamp in the database table to UTC, retaining the original local times for auditing, and changed the reports to use the UTC timestamps. Since then, I have always recorded UTC times in database records.
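The 2:15 CDT to 1:45 CST puzzle dissolves once both stamps are in UTC: CDT is UTC-5 and CST is UTC-6, so the answer is 30 minutes. A minimal sketch with plain minute arithmetic (no time-zone library needed to show the idea):

```cpp
// Convert a local wall-clock time to minutes past UTC midnight.
// offset_hours is the zone's offset from UTC (CDT = -5, CST = -6).
int to_utc_minutes(int hour, int minute, int offset_hours) {
    return (hour - offset_hours) * 60 + minute;
}

// Started at 2:15 AM CDT, finished at 1:45 AM CST: elapsed minutes.
int worked_minutes() {
    int start = to_utc_minutes(2, 15, -5);  // 07:15 UTC
    int end   = to_utc_minutes(1, 45, -6);  // 07:45 UTC
    return end - start;                     // 30 minutes
}
```

Comparing the local times naively would say he finished half an hour before he started; in UTC the ordering and the duration are unambiguous.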
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
UTC - There can be only one!
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
Forogar wrote: UTC - There can be only one!
Indeed; moreover, it's unambiguous.
Forogar wrote: I would love to change the world, but they won’t give me the source code.
So would I. Even if you got it, it's written in the undocumented assembler for an undocumented hyper-hyperthreaded processor that's also undocumented, and it's devoid of comments. Oh, yeah, it's also massively multithreaded.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Quote: it's written in the undocumented assembler The way it works, you would think it was written in VB6!
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
David A. Gray wrote: Forogar wrote: I would love to change the world, but they won’t give me the source code.
So would I. Even if you got it, it's written in the undocumented assembler for an undocumented hyper-hyperthreaded processor that's also undocumented, and it's devoid of comments. Oh, yeah, it's also massively multithreaded.
Ahem: Off to Be the Wizard by Scott Meyer[^]
"God doesn't play dice" - Albert Einstein
"God not only plays dice, He sometimes throws the dices where they cannot be seen" - Niels Bohr
|
|
|
|
|
That book looks interesting, and hinges on a concept that has been the subject of several of my thought experiments.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|