I'm talking about the cloud and the abstraction of hardware architecture.
I remember, back in the 2000s when I joined a company as a rookie, a team of IT guys, all big experts, sitting down and discussing the hardware configuration and networking needed for a new project.
As things evolved, the next companies I worked for gradually had no datacentres at all. Every project we developed was deployed on the cloud (yep, typical start-up). And in more recent days teams are talking about "serverless", which means you don't even get to read the configuration on paper.
Now I'm back at a mid-size company that has a datacentre, but still most of the projects are in the cloud and nobody gets a chance to discuss hardware.
What do you think? Is the advance of the cloud a positive evolution for developers, or is it clearly dumbing down our brains with respect to hardware architecture?
I am curious about your criteria for classifying a language as "higher than C (or even C++)".
I may be disagreeing with you, both on what makes a language "higher" and on the degree to which the language will "dumb down" the brain.
It could be "dumbing down" in a similar sense that car driving was "dumbed down" when the synchronized gear box appeared, and even more when automatic transmission became common. Shifting gears is not the problem of driving today (with the possible exception of on the racing track), it is not what distinguishes a good driver from a bad driver. It takes quite different qualities to become a good driver than the ability to hanlde an unsynchronized gear box.
An analogy: Programming languages went from all static allocation (old-time Fortran didn't even allow recursion; it didn't have a stack), to "pedal driven" (malloc/free) heap management, to automatic garbage collection. You could declare a Fortran array and hand-craft functions for allocating fragments of it as if it were a heap. In C/C++, you can leave the "how" to the compiler and run-time library, but you are yourself responsible for managing the allocated space, making sure it is properly disposed of, and disposed of only once. In newer languages, you don't have to be your own garbage man. Is leaving your garbage to a garbage collector an example of "dumbing down"?
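To make the "pedal driven" stage concrete, here is a minimal C sketch of that manual responsibility (the array name and size are arbitrary, purely for illustration):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* "Pedal driven" heap management: we ask for the memory ourselves. */
    double *samples = malloc(1000 * sizeof *samples);
    if (samples == NULL)
        return EXIT_FAILURE;   /* allocation can fail; the language won't save us */

    for (int i = 0; i < 1000; i++)
        samples[i] = i * 0.5;
    printf("last sample: %f\n", samples[999]);

    /* ...and we must give it back, exactly once. Forget this line and
       you leak; execute it twice and you get undefined behaviour (the
       classic double free). A garbage collector removes both failure
       modes, at the cost of doing the bookkeeping for you. */
    free(samples);
    return EXIT_SUCCESS;
}

In a garbage-collected language, the free() step, and both of its failure modes, simply disappear.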
Generally speaking, "dumbing down" is 98% how you use the language, regardless of the language itself. When I started programming, "structured languages" (such as Algol, Pascal, Simula, ...) were pushing Fortran to the side. Not every programmer got a grip on the high level flow constructs, and a common saying was that "You can do Fortran programming in any language". Until you learn to use higher abstraction mechanism properly, you won't benefit from them. Once you learn, they may help you make more robust programs, increase your productivity and create more readabe programs, which bear a much closer resemblance to the real world problem they attempt to solve.
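"Fortran programming in any language" is easy to illustrate. A small, contrived C sketch, the same loop written twice:

#include <stdio.h>

int main(void)
{
    /* "Fortran programming in any language": control flow built from
       labels and jumps, as in pre-structured Fortran. */
    int i = 0;
loop:
    if (i >= 5)
        goto done;
    printf("%d\n", i);
    i++;
    goto loop;

done:
    /* The same logic with a structured construct: the intent is visible
       at a glance, and there is no label to mis-jump to. */
    for (int j = 0; j < 5; j++)
        printf("%d\n", j);

    return 0;
}

Both halves print 0 through 4; only the second one reads like the problem it solves.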
The brain can only contain so much... If you need to fill it up with information on how to run servers, something else will have to go.
Software always develops to the complexity level the people working on it can handle (well, a bit more) somewhat efficiently. Unskilled people reach that level fast, as they make complex solutions to simple problems - but no matter the experience level, you will reach the limit.
So "Shut up and run my code" (a.k.a. serverless, a name making no sense, as it is running on.... wait for it.... SERVERS) is ideal in my book. Why should I care about load balancers etc. If I need 10 instances, of course there should be a load balancer in front of it, that does not require a meeting to find out. If I define job x needs to talk with job y, of course there need to be a network allowing this (and only this). For once Microsoft appears to have understood that with the original deprecated Azure roles, but no-one else understood and all the idiots headed for virtual machines in the clould (WTF).
Evolution always moves in the direction of ordered systems, so it's perfectly natural that systems with more complex but transparent background processing will take the place of less complex systems.
"The fittest", in the computer world, is measured by the number of users (which is essentially the same as in real-world evolution -- "survivor" apps and concepts prosper and proliferate), and are inevitably more complex underwater than their forebears, to allow greater range and/or simplicity for users, who will ratify their evolution by preferring and using the newer products.
"Dumbing down" doesn't enter in to it. A more advanced product/system must be easier to use, or it will not prosper and proliferate (yes, this means you, Linux desktop), just as a dim(ension) saw is easier to use than a manual ripsaw -- the carpenters who use dim saws are by no means dumber, they just have the opportunity to get more done in the same amount of time.
The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less (yes, this means you, windows).
I wanna be a eunuchs developer! Pass me a bread knife!
The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less
So ... The Cloud, then ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!