I'm talking about the cloud and the abstraction of hardware architecture.
I remember, back in the 2000s when I joined a company as a rookie, a team of IT guys, all big experts, sitting down and discussing the hardware configuration and networking needed for a new project.
As things evolved, gradually, in the next companies I worked for, there were no datacentres. All the projects developed were deployed on the cloud (yep, typical start-up). And more recently, teams are talking about "serverless" - which means you don't even get to read the configurations on paper.
Now I'm back at a mid-size company that has a datacentre, but still most of the projects are in the cloud and nobody gets a chance to discuss hardware.
What do you think? Is cloud advancement a positive evolution for developers, or is it clearly dumbing down our brains with respect to hardware architecture?
I am curious about your criteria for classifying a language as "higher than C (or even C++)".
I may be disagreeing with you, both about what makes a language "higher" and about the degree to which the language will "dumb down" the brain.
It could be "dumbing down" in a similar sense that car driving was "dumbed down" when the synchronized gear box appeared, and even more when automatic transmission became common. Shifting gears is not the problem of driving today (with the possible exception of the racing track); it is not what distinguishes a good driver from a bad driver. It takes quite different qualities to become a good driver than the ability to handle an unsynchronized gear box.
An analogy: Programming languages went from all static allocation (old-time Fortran didn't even allow recursion; it didn't have a stack), to "pedal driven" (malloc/free) heap management, to automatic garbage collection. You could declare a Fortran array and hand-craft functions for allocating fragments of it as if it were a heap. In C/C++, you can leave the "how" to the compiler and run-time library, but you are yourself responsible for managing the allocated space to make sure it is properly disposed, and disposed only once. In newer languages, you don't have to be your own garbage man. Is leaving your garbage to a garbage collector an example of "dumbing down"?
Generally speaking, "dumbing down" is 98% how you use the language, regardless of the language itself. When I started programming, "structured languages" (such as Algol, Pascal, Simula, ...) were pushing Fortran to the side. Not every programmer got a grip on the high-level flow constructs, and a common saying was that "You can do Fortran programming in any language". Until you learn to use higher abstraction mechanisms properly, you won't benefit from them. Once you learn, they may help you make more robust programs, increase your productivity and create more readable programs, which bear a much closer resemblance to the real-world problem they attempt to solve.
The brain can only contain so much... If you need to fill it up with information on how to run servers, something else will have to go.
Software always develops to the complexity level the people working on it can handle (well, a bit more), somewhat efficiently. Unskilled people reach that level fast, as they make complex solutions to simple problems - but no matter your experience level, you will reach the limit.
So "Shut up and run my code" (a.k.a. serverless, a name making no sense, as it is running on.... wait for it.... SERVERS) is ideal in my book. Why should I care about load balancers etc.? If I need 10 instances, of course there should be a load balancer in front of them; that does not require a meeting to find out. If I define that job x needs to talk with job y, of course there needs to be a network allowing this (and only this). For once Microsoft appears to have understood that with the original, now deprecated Azure roles, but no-one else understood, and all the idiots headed for virtual machines in the cloud (WTF).
Evolution always moves in the direction of ordered systems, so it's perfectly natural that systems with a lot of more complex but transparent background processing will take the place of less complex systems.
"The fittest", in the computer world, is measured by the number of users (which is essentially the same as in real-world evolution -- "survivor" apps and concepts prosper and proliferate). They are inevitably more complex under the hood than their forebears, to allow greater range and/or simplicity for users, who ratify their evolution by preferring and using the newer products.
"Dumbing down" doesn't enter into it. A more advanced product/system must be easier to use, or it will not prosper and proliferate (yes, this means you, Linux desktop), just as a dim(ension) saw is easier to use than a manual ripsaw -- the carpenters who use dim saws are by no means dumber; they just have the opportunity to get more done in the same amount of time.
The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less (yes, this means you, windows).
I wanna be a eunuchs developer! Pass me a bread knife!
The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less
So ... The Cloud, then ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
I love the cloud!
I'm not "dumb", but server maintenance never was my thing.
I can create awesome apps, but I can't run them on a server because A) I don't have a server and B) even if I had one I wouldn't know how to configure it.
This is of course not so much an issue with companies, but it is for me.
For me, it means I can now give my customers software without having to hire a third party who does all the server stuff.
Perhaps back in the day, people could do both, but I'm having trouble enough keeping up with .NET Framework, .NET Standard, .NET Core 2.1, 2.2 (which wasn't an LTS release, which I learned the hard way), 3.0 (which also is not LTS), 3.1 (bingo, that's the one!) and now .NET 5 (will it, or will it not, be LTS?).
And then I have to know about certain Azure services, whether I have on-premises hardware or not, because not all Azure services can be run on-premises and you can have both anyway.
Then there are all these new (versions of) frameworks and libraries that I have to keep up with because people are asking for them.
Let's not forget all that DevOps stuff (Jenkins, Azure DevOps...).
There's just too much software-related stuff to ALSO keep up with server configuration.
I think the cloud offers real value for companies, software developers and system administrators alike.
Also, don't see it as a "dumbing down", but as "another thing to learn".
They aren't mutually exclusive and despite looking simple, cloud is anything but simple.
Nothing but respect for those who are experts at system administration (AND cloud) or software development (AND cloud) or are even just experts at cloud.
To give you a high level overview of a small Azure project I did for a customer:
- Create an Azure web app to run my .NET Core 2.2 (argh!) application.
- Create an Azure SQL database (with code first in the app) for my data.
- Set up a storage account and install file sync on the on-premises server.
- Create a couple of Azure Functions to async process the files.
- Use Azure Service Bus for async queueing.
- Manage Azure AD for access with on-premises sync.
- Deploy as much as possible using Azure DevOps.
- Keep secrets in Azure Key Vault.
That's quite a lot to know and manage; I wouldn't call it "dumbing down".
Doing all this on-premises is possible (except for the serverless and Key Vault parts, and you'd have to swap Azure DevOps for TFS) and I agree that would be way more difficult.
The question is, do we really want it to be way more difficult?
Maybe "subscribing down"? In the old days, hardware servers became obsolete in 2 or 3 years, so a new one was needed to support new applications, devices and workloads. Software had to be upgraded/rewritten to accommodate the new hardware. Nowadays, not so much. So, since I can't sell you a new version of my software every year or so, I will sign you up to "rent" it (subscribe). Hardware folks looked at that and the cloud was born? New businesses looked at it as a way to avoid IT departments and server rooms. Older clients with the server rooms pooh-poohed it until it was time to replace expensive hardware. Like any other advancement, some old jobs go away and some new ones show up. Relax, be assimilated.
You do the same type of dealing, except it's with virtual machines. You specify capacity requirements, and a VM is generated to your spec. And as in real life, it's either over- or under-specified, and you pay accordingly. It just all happens faster.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food