In researching my options for a new development endeavor based on my current desktop document management system for individuals, I once again turned to the ASP.NET MVC paradigm as a candidate.
Having worked on a very large ASP.NET MVC application a number of years ago, I was already quite familiar with the general foundations of the MVC paradigm within the ASP.NET development environment. Thinking that this environment was the way to go for a new web application, I began to re-familiarize myself with the technologies involved, since I have been working primarily in WPF for the past several years.
After spending several weeks relearning these technologies, I came to the same conclusion I did all those years ago: the ASP.NET MVC paradigm is nothing more than an overly complex, inefficient way to build database-intensive applications. The moment one needs to include complex interface components, ASP.NET MVC provides little efficiency in doing so.
Currently, the ASP.NET MVC\Core tools environment has exploded from a limited set of useful adjunct tools (e.g., jQuery) to such a plethora of them that someone new to this type of development would have a very difficult time understanding where to start.
As one who has done an extensive amount of ASP.NET WebForms development, I often wonder what drove our profession to turn to such a complex architecture as ASP.NET MVC and now ASP.NET Core (which is just another name for MVC).
It seems to me that my colleagues, young and old, were complacent enough to allow our primary development tools vendor, Microsoft, to set the conditions and terms of our development futures by simply accepting its technology roadmaps. So how's that been working out for everyone? In a recent announcement, Microsoft stated that neither WebForms nor Windows Communication Foundation will be supported in the newer releases of the .NET Core frameworks. Refining the .NET Framework is certainly understandable, but eliminating core technologies on which thousands of applications rely is certainly draconian to the companies that require such ongoing support. And what are they to do? Invest hundreds of thousands of dollars to rewrite applications that work well enough to be maintained in their original form, simply because Microsoft no longer wants to support them? Supporting those technologies is Microsoft's job!
When Bill Gates was the CEO of Microsoft, we had a leader who was more or less committed to the development community's needs and requirements.
To be sure, he wasn't perfect, but for the most part he fell in line with the overall needs of our profession.
In 2009, with Steve Ballmer at the helm of Microsoft, ASP.NET MVC was introduced, to the surprise of many of us ASP.NET WebForms developers who had experience with the Classic ASP development environment. Practically to a person, our reaction to this new paradigm was that it represented a giant step backwards. Little did we know at the time just how backwards MVC really was, considering that its foundations were designed in the 1970s.
For quite some time, the Castle Project had freely offered its own open-source ASP.NET MVC framework, MonoRail, but it never found much interest in the Microsoft development community. ASP.NET WebForms, even with all its inefficiencies and issues, was still an environment in which complex web applications could be easily developed and tested.
But the naysayers of this environment quickly got their voices heard, and for some reason many professionals began to listen. Their claim was that ASP.NET MVC offered developers a brand-new environment with much greater control over their interface development processes. But in terms of efficiency and actually accomplishing development, such control was in reality a mirage.
We gave up easy n-tiered application development for a range of technologies that in reality didn't offer anything we didn't already have. Was it the fault of ASP.NET WebForms that many developers chose to implement massive amounts of code in the web page code-behind modules when most of it should have been placed in tiered libraries?
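The tiered alternative alluded to here is easy enough to sketch. In this hypothetical C# fragment (the class names, the 10% bulk-discount rule, and the method signatures are all illustrative placeholders, not from any real application), the WebForms code-behind does nothing but delegate to a business-tier class, which is where the logic actually lives:

```csharp
using System;
using System.Globalization;

// Business tier: a plain class library, independent of any web page,
// so it can be reused and maintained without touching the UI.
// (OrderService and its discount rule are illustrative placeholders.)
public class OrderService
{
    public decimal CalculateTotal(decimal unitPrice, int quantity)
    {
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException(nameof(quantity));
        decimal total = unitPrice * quantity;
        // Hypothetical business rule: 10% discount on bulk orders.
        if (quantity >= 100)
            total *= 0.90m;
        return total;
    }
}

// Presentation tier: the code-behind only translates between the page
// and the business tier; it contains no business logic of its own.
public class OrderPage // stands in for a Page-derived code-behind class
{
    private readonly OrderService _orders = new OrderService();

    public string SubmitClicked(decimal unitPrice, int quantity)
    {
        return _orders.CalculateTotal(unitPrice, quantity)
                      .ToString("F2", CultureInfo.InvariantCulture);
    }
}
```

Nothing about WebForms ever prevented this separation; the discipline of keeping the code-behind thin was always available to the developer.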
Certainly, with the complexities of ASP.NET MVC and now Core, poor application development is just as much a danger to software quality as it was in the original environment. But wasn't that supposed to be the developer's choice?
And what about this nonsense that everyone touts as "a separation of concerns"? What, developers cannot understand how to achieve this without a new paradigm to help them along!?
Of course, we have the claim that ASP.NET MVC allows for Test-Driven Development. So what? What type of tests is anyone going to write for a database-intensive application where the majority of the work is the correct updating or retrieval of data? Can developers no longer understand their own results? Has anyone considered that try-catch blocks are the best way to capture erroneous processes? And how else would one capture them when things do go wrong?
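To make that point concrete, here is a minimal C# sketch (the repository class, its in-memory "table", and all names are illustrative stand-ins, not real data-access code): the try-catch wraps the whole unit of work, rolls the data back on failure, and captures the error for diagnosis, which is precisely the protection a database-intensive application needs.

```csharp
using System;
using System.Collections.Generic;

// Illustrative repository: the Dictionary stands in for a database table,
// and the snapshot/restore stands in for a transaction's begin/rollback.
public static class CustomerRepository
{
    private static readonly Dictionary<int, string> Customers =
        new Dictionary<int, string> { { 1, "Acme Corp" } };

    public static bool TryUpdateName(int id, string newName, out string error)
    {
        var snapshot = new Dictionary<int, string>(Customers); // "begin transaction"
        try
        {
            if (!Customers.ContainsKey(id))
                throw new KeyNotFoundException($"No customer with id {id}.");
            Customers[id] = newName; // the update itself
            error = null;
            return true;             // "commit"
        }
        catch (Exception ex)
        {
            // "rollback": restore the pre-update state, then capture
            // the error so it can be logged and diagnosed.
            Customers.Clear();
            foreach (var kv in snapshot) Customers[kv.Key] = kv.Value;
            error = ex.Message;
            return false;
        }
    }
}
```

With real ADO.NET the same shape applies: begin a transaction, attempt the update in the try block, and commit; roll back and record the exception in the catch block.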
In the 1990s, a former chemical engineer turned software engineer wrote an excellent manual on developing top-performing client-server and web applications. He demonstrated the correct way to create such applications, all of it based on efficient n-tiered architectures, with all of his work benchmarked to demonstrate its credibility. To this day, I have not come across a single organization that actually implemented any of his constructs.
Let's take a single example from this man's work to demonstrate this. To increase application performance by at least 50%, one should divide one's database systems to handle OLTP transactions for updating and OLAP for data retrieval. This way, two sets of database servers (clustered or not) divide the work so that each type of data access is processed in the way most efficient for it.
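One minimal way to realize such a split in an ASP.NET application is simply to define separate connection strings for the transactional and reporting databases (the server and database names below are placeholders, not real infrastructure):

```xml
<!-- Web.config fragment: route updates to the OLTP servers and
     reads/reports to the OLAP servers. All names are illustrative. -->
<connectionStrings>
  <add name="OltpDb"
       connectionString="Server=OLTP-CLUSTER;Database=Orders;Integrated Security=true"
       providerName="System.Data.SqlClient" />
  <add name="OlapDb"
       connectionString="Server=OLAP-CLUSTER;Database=OrdersReporting;Integrated Security=true"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The data tier then opens `OltpDb` for inserts and updates and `OlapDb` for queries, while a replication or ETL job keeps the reporting copy current; no application logic beyond choosing the connection is required.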
The basic, unstated reason for ignoring such a design was that it meant too much work (and cost) for organizations to consider.
So the propagandists in our profession decided that manipulating the code would be a better way to achieve such performance. Well, guess what? Hardware engineers and specialists in the 1990s demonstrated that this was, and still is, complete nonsense. Developing high-performance web applications relies completely on quality hardware that can perform well. No amount of code manipulation will make a web application work better if the underlying hardware is not there to support it. Of course, with bad code one can certainly make a web application perform poorly, but good code can only improve speed so much. Remember that the underlying architecture and protocols of the Internet are still the originals designed in the 1970s.
In 2005 or a little later, another book was produced by a Silicon Valley software engineer, entitled High Performance ASP.NET Applications. It corroborated the earlier hardware engineers' findings by demonstrating that not only was proper hardware necessary for good performance, but that its configuration (both hardware and software) was vital as well, and the book showed the reader how to do both.
This book hardly made a ripple in our profession, since most such implementations were handled by teams that had little to do with actual software development.
Over the years we have gone from one ridiculous development paradigm to another, none of which really abided by classic, well-researched software engineering standards. Instead, the focus became the tools we used rather than how we actually developed our applications.
Just a few years ago, another software engineer took a medium-sized ASP.NET WebForms application and converted it to the ASP.NET MVC environment to understand the performance differences, since very little such research has been done in our profession. (Unfortunately, I have since lost the link to this engineer's findings.) What he found was that there really wasn't a significant difference between a decently written WebForms application and an MVC one.
I doubt that, with all the latest tools from Microsoft and other third-party vendors, this difference has changed greatly. Again, it comes back to the hardware, where recent years have seen great strides in the internal workings of the various chip-sets and disk drives.
So why ASP.NET MVC\Core???
Younger generations of developers really do not have the experience to make such comparisons, while the older developers, of whom few of us are left, should have known better.
It seems to me that the real reason for the growth of the MVC paradigm had nothing to do with any significant increase in developer productivity or application performance. How could it, with the massive complexities that have become part and parcel of such development? As a result, this smacks of a foundation for job security more than anything else. The more complex a skill is, the fewer the people who can master it. And that means fewer quality developers for the hiring.
I mean, who in their right mind would use ASP.NET MVC\Core and then tout its benefits? And to whom, by the way? Certainly not to the corporations that allow its implementation, since no one has espoused a single reason that provides better development efficiency a corporation could take advantage of. Does anyone really believe that a few saved milliseconds actually enhance a corporation's operational efficiency?
In my long career, I have developed just about every type of application there is within business development environments, with a few exceptions such as trading applications and the like. Nonetheless, using our original tools at the time, my colleagues and I were able to produce high-quality software that rarely, if ever, produced issues for our users.
And yet today, with all the wonders of our technological advances, many web sites do not operate properly, and many are increasingly being breached, with huge amounts of critical data being appropriated. Any security specialist could tell you that part of this rampant criminality stems from the construct of the Cloud, but part also stems simply from the inherent complexity of what we use to develop, which provides fertile ground for defects and vulnerabilities in our applications.
ASP.NET WebForms may be dead and, of course, now shunned. However, had the professional development community insisted that Microsoft rectify its flaws, the company would have done so; indeed, it was doing so until the shunning became too much of a burden for Microsoft to credibly maintain its support.
I believe that my profession has bet on the wrong horse. And with the stresses and complexities of web development today, it appears that this bad bet is causing us a lot of problems. From dissatisfaction in our work to contending with increasing numbers of defects, it may be time for my colleagues to reconsider their decision to support such a cantankerous development environment. Or maybe too many of you are simply too heavily invested to do so.
Sometimes, returning to the old ways is the best course of action for the future. And with the technological quagmires we have in our lives today, maybe overall simplicity should be the overriding concern for deciding upon a development technology.
Whatever the critics of this piece may say, they simply cannot deny the development efficiency and the greater compartmentalization of development that ASP.NET WebForms' overall simplicity provided.
Who are you going to listen to: an engineer promoting a paradigm and an ideology, or one promoting a tool that just got the job done, and done well...
Sr. Software Engineer
Black Falcon Software, Inc.