The Lounge is rated Safe For Work. If you're about to post something inappropriate for a shared office environment, then don't post it. No ads, no abuse, and no programming questions. Trolling (political, climate, religious, or whatever) will result in your account being removed.
Web development just sucks. I can't think of even one reason all those idiots called "managers" decide to build a "web application"! All they say is "it's the trend." There is no real requirement, because most people run Windows. Even if you DREAM you'll use it on Android, just forget it: the top 20 most-used mobile programs are... ENTERTAINMENT!!! Nobody wastes his battery logging into an accounting system while out of the office!! Remember that before you WASTE time on web sh**t.
85% of what is on a smartphone is just entertainment and porn, so you are right to complain about the overuse of web application development.
Smart organizations would return to the client-server model (which the web is not), as the majority of applications developed require only department-level concurrency and are often closed to a single department...
Sr. Software Engineer
Black Falcon Software, Inc.
To answer your question about client-server: it is mostly about not needing an application server, as the web requires. Also, client-server systems are often closed-loop systems, because only a limited group of people would use them. This group could be as large as several thousand people or as small as a few. In all cases, however, such systems are accessed only from within an institution and would not be available for public use; public availability is the main advantage of web development.
As for the application server as web development views it, the client-server model is quite different: multiple such servers can exist, depending on the number of physical tiers one wants to deploy.
Therefore, you would have an interface tier, primarily a thin client in WinForms or WPF, whose only function is to retrieve and display data; a business-logic tier that supports all of the application's functionality as it relates to handling data; a data-access tier that houses a data-access layer doing the actual retrieval and updating of data; and finally a database tier, where the database engine runs and maintains the data stores.
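The tier separation described above can be sketched in a few lines. This is a minimal illustration with hypothetical names, collapsed into one file; in a real deployment each tier would run on its own server, and the dict standing in for the database would be an actual database engine.

```python
class DatabaseTier:
    """Database tier: owns the data store (an in-memory dict stands in for SQL)."""
    def __init__(self):
        self._rows = {1: {"id": 1, "name": "Alice", "balance": 100.0}}

    def select(self, row_id):
        return dict(self._rows[row_id])

    def update(self, row_id, row):
        self._rows[row_id] = dict(row)


class DataAccessTier:
    """Data-access tier: the only layer that talks to the database engine."""
    def __init__(self, db):
        self._db = db

    def get_account(self, account_id):
        return self._db.select(account_id)

    def save_account(self, account):
        self._db.update(account["id"], account)


class BusinessLogicTier:
    """Business-logic tier: all rules about how data may change live here."""
    def __init__(self, dal):
        self._dal = dal

    def deposit(self, account_id, amount):
        if amount <= 0:
            raise ValueError("Deposit must be positive")
        account = self._dal.get_account(account_id)
        account["balance"] += amount
        self._dal.save_account(account)
        return account["balance"]


class InterfaceTier:
    """Interface tier: a thin client that only retrieves and displays data."""
    def __init__(self, logic):
        self._logic = logic

    def show_deposit(self, account_id, amount):
        balance = self._logic.deposit(account_id, amount)
        return f"New balance: {balance:.2f}"


# Wire the tiers together, outermost to innermost.
ui = InterfaceTier(BusinessLogicTier(DataAccessTier(DatabaseTier())))
print(ui.show_deposit(1, 50.0))  # New balance: 150.00
```

The point of the layering is that each class depends only on the tier directly beneath it, so any tier can be moved to separate hardware without touching the others.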
Such a topology has proven highly efficient over the years. And the more users who need access to the system, the more tiered servers you can add, making efficiency and performance a completely hardware-based construct.
Because everything is developed with a minimum of technologies (languages, for example: you need only your development language of choice plus your database language, SQL), complexity is substantially reduced.
Web development was originally supposed to be based on such a technology model, but organizations became complacent and lazy about their architectures, whereby you had the browser interface, an application server that handled literally everything, and a database engine.
Some organizations did follow the client-server model, but not many, which gave rise to performance issues. As a result, the technology elites and evangelists who were always promoting the latest and greatest tools decided that everything could be made more efficient through code. Thus, ASP.NET WebForms was derided based on claims that, though partly true, never meant the platform had lost its value. However, that was the plan: get rid of WebForms (which was actually based upon the client-server model) and force developers to adopt far more complexity than was required, all on the false assumption that efficiency is code-based. It is not...
As for Java, this environment was also designed around the client-server model, in a fashion somewhat similar to WebForms (its closest equivalent being Java Servlets, to an extent), though it was always based upon the MVC paradigm. Besides the complexities of the Java language itself, its MVC paradigm was also a contributing factor to the environment's inherent complexity. Finally, the Java technologies were designed specifically for large-scale environments, which is basically where they found their niche.
To some extent it was believed that if Microsoft development mimicked Java development, it too could handle large-scale systems. But here, too, a myth developed. The Microsoft community already had a Java-like MVC paradigm in the third-party Castle Project's MonoRail framework, which was completely free for anyone to use. However, the technical community had no real interest in it until Microsoft started promoting the MVC paradigm itself.
Efficiency in any type of distributed system is, first, always based upon hardware configurations. Second, efficiency in such systems is measured by concurrency (the number of users able to access a system comfortably), not raw speed. If you require more efficiency, you simply build out your hardware tiers horizontally.
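The "build out horizontally" arithmetic is simple enough to sketch. The numbers below are purely illustrative assumptions, not benchmarks; the point is only that concurrency capacity scales with server count, independent of any code changes.

```python
import math

def servers_needed(target_concurrent_users, users_per_server):
    """Horizontal scaling: capacity grows with the number of identical
    servers in a tier, so the required server count is a simple ceiling
    division of target concurrency by per-server capacity."""
    return math.ceil(target_concurrent_users / users_per_server)

# Illustrative assumption: one business-logic server comfortably handles
# 500 concurrent users. Supporting 3,200 users then means building the
# tier out to 7 servers.
print(servers_needed(3200, 500))  # 7
```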
A former Silicon Valley systems engineer wrote a book describing how to create blisteringly fast ASP.NET WebForms applications. I read through the entire book, and well over 95% of the recommendations were hardware- and configuration-based. Very little had to do with coding technologies.
However, corporations did not want to spend money on efficiency, so technology evangelists convinced them that such efficiency could be attained through a different form of web development, which is now primarily the MVC paradigm.
Those of us who were very senior personnel could not see this as a viable solution: for one thing, it added a tremendous amount of complexity to the development process, which consequently made technical personnel less efficient at their jobs. But of course, you always had that know-it-all manager who could tell you that the problem was not the technologies but the personnel implementing them.
All of this was well known in the 1990s, before things got to their current state. Reports and analyses by hardware engineers and systems-level software engineers were available for anyone who wanted to understand it. And if the money for hardware wasn't there, it could be supplemented through Open Source and other avenues of software procurement.
Everything is thus interconnected as a result of management stupidity and technical agendas, neither of which serves the best interests of a company. But no one really cares about that except the honorable people who work in those companies.
In general, the majority of applications today could be based upon the closed-loop architecture of the original client-server tiered model of development. But hey, that is not cool. Everything has to be thrown onto the Internet or an intranet on the idea that it is much simpler to deploy than a client-server application. That may have been true years ago, but today web implementations have become so complex that single-server deployment is no longer a viable argument for choosing the web...
I don't use EF. The little time it takes me to code my own data-object (DO) classes, and to build in things like value transformation, try-catch handling, and validation, is minor compared to the problems it avoids or the debug time it saves down the road.
I am NOT saying it is wrong to use EF, just that I have not seen an explanation that gives me a reason to use it.
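A minimal sketch of the kind of hand-rolled data-object class the post describes, with validation, value transformation, and try-catch error handling built in. The class name, fields, and the dict standing in for a database row are all hypothetical; a real DO would map to an actual table through your database driver.

```python
class CustomerDO:
    """Hand-coded data object for a hypothetical Customer row."""

    def __init__(self, customer_id, name, email):
        self.customer_id = customer_id
        self.name = name
        self.email = email

    @classmethod
    def from_row(cls, row):
        """Build a DO from a raw database row, validating and transforming
        values instead of trusting the data store blindly."""
        try:
            customer_id = int(row["customer_id"])          # type coercion
            name = str(row["name"]).strip()                # value transformation
            email = str(row["email"]).strip().lower()      # normalization
        except (KeyError, TypeError, ValueError) as exc:
            raise ValueError(f"Bad customer row: {exc}") from exc
        if "@" not in email:
            raise ValueError(f"Invalid email for customer {customer_id}")
        return cls(customer_id, name, email)


row = {"customer_id": "42", "name": "  Ada Lovelace ", "email": "ADA@Example.com"}
customer = CustomerDO.from_row(row)
print(customer.name, customer.email)  # Ada Lovelace ada@example.com
```

The trade-off the post is making: writing this by hand costs a little up front, but every transformation and failure mode is explicit and debuggable, with no ORM layer between you and the data.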
I'm not giving the answer. You're never gonna make me take THAT in my mouth!
"Anything that is unrelated to elephants is irrelephant." - Anonymous
"The problem with quotes on the internet is that you can never tell if they're genuine." - Winston Churchill, 1944
"Never argue with a fool. Onlookers may not be able to tell the difference." - Mark Twain