Our company is currently running .NET 4.5 with VB as the back-end scripting language. All of our forms use the general ASP.NET Web Forms approach; in other words, we are not taking advantage of any of the ASP.NET MVC or Core features.
We are getting ready to start our new design process and wanted to get some recommendations on what framework, methodologies, C# version, database access methods, and web UI methods to use in our new design.
What skills does your team have / how much new learning are they able to do?
Are you intending to run the old stuff concurrently with the new stuff or is it a big bang approach?
What is the problem domain? That will dictate the best tools for the job.
Are you willing to risk leading edge (which may be buggy) or do you want trailing edge (which may cause upgrade issues in the future) or do you want to play it safe (e.g. go with the next-to-latest version)?
Who are your clients / customers? What will they be most comfortable using?
Are there off-the-shelf packages that can do what you want which could save you a lot of development time?
Stevey T wrote:
recommendations on what framework and methodologies/C# version/database methods/Web UI methods, to use
In my company we had a system on the .NET Framework platform. It was made up of an orchestrator and several REST API services on top of a monolithic Oracle DB, with a huge PL/SQL library.
We used distributed transactions for the operation (an import of data) because of the rollback possibilities if one request failed, etc. The orchestrator in front of the REST API services (microservices) organized the different requests into one transaction.
We had to upgrade the system to .NET Core.
But .NET Core doesn't support DTC any more!
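For readers unfamiliar with the pattern, the behaviour being lost is roughly what `TransactionScope` provided on .NET Framework, where enlisting more than one resource escalated the transaction to MSDTC. A minimal sketch (the `ImportOrchestrator` name and shape are illustrative, not the poster's actual code):

<pre>using System;
using System.Transactions;

class ImportOrchestrator
{
    // Illustrative sketch only: on .NET Framework, enlisting a second
    // resource inside the scope escalated the transaction to MSDTC, so a
    // failure in any step rolled every step back. .NET Core instead throws
    // PlatformNotSupportedException when escalation is required.
    public void RunImport(Action[] steps)
    {
        using (var scope = new TransactionScope())
        {
            foreach (var step in steps)
                step(); // each step might call a different service/database

            scope.Complete(); // only reached if every step succeeded
        }
        // leaving the using-block without Complete() rolls everything back
    }
}</pre>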
Has anyone had a similar situation, and what did you do about it?
Or does anyone have any comments on this issue/problem? Namely that if one request fails in a
big operation of many requests, the database data will not be consistent if no rollback is done,
when we use many "micro" services on top of one monolithic database.
We are thinking of building an internal Kaggle-like platform to run hackathons in our company. I am wondering what would be the best technologies to use for this and how to go about building such a site. Any help appreciated. Thanks.
I have always had the view that duplication is bad. There are always exceptions, but most of the time it seems bad to me. Unit tests are one area where I tolerate it more, but in production code it's something I rarely find desirable.
Some people duplicate things two or three times and only eliminate the duplication on the third or fourth occurrence. I really can't understand why somebody would blindly follow this rule. I understand the argument that we may not know how to refactor something if there aren't enough instances of the duplication, but I find that is rarely the case. At the very least, if you don't know how best to refactor something, keep it simple. The rule seems crazy: why would you do something if you know it's bad? Eliminating duplication is usually quick and, in my opinion, usually makes things much easier to read, particularly when you have half as much code to read and understand.
As an example, imagine we want to format a number as a currency. We could have the following:
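The original snippet didn't survive here; a sketch of the kind of duplication meant, assuming decimal amounts and the `"£0.##"` format string used later in the thread (class names are made up for illustration):

<pre>using System;

class InvoicePrinter
{
    // Same format string repeated at each call site.
    public string PriceLabel(decimal price) => "Price: " + price.ToString("£0.##");
}

class BasketView
{
    public string TotalLabel(decimal total) => "Total: " + total.ToString("£0.##");
}</pre>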
Having it once seems fine, but not really more than once. Surely, as a very simple refactor, something like the following would be better?
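The refactor being suggested would be something along these lines - a single extension method that owns the format string (the `AsCurrency` name matches the reply below; the exact signature is an assumption):

<pre>using System;

static class CurrencyExtensions
{
    // One place owns the format string instead of every call site.
    public static string AsCurrency(this decimal amount) => amount.ToString("£0.##");
}</pre>

Call sites then shrink to <code>price.AsCurrency()</code>, and changing the format later is a one-line edit.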
Surely the time saved by the improved readability would outweigh the cost of writing it, and it solves the problem of duplication.
Deliberately duplicating code doesn't make much sense to me, but maybe I've been working with people who have taken things a bit too far?
The principle of Don't Repeat Yourself (DRY) is one of those areas that is taken too far by some code zealots. Let's take your ToString example: you notice after a couple of times that you have the same ToString code, so you decide to introduce an AsCurrency method. That seems straightforward enough, but you're working in a large codebase, so you don't notice that the same logic has been added in pieces of the code that you don't visit. Worse still, somebody has done this elsewhere:
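The snippet that followed here is missing from the post; a plausible example of such a parallel helper - duplicating the intent, though not the text, of the extension method - might be:

<pre>using System.Globalization;

static class StringUtils
{
    // Hypothetical helper someone else wrote for the same job,
    // reachable only if you happen to know it exists.
    public static string FormatGbp(decimal value) =>
        string.Format(CultureInfo.InvariantCulture, "£{0:0.##}", value);
}</pre>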
Then there was that piece of code which looks like this.<pre>public class FormatConstants
{
    public const string GBP = "£0.##";
    // .... other patterns removed for brevity
}</pre>
What we're seeing here is that others have attempted to avoid repeating code, with varying degrees of success. In all of these cases there is an element of repeated code, because different people have taken different approaches to avoiding repetition. Even if the code doesn't look exactly the same, you are repeating the intent of the code. Now you have introduced yet another way to represent the same conversion. In six months' time, someone else comes along and has to add a currency ToString in a few places, so they refactor their code to avoid adding repeated code. If you're lucky, they have looked through the codebase for other places that do the conversion and picked an already-written one; if they've searched using ToString("£0.##") then they might not have found the match, so they end up adding yet another new way of formatting this one item. What has happened here is that the drive to remove duplication has ended up creating a mess - and this is just with a simple example.
The bottom line is that DRY is a great principle, and one you should try to stick to where it makes sense, but you have to accept that in some cases you aren't going to achieve it, and you shouldn't beat yourself up over it.