I would like to thank you all for the contributions you have provided to my answers. I would like to know what sort of architecture I can use for delivering a business system to different clients. For example, I might have, say, an accounting system that I would like to provide to different clients, and which employs cloud computing concepts. I would really like to know the best architecture I can use for multi-company (multi-tenant) systems using C# on the .NET platform.
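For a multi-company system, the usual approach is multi-tenancy: every tenant-owned row carries a company (tenant) identifier, and the data-access layer scopes all queries to the current tenant. A minimal sketch in C# (the `Invoice` entity, the property names, and the in-memory list standing in for a database table are all invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical domain entity; every tenant-owned record carries a CompanyId.
public class Invoice
{
    public int CompanyId { get; set; }
    public decimal Amount { get; set; }
}

// Tenant context resolved once per request (e.g. from the subdomain or the login).
public class TenantContext
{
    public int CompanyId { get; }
    public TenantContext(int companyId) => CompanyId = companyId;
}

// Repository that transparently scopes every query to the current tenant,
// so one shared database can serve many companies safely.
public class InvoiceRepository
{
    private readonly List<Invoice> _table;   // stands in for a DB table
    private readonly TenantContext _tenant;

    public InvoiceRepository(List<Invoice> table, TenantContext tenant)
    {
        _table = table;
        _tenant = tenant;
    }

    public IEnumerable<Invoice> All() =>
        _table.Where(i => i.CompanyId == _tenant.CompanyId);
}
```

The main trade-off is whether tenants share one database (with a discriminator column, as above) or each get their own: a shared schema is cheaper to operate in the cloud, while separate databases give stronger isolation.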
I am interested in knowing how you design your Domain Model and its persistence layer.
Personally, I design the Domain Model completely independently of any other logical application layer. As a matter of fact, it is oblivious to any other layer.
Instead, I create a service layer that acts as a proxy and passes domain model objects to the persistence layer.
I've seen Dependency Injection/IoC used to perform persistence, with the interface defined in the Domain Model, but I never really liked this approach, because it creates a small dependency: the persistence layer has to implement an interface owned by the domain.
Do you have "Save" methods on your Domain Objects?
I'd appreciate it if you shared your thoughts.
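For contrast, here is a minimal sketch of the two pieces discussed above: a repository interface owned by the domain layer (the DI/IoC style), consumed through a service layer that acts as the proxy. All names (`Order`, `IOrderRepository`, `OrderService`) are invented for illustration:

```csharp
using System.Collections.Generic;

// Domain layer: knows nothing about persistence technology,
// it only declares what it needs.
public class Order
{
    public int Id { get; set; }
}

public interface IOrderRepository   // lives in the domain assembly
{
    void Save(Order order);
}

// Persistence layer: implements the domain-owned interface.
// This is the "small dependency" the question objects to.
public class InMemoryOrderRepository : IOrderRepository
{
    public readonly List<Order> Stored = new List<Order>();
    public void Save(Order order) => Stored.Add(order);
}

// Service layer: the proxy described above; domain objects pass
// through it on their way to persistence, and the domain objects
// themselves never carry a Save method.
public class OrderService
{
    private readonly IOrderRepository _repository;
    public OrderService(IOrderRepository repository) => _repository = repository;
    public void PlaceOrder(Order order) => _repository.Save(order);
}
```

Note that either way the domain object stays persistence-ignorant; the disagreement is only about whether the interface lives in the domain assembly or in the service layer.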
I have a beginner's design issue.
I'm building a layered application (presentation layer, business logic layer, and database layer).
I have three entities in my application: customer, staff member, and vendor, all of which derive from Person.
The user can add a new customer, staff member, or vendor to the database (using the UI).
My question is: which layer is responsible for the following tasks?
1. Validating user data
2. Saving a customer, staff member, or vendor to the DB
3. Retrieving a customer, staff member, or vendor from the DB
4. Performing other DB-related manipulation of customers, staff members, and vendors
5. Which classes do I need to define to perform the above tasks?
I'm new to object-oriented programming, so through this question I am trying to understand the concept.
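A common split for the tasks above: validation (task 1) lives in the business logic layer, while saving and retrieving (tasks 2-4) live in the database layer, typically behind one repository per entity. A minimal sketch, with all class names invented for illustration:

```csharp
using System;
using System.Collections.Generic;

// Shared entity definitions, used by every layer.
public abstract class Person
{
    public string Name { get; set; }
}

public class Customer : Person { }

// Database layer: only stores and retrieves; no business rules here.
public class CustomerRepository
{
    private readonly List<Customer> _db = new List<Customer>();  // stands in for a real DB
    public void Save(Customer c) => _db.Add(c);
    public int Count => _db.Count;
}

// Business logic layer: validates, then delegates to the repository.
public class CustomerService
{
    private readonly CustomerRepository _repository;
    public CustomerService(CustomerRepository repository) => _repository = repository;

    public void AddCustomer(Customer c)
    {
        if (string.IsNullOrWhiteSpace(c.Name))          // task 1: validation
            throw new ArgumentException("Name is required");
        _repository.Save(c);                            // task 2: saving
    }
}
```

The presentation layer can additionally do cheap input checks (required fields, formats) for quick user feedback, but the business layer remains the authority; the UI never talks to the repository directly.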
What are people's experiences with software design?
I'm a firm advocate of the 6 P's: Proper Planning Prevents Piss Poor Performance, which to me means having a clear idea of what you want to achieve, and the necessary oversight to be able to make informed decisions.
For this purpose I find UML incredibly useful. Component or package diagrams help me determine my solution structure, the decoupling of components, separation of concerns, reusability, etc., and completely prevent circular references.
Before I touch VS to set the solution up, I make sure I've got this down, even if it's only a couple of projects and a test project. It's as if I had documented it before I even started.
I'm a bit baffled by some of my colleagues, who expend a lot of unnecessary time and effort trying to keep their 100+ project solution in some sort of check without any kind of visualised overview. Just trying to add a single interface requires a three-hour discussion involving everyone from two teams (6 developers, 18 person-hours, almost half a person-week). The only conclusion ever reached in one of these productivity-abyss meetings is "we need to add a new assembly", because their lack of oversight has made the solution so tightly coupled and interwoven that this is the only option short of taking everything apart and putting it back together again.
In my experience, when groups of developers have to resort to mentally storing lots of abstract information in the absence of a visualised overview, nobody can make informed architectural decisions and the solution grows organically and totally unchecked until the dependencies either get so complex that nobody knows what to do, or you run into a circular reference and people start throwing in arbitrary assemblies as a hacky workaround.
I've taken to going into discussions with the other group armed with a package diagram of their solution, so I can tell them where to put what in their work. If I don't, each meeting descends into chaos, with various people trying to compile lists or matrices of references, which is exactly what the package diagram already represents. Afterwards you hear murmurs of "an overview diagram like that wouldn't be a bad thing", but nothing gets done about it unless I do it for them.
What do other people think?
How can I motivate my superiors to take software architecture seriously and value the advantages a bit of forethought will bring?
I am trying to build a distributed application in RMI/CORBA, like eBay: it allows individuals to submit classified ads to sell items through an auction system. At the closing date of the auction, the last bidder (if there is one) has the privilege of acquiring the item. A user can at any time be a buyer or a seller. The "user" information needed is mainly: name, password, bank details.
The operations I want to support:
1: Get a list of the items currently for sale (a full list, or possibly a search by keyword), including for each item a description, the current price, and the closing date (or time remaining) of the sale.
When a buyer is interested in an item:
1: get a minimum of information about the seller
2: to place a bid, the user must be authenticated.
Can I get help and a little explanation about the architecture?
I'm working on a new design for our old (25+ years) application. The design calls for more than one SQL Server instance (synchronized via replication). Each SQL Server is wrapped inside a DAL, and those DALs are grouped behind load balancing.
I'm looking to add caching to this design, and at first I thought the best place to do so was in the individual DALs; however, in that case I would have to design a synchronization mechanism between the separate DALs.
To solve this synchronization problem, I thought about a cache service that serves all the DALs (and maybe other parts of the design).
My question is: based on your knowledge and experience, would a remote cache service still be effective, or is it better to design cache synchronization across the DALs?
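The shared-cache-service option can be sketched as a cache-aside store that every DAL reads through; because there is only one copy of each entry, a write only needs to invalidate it in one place, and no cross-DAL synchronization protocol is needed. A minimal in-process sketch (a real deployment would put this behind a network boundary or use an out-of-process cache; all names are invented):

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical shared cache service used by every DAL.
public class CacheService
{
    private readonly ConcurrentDictionary<string, string> _entries =
        new ConcurrentDictionary<string, string>();

    public int DatabaseHits { get; private set; }   // counts trips to the real DB

    // Cache-aside read: return the cached value, or load it once and keep it.
    public string GetOrLoad(string key, Func<string, string> loadFromDb)
    {
        return _entries.GetOrAdd(key, k =>
        {
            DatabaseHits++;
            return loadFromDb(k);
        });
    }

    // Called by whichever DAL performs a write, invalidating for everyone.
    public void Invalidate(string key) => _entries.TryRemove(key, out _);
}
```

The trade-off the question asks about is latency versus consistency: a remote cache adds a network hop to every read, while per-DAL caches are fast but can serve stale data until an invalidation message arrives.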
We're developing a couple of web applications and want to allow users some advanced options if they've identified themselves.
The goal is that users should only have to remember one username/password for all our applications (and services; we provide all kinds of newsletters and alerts as well).
SSO (Single Sign-On) was the first thing that came to mind, so my question is: what kind of recommendations can you give? I've read a little about OpenID, but I know Google, Yahoo, Windows Live, etc. also provide this.
Should we choose an existing service (and which one is best), or should we write something ourselves, for our company only?
In the (near or further) future I would like to add our personnel as well, through LDAP or something similar.
This stuff is completely new to me so any advice, tutorials, recommendations would be helpful.
When I ask questions about architecture, I often hear back several arguments to explain this or that choice:
* For maintenance
* For performance
* For security
* For data consistency
What, for you, are valid arguments to make a decision?
Personally, I place "data consistency" at number one, and for me it is a non-negotiable priority, but I think I'm alone in thinking like that, because current patterns, decisions, and modern architectures emphasize maintenance, performance, and security.
I've been looking at messaging systems like Apache Qpid and RabbitMQ, but I'm wondering what any of you folks are using for cross-platform messaging?
Do you build mobile code that calls existing/standard web services (SOAP/WCF) that can also be utilized by existing desktop applications, or is there another way?
What I'm trying to understand is the best way for a set of mobile applications and desktop applications to be able to communicate (in regards to shared services) to send/receive data across these disparate systems. We're typically building RESTful services, but allowing mobile to call existing services is our primary goal. Secondary is abstracting the messaging away from source specific knowledge, preferably with an established third party messaging engine.
I have 2 application servers running my web application.
The application has a file-upload feature. Currently, uploaded files land on the local file system of whichever server handled the request.
There are two ways of handling this: one is to have a file server to which all files are uploaded, so that all files are visible whichever application server is accessed.
The other is to sync the uploaded files between the 2 servers, either through a service or manually [clicking a button].
I like the file server option, but it involves the extra cost of setting up the server and the folder permissions. Is there any other way we can achieve this?
public class A : B
{
    // .... do some logic here
    // .... do some logic here
}

public class B : C { }

public class X : Y
{
    // .... the exact same logic here
    // .... the exact same logic here
}

public class Y : Z { }
Class A has exactly the same method as class X. Now the question is: how do I centralise this method so my code stays DRY (don't repeat yourself), without modifying or touching classes Y, Z, B, and C? Classes B, C, Y, and Z are our legacy code and we are NOT trying to alter them.
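One way to centralise the duplicated method without touching the legacy bases is to introduce a small interface on the new classes only, and move the shared logic into an extension method (or a static helper). This is a sketch under the assumption that the shared method depends only on members both A and X can expose; `IHasSharedLogic`, `Value`, and `DoubledValue` are invented names, and the empty `B`/`Y` classes stand in for the legacy hierarchy:

```csharp
// Stand-ins for the legacy hierarchy; they are never modified.
public class B { }
public class Y { }

// New contract implemented by A and X only; the legacy bases stay untouched.
public interface IHasSharedLogic
{
    int Value { get; }
}

public static class SharedLogic
{
    // The single, centralised copy of the previously duplicated method.
    public static int DoubledValue(this IHasSharedLogic self) => self.Value * 2;
}

public class A : B, IHasSharedLogic
{
    public int Value => 21;   // illustrative data only
}

public class X : Y, IHasSharedLogic
{
    public int Value => 5;    // illustrative data only
}
```

Usage: `new A().DoubledValue()` and `new X().DoubledValue()` both run the one shared implementation. If the shared method needs no state at all, a plain static helper class works too, without the interface.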