Jon Rista wrote: They are difficult to neatly package up into well-isolated "areas"
I guess I don't understand; I have never run into any unusual problems isolating the concerns you specified. Here I use the word "isolate" from both a design and a packaging perspective, which avoids any circular issues.
led mike
I mean it from the perspective of use I guess:
When you write a Controller, you put that controller in a project that is part of the "presentation layer" of your application.
All controllers will be in the presentation layer...you won't have any in the business or data access layers.
You have neatly grouped all your controllers into a single area, and they will only be USED by other things in your presentation layer.
When you write a business service, you put that service in a project that is part of the "business layer" of your application.
All services will be in your business layer...you won't have any in the data access layer or in the presentation layer.
You will transform business objects or graphs into DTOs for transfer across the wire to your presentation layer and back.
You have neatly grouped all your business objects into a single area, and they will only be USED by other things in your business layer.
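A rough sketch of that separation (all type names here are hypothetical):

using System.Collections.Generic;

// Business layer: the domain entity and the DTO that crosses the wire.
public class Customer { public int Id; public string Name; }
public class CustomerDto { public int Id; public string Name; }

public class CustomerService
{
    // Stand-in for the data access layer.
    private readonly Dictionary<int, Customer> _store = new Dictionary<int, Customer>();

    public CustomerDto GetCustomer(int id)
    {
        Customer entity = _store[id];
        return new CustomerDto { Id = entity.Id, Name = entity.Name };
    }
}

// Presentation layer: the controller only ever sees the service and its DTOs.
// In a real system this call would go across the service boundary rather than in-process.
public class CustomerController
{
    private readonly CustomerService _service = new CustomerService();
    public CustomerDto Show(int id) { return _service.GetCustomer(id); }
}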
Cross-cutting concerns can not be isolated like that:
When you write a logging framework, you put the logger class and its support types in a "shared" area.
Your logging functionality is neatly encapsulated in the shared area of your application.
Your logger will be used in MANY places, across your layers, including the presentation, business, and data access layers.
Your logger's use is not neatly grouped; it is scattered throughout your application, and there is no way to keep the usage of logging, a cross-cutting concern, isolated to a single area of your application.
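For instance (again with made-up names), the logging concern itself can be packaged neatly, but its usage cannot:

using System;

// Shared area: the logger and its support types are neatly encapsulated here...
public static class Log
{
    public static void Write(string message)
    {
        Console.WriteLine("{0:u} {1}", DateTime.UtcNow, message);
    }
}

// ...but the calls to it show up in the presentation layer...
public class OrderController
{
    public void Submit() { Log.Write("Order submitted from the UI"); /* core concern here */ }
}

// ...and in the business layer, and in data access, and so on.
public class OrderService
{
    public void Place() { Log.Write("Placing order"); /* core concern here */ }
}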
Jon Rista wrote: Cross-cutting concerns can not be isolated like that:
Ah, yes, because then it would not be cross-cutting. I thought we were talking about the library design creating the circular problem, not the user code creating it. My bad.
led mike
No problem. I think I kind of confused the issue anyway by using the term "packaging". :P
I've been delving deep into AOP with the PostSharp framework lately, so my perspective on things is a bit skewed from the norm right now. From the AOP perspective, aspects essentially refer to usages. If full AOP capabilities existed in .NET, you could "package up" aspects and apply them to multiple points in your code by external means, which gives a more appropriate separation of concerns and lets you neatly isolate cross-cutting concerns from core concerns. PostSharp doesn't yet allow that much flexibility in applying aspects, but it's getting close.
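As a rough illustration (the exact PostSharp namespaces and argument types vary between versions, so treat this as a sketch rather than the framework's definitive API), an aspect that wraps method calls with logging looks something like this:

using System;
using PostSharp.Aspects;   // older PostSharp releases use a different namespace

[Serializable]
public class TraceAspect : OnMethodBoundaryAspect
{
    public override void OnEntry(MethodExecutionArgs args)
    {
        Console.WriteLine("Entering " + args.Method.Name);
    }

    public override void OnExit(MethodExecutionArgs args)
    {
        Console.WriteLine("Leaving " + args.Method.Name);
    }
}

// Applied declaratively, the cross-cutting behaviour never clutters the method body:
public class OrderService
{
    [TraceAspect]
    public void PlaceOrder() { /* core concern only */ }
}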
That sounds cool. It was nice to talk with someone that can speak design for a change. Thanks.
Have a great weekend
led mike
led mike wrote: It was nice to talk with someone that can speak design for a change. Thanks.
Agreed!
Great discussion guys. Thanks for the comments and sorry I was away for a while... this tab got lost within 40 others in my Firefox window...
For now, what we wound up doing is something like this:
Global.Common assembly contains base classes for Validation.
The logic to load validation error message strings differs depending on what "area" of the application you are calling validation from, i.e., UI vs. BLL.
Each of our "areas" also has an assembly for common code (shared among the assemblies in that area) so we have:
UI.Common and
BLL.Common
For the validation example... we created a validation class in UI.Common that inherits from the base class in Global.Common, but provides its own implementation that loads the message strings from our BLL by calling through WCF.
We will create another validation class in BLL.Common that inherits from the same base class, but this implementation will call the BLL methods directly to load the message strings, without going through WCF.
This way, the cross-cutting concern is kept within the three Common assemblies and uses a provider-model idea for loading the message resources.
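In rough code (the class and method names here are made up for illustration), the shape is something like:

// Global.Common: the shared base class defines the contract and common logic.
public abstract class ValidatorBase
{
    protected abstract string LoadMessage(string key);   // the provider-style hook

    public string GetErrorMessage(string key)
    {
        return LoadMessage(key);
    }
}

// UI.Common: loads message strings from the BLL by calling through WCF.
public class UiValidator : ValidatorBase
{
    protected override string LoadMessage(string key)
    {
        return CallValidationServiceOverWcf(key);   // WCF proxy call, details omitted
    }

    private string CallValidationServiceOverWcf(string key) { /* ... */ return key; }
}

// BLL.Common: same contract, but calls the BLL methods directly (no WCF hop).
public class BllValidator : ValidatorBase
{
    protected override string LoadMessage(string key)
    {
        return LookUpMessageDirectly(key);
    }

    private string LookUpMessageDirectly(string key) { /* ... */ return key; }
}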
Hello,
I am creating a Windows service which performs one of 20 operations based on some condition, so at any one time only one task will be executing.
All 20 of these tasks live in separate assemblies referenced by the Windows service; the objects are created and the assembly is called only when needed.
For example:
Case "findcustomer"
create object
process
Case "findVendor"
create object
process
Is this approach a good idea? Also, I want to know what needs to be kept in mind regarding the design of this service.
Regards,
Pavas
Make sure you work out what the Start, Stop, Pause and Resume methods that the Service Control Manager can call on the service will actually do. For example, to handle Stop, do you need to know which 'process' is currently running so it can be stopped? Does it take time to stop? Or maybe you can't tell whether a process is running, so you need to find out first? Do you really need Pause and Resume? If so, what will they do?
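A minimal sketch of the shape this tends to take (the worker loop and the flag are placeholders; Pause/Continue overrides would be added only if you actually need them):

using System;
using System.ServiceProcess;
using System.Threading;

public class TaskRunnerService : ServiceBase
{
    private Thread _worker;
    private volatile bool _stopRequested;

    protected override void OnStart(string[] args)
    {
        _stopRequested = false;
        _worker = new Thread(Run);
        _worker.Start();
    }

    protected override void OnStop()
    {
        // Signal the loop and wait, so the Service Control Manager sees a clean shutdown.
        _stopRequested = true;
        if (_worker != null) _worker.Join(TimeSpan.FromSeconds(30));
    }

    private void Run()
    {
        while (!_stopRequested)
        {
            // pick the next operation, create its object, and process it;
            // check _stopRequested between tasks so Stop is honoured promptly
            Thread.Sleep(1000);
        }
    }
}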
Why does the subject say Web Service when you are actually designing a Windows service?
Moim Hossain
R&D Project Manager
BlueCielo ECM Solutions BV
That is a mistake. Sorry for that.
Regards,
Pavas
My job role has changed significantly as we have started offshoring most of our coding. Since I have domain knowledge and good relationships with the client, my responsibilities have been relegated to mostly doing design work. We are getting ready to start a fairly large project and I will be overseeing 3-4 offshore developers.
Previously, the company I work for has not placed a lot of emphasis on documenting design, other than documenting UI and database design, etc. It is critical that designs now be detailed enough that offshore developers can code from them with minimal oversight.
I have two questions:
1) What type of training materials would any of you suggest? Specifically, ones that focus on communicating design (all other parts of the SDLC are managed quite well).
2) What tools would you suggest to aid in this process?
Thanks in advance,
Kevin
I'd suggest that you start off by investing in a decent enterprise architecture tool. This will help you to document sequence diagrams, activity diagrams, etc. When it comes to communicating design, we find the RUP process to be helpful: use cases, detailed designs, storyboards, and so on. You may want to read up on RUP to get a better understanding of what you will be producing.
Read books on software analysis and design. Some tools which come to mind are Rational Rose and MS Visio; you can also buy a roll of paper from a meat market (yes, the kind they use for wrapping meat) and sketch diagrams on it.
Thanks for the info. I've been introduced to Rational in the past, but I'd be fighting an uphill battle to obtain licensing for it. We are pretty much limited to Microsoft products. I know that there are a lot of features in Visio that I'm not aware of, so that might be a good starting point. Is there anything in Visual Studio Team System that would be helpful? I've watched some demos of the product and I see that it has static code analysis, profiling, testing support, etc., but it doesn't appear that it would help with what I'm looking for. Automatic code generation doesn't work well for us either, since we are dealing with existing C++ applications and we generally don't create a lot of new classes.
Drawing stuff out on paper (or a white board) is always a good idea. With this offshore design, we are finding that we need a high level of detail in the design. What would be helpful is being able to define classes and their member functions and include pseudo code with the member functions. I'm trying to get away from Microsoft Word; I find myself spending more time fighting Word than thinking about design. I know Rational has some ability to generate documentation. I think that alone would boost my productivity considerably.
Try StarUML, which is open source and really worth it.
Thanks. I took a look at StarUML. It feels like a good tool, but I am finding UML itself quite unsatisfying. It seems too generalized. When I think of object-oriented design, I still think in terms of programming. I think of words like inherits, derives, uses, interfaces, references, etc. UML lets you model this information through generalizations, associations, compositions, and so on, but those terms seem too generalized and difficult to understand. If I were creating a class diagram, I would want to use programming terms.
What I think would be a great tool is essentially a class diagram where, in each class, I could define members and methods, describe each member, and describe the purpose of each method. Then it would be nice to be able to include pseudo code inside the class methods. This would all be done inside some modeling environment.
Then, when I'm done, I could generate a Word document that looks similar to API documentation, complete with the descriptions and pseudo code. Then I could hand the document off to programmers to implement.
I could do all this in MS Word, but it takes too much time formatting and fighting MS Word, and I also have to do everything twice: create the diagram in Visio, then write everything up in Word. Visio has a report tool, but it fills the document with UML lingo. It looks as if you can create Word templates for StarUML, but I couldn't figure out how to document the operations, nor do I have the first clue how to create a suitable template.
Does such a tool exist?
I did find the "Methods" button in Visio. I've also found that I can customize much of what goes into the report document. I might be on to something here.
Let's say I have this hypothetical database (the one I am dealing with is much bigger, as you can probably imagine or guess):
Customers
---------
ID
Name
Address
ShippingAddress
OrderID
Orders
------
OrderID
Date
LineItems
---------
ItemID,OrderID
Quantity
Items
-----
ItemID
Description
I have many applications which need the customers' ID, Name, and Address, and the items' ID and Description. Is it better to create a library which can load the customer ID, Name, and Address (and the same for the items) by invoking a stored procedure, or is it a better design to have a stored procedure which gets whatever a specific application needs? For example, one application might need the customer's ID, name, and number of orders, plus the items we carry, while another application might need all the columns of Customers and Items. Because so many applications need the customers' ID, Name, and Address, should I create a static class with static methods that invoke a stored procedure to get that information (and, if I need more information in another application, simply get it on an as-needed basis), or should I skip the static class library and get everything when I need it by invoking a stored procedure that returns everything the application needs?
If I create a library, the advantage is that I can use it in many applications and still get extra columns if I need to (at least a few apps will not need the extra columns, so simply calling the static methods will do the job); the downside is that if I need other columns I have to get those separately. If I do not create a library, I lose the code reuse, but I can get whatever fields the application needs, loaded all in one shot. I am wondering which solution is more advantageous, or is there a better way?
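For concreteness, the kind of shared helper I have in mind is roughly this (the class and stored procedure names are made up for illustration):

using System.Data;
using System.Data.SqlClient;

public static class CustomerData
{
    // Returns CustomerID, Name and Address for all customers via a stored procedure.
    public static DataTable GetCustomerBasics(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetCustomerBasics", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            var table = new DataTable();
            new SqlDataAdapter(command).Fill(table);   // Fill opens and closes the connection itself
            return table;
        }
    }
}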
Thanks,
Some opinions:
I think you should use stored procedures or views (whichever you prefer) to get the data. You can combine several needs into a single procedure, but I would think of the procedure output as an interface (not technically the same as a .NET interface, but the same idea). If several needs are almost identical, don't change the logic of the output, and carry no performance penalty, I would combine them.
About the static class: this is basically caching, and it's a good way to ensure performance. However, you must consider when the data in the static class gets refreshed, and when and how modifications are fetched into it. You might want to use SqlDependency if it's SQL Server you're using.
One thing you didn't mention at all was security. I don't know if it's an issue, but is it OK for all clients to see all the data? If not, that consideration should be part of the process when you decide which procedures to create.
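A minimal sketch of the SqlDependency idea (the query and table names are placeholders; the database needs Service Broker enabled, and notification queries have restrictions such as two-part table names and no SELECT *):

using System.Data.SqlClient;

public static class CustomerCache
{
    public static void Load(string connectionString)
    {
        SqlDependency.Start(connectionString);   // once per AppDomain per connection string

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT CustomerID, Name, Address FROM dbo.Customers", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) => Load(connectionString);   // reload when the data changes

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // populate the static cache from the reader (omitted)
            }
        }
    }
}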
Security is not an issue. What I am trying to find out is:
Is it a better idea if I make a library which other developers can use by simply calling a static method that returns CustomerID, Name, and Address? The static method would return a DataTable by invoking a stored procedure. If a developer needs other fields, they can worry about that themselves and retrieve them. Should I create a library, or am I just wasting time doing this? The thing is, many applications need these fields, so I thought a library would ease things up.
Of course, the alternative is to let every developer create their own DAL and stored procedures and get whatever they need. The problem is that 10 different applications may need just CustomerID, Name, and Address, and then the same code has to be written in each app's DAL. This is a waste of time. What do you think?
I wouldn't think that creating the library is a waste of time at all. One obvious benefit is that you get reusable code. Also, considering application maintenance, you would have a single point to maintain, and the benefits in performance, caching, services (methods, etc.), structural changes in data, and so on would be usable by all. For example, if a few more columns are later needed by many applications, simply adding them once to the procedure and to the class would do the trick without rewriting several DALs.
One thing I'm wondering is why you would take only a few of the columns instead of all of them. If an application needs only a few, you can strip the unnecessary columns away when the application requests the data, or the application can do it itself if necessary. To go a bit further, you could also fetch some data around the customer (like the number of orders or whatever) and, when an app requests it, either fetch it or return it from the cache.
As said, this is just an opinion, but I'm used to the idea that only a single DAL exists against one database. In my experience, it makes things a whole lot easier when you have to change something. Of course this DAL is cumulative, so that as new tables or applications are created the DAL grows, but it is still "centralized". Although the development environment in my case may be very different from yours.
Given the fact that you have a variety of usage scenarios to fill, I can't stress enough the benefits that an O/RM (Object Relational Mapper) can offer you. Building strict APIs in a database with stored procs is a very, very maintenance-heavy way to go (in most situations, but particularly so when you have a broad set of usage scenarios). I highly recommend that, even if you end up choosing not to use an O/RM, you check out LINQ to SQL. It's simple, very lightweight, very fast (L2S queries have surprised devs and DBAs when they find out that their "highly optimized" query actually performs worse, sometimes much worse, than the corresponding L2S query), and it offers a degree of flexibility that you could find extremely useful for your needs.
Some of the benefits you can hope to gain by using an O/RM:
1) Eliminate an entire layer and API from your application. No more stored procs that need to be written or maintained, or which have to be adhered to once they are in place due to frustrating regulations regarding the database.
2) Be able to query a conceptual model directly for your entities or collections of entities.
3) Have the option to define "fetching strategies" that allow you to fetch a root entity (or collection of roots), as well as related child entities and entity collections, with a single, efficient query to the database.
4) Be able to adapt quickly to changing business requirements without having to expend a significant amount of time refactoring a (possibly extensive) SP API. Even better, don't worry about not being able to adapt to changing business requirements because of rules regarding database changes.
5) In the case of LINQ to SQL, you can even perform ad-hoc queries against your conceptual model, and materialize custom or anonymous types that provide just the data you need, retrieved with the most optimal query. This again can be gained without the need to write and manage any SPs. This would solve the problem you have, where many applications only need fragments of information from entities and you don't want to incur the cost of retrieving a whole entity when it's not needed. L2S offers exactly what you need to allow all 10 of your applications to retrieve what they need, when they need it, efficiently, from a conceptual entity model...without ever having to worry about writing or managing stored procs.
If you want more information about the benefits of O/R mapping, feel free to drop me a line. I don't know your DBA situation...if you have DBAs, you're probably in for a fight over the right to use an O/RM (they generate parameterized dynamic SQL). SQL Server 2005 and 2008 offer a lot for applying query plans and performance tuning out of band, so don't let your DBAs tell you off-hand that there is no way to manage or optimize dynamic SQL if you're using SQL Server (tell them to look up Plan Guides).
Interesting. I haven't had a look at O/RM tools for several years because the last time I tried them, they just weren't capable of doing the things I needed. That's the reason I normally generate the DAL and its surroundings based on both db objects and business entities. But I understand from your post that the situation has changed a lot over the past few years?
Another thing I noticed is that you suggested Linq to SQL. I've understood that this is a "dying" product and will be replaced by Entity Framework. Correct me if I'm on the wrong track.
I think a bit differently about stored procedures. Basically it doesn't matter what kind of DAL or mapper you have on the client side, but if the number of round trips to the database starts to rise because the data must be fetched in parts due to the logic (for example trees, networks, cumulative calculations, etc.), you may end up with problems. I think this is the point where procedures, views, functions, etc. show their best sides.
I'd be happy to hear your comments on those issues,
Mika
I suggested L2S for a few reasons. First, it's really simple and easy to get into. Second, if you use SQL Server, it's amazingly efficient...the SQL generated is about as efficient as it gets, and often rivals or surpasses the performance of manually written queries. Third, it's fully integrated into Microsoft's development tools, which lets you learn how to use a modern O/RM without having to learn low-level details like XML configuration formats and the like until you really need to. Fourth, L2S has complete support for .NET 3.5 LINQ. Your developers can query anything, from any data source...collections, XML, arrays, and the database (if you use L2S)...with a single, unified query language. That's huge. You may end up needing something more powerful to fulfill your mapping needs...NHibernate or Entity Framework would definitely cover those other scenarios if you needed them (NHibernate doesn't have a very rich LINQ story right now...its SQL generator doesn't generate nearly as efficient queries as L2S...EF's query generator is something of an anomaly right now.)
(Yes, it will eventually be superseded by Entity Framework...but that's a ways off. EF v1.0 was kind of a dismal failure (IMO), and v2.0 needs a LOT of work. L2S won't just disappear, either...nothing Microsoft has ever deprecated has actually been removed from the .NET framework...backwards compatibility and all that. L2S in its current form will be around for a long time, and according to Microsoft, it will still be maintained and improved a bit, at least for a while.)
Regarding efficiency when working with object graphs (trees, networks, cumulative calculations, etc.), that is where L2S truly shines. You would expect it to make several round trips to retrieve a root entity and several collections of child entities, but L2S has a highly intelligent expression processor that translates your queries into the most efficient SQL possible. Retrieval of object graphs usually results in a single query to the database that returns a single flat result set containing everything for the entire graph (you can configure what is retrieved up front with DataLoadOptions on your DataContext, similar to a fetch strategy for NHibernate or an .Include() sequence in EF).
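For example, a fetching strategy that pulls customers together with their orders looks roughly like this (entity names follow the conceptual model listed just below, DataLoadOptions lives in System.Data.Linq, and db is the LINQ to SQL DataContext):

var options = new DataLoadOptions();
options.LoadWith<Customer>(c => c.Orders);   // eager-load each customer's orders
db.LoadOptions = options;                    // must be set before the first query

var customers = (from c in db.Customers
                 where c.LastName == "Smith"
                 select c).ToList();         // the orders come back along with the customers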
When you need to retrieve aggregated data (I read cumulative calculations as aggregations...correct me if I am wrong), L2S also shines. Even though O/RMs are primarily meant to support entity retrieval and update, L2S has a unique ability to retrieve arbitrary data sets when you query your conceptual model. Assuming you have a conceptual model along the lines of:
Customer
Order
OrderLine
Product
You can query for customer information AND the aggregated count of how many orders they have in the last 6 months very easily with L2S:
var anonCustWithOrderCt = from c in db.Customers
                          where c.LastName == "Smith"
                          select new
                          {
                              c.CustomerID,
                              c.LastName,
                              c.FirstName,
                              OrdersInLastSixMonths = c.Orders.Count(o => o.OrderDate > DateTime.Now.AddMonths(-6))
                          };
The query generated will be very compact and efficient. COUNT in SQL Server is given special consideration by the query execution engine, so the following result is optimal:
SELECT [t0].[CustomerID], [t0].[LastName], [t0].[FirstName], (
    SELECT COUNT(*)
    FROM [Orders] AS [t1]
    WHERE ([t1].[OrderDate] > @p0) AND ([t1].[CustomerID] = [t0].[CustomerID])
    ) AS [OrdersInLastSixMonths]
FROM [Customers] AS [t0]
WHERE [t0].[LastName] = 'Smith'
-- @p0: Input DateTime (Size = 0; Prec = 0; Scale = 0)
This example is extremely simple compared to the kinds of complex queries LINQ can handle; it fully supports grouping, joining, etc. The main difference is that it is based on a conceptual model rather than the physical database model, so if you have a complex model or a model with odd entities, the L2S query generator may have some trouble generating the most optimal query (it may be forced to add extra joins where you wouldn't expect them, etc.).
If you want to experiment, check out LINQPad. It's a free tool that lets you quickly connect to any SQL Server database and start running LINQ queries against it in a few minutes.