Marc,
First of all, thank you for the effort you put into writing the articles; I find them very interesting. I am still going over them repeatedly to learn more and more about the framework, but in the meantime I have a couple of remarks (or perhaps questions, since it is likely I have not understood the framework exactly yet) that I thought I would pick your brain with.
1.) Data Hub Lifespan.
The existence of a Data Hub implies (?) that some framework-related data, apart from "global" application data (environment, command-line parameters, etc.), has a lifespan independent of any one workflow (otherwise, it would just be part of the Common Data Representation load). I am having some trouble thinking up examples of such workflow-independent, yet "persistent", data. For example, if some kind of GUI control needs to load a record set, the set should be acquired by a database-related Technology, converted into CDR, handed off to a GUI-related Technology, loaded, and destroyed (all as part of a specific "LoadThisAndThat" workflow). Should caching be required, a cache could be a separate Technology, built into the data acquisition workflow, that decides when and how it needs to be updated, refreshed, etc. Under what circumstances, generally speaking, should some "workflow-less" data actually persist inside the Data Hub?
2.) Data Hub Transparency.
The existence of a Data Hub as a Workflow Manager-independent Component implies (?) that a Data Hub is capable of running its own workflows, depending on changes in its internal state. I am guessing that a Workflow Manager would have to be hooked into the Data Hub through some kind of Observer paradigm for the framework to react to changes in Data Hub state. If my understanding is correct, wouldn't it be advantageous (component-model-wise) to define the Data Hub as a "passive" accumulation/aggregation/etc. Component that is driven by the Workflow Manager exclusively? I tend to see any framework-state-altering process as a workflow itself, which actually has the capability (sometimes even the responsibility) to launch sub-workflows. For example, to update some data in a Data Hub, a Workflow Manager-driven workflow might be required to authenticate through a Technology before proceeding (something, say, as simple as "don't modify anything while day-end batch processing is active"), etc.
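If my reading is right, the hookup I am imagining would look roughly like this minimal C++ sketch (the DataHub interface and all names here are my own invention, not the framework's):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Purely illustrative: the hub pushes change notifications to whoever
// subscribed, so a Workflow Manager can react to hub-state changes.
class DataHub {
public:
    using Observer = std::function<void(const std::string& key, const std::string& value)>;

    void Subscribe(Observer obs) { observers_.push_back(std::move(obs)); }

    void Set(const std::string& key, const std::string& value) {
        data_[key] = value;
        for (auto& obs : observers_) obs(key, value);  // notify on every change
    }

    const std::string& Get(const std::string& key) { return data_[key]; }

private:
    std::map<std::string, std::string> data_;
    std::vector<Observer> observers_;
};
```

A Workflow Manager would then call Subscribe with a callback that decides which workflow (if any) a given change should launch.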
3.) Data Hub Dependencies.
In light of (2.) above, do you think it would be advantageous (component-model-wise) to eliminate Technologies' capability to communicate with a Data Hub directly and, instead, enforce Data Hub "routing" through Workflow Manager workflows (and, effectively, through the Component Manager)?
4.) Workflow Manager Complexity.
Since the Workflow Manager part of the framework has not been introduced in detail yet, I have been somewhat "confused" about its duality (that is most likely my fault, not the framework's). It seems to me that it might be simpler to work out some kind of a Workflow Interpreter and specialize it into two independent Components in the framework: an Event Manager and a Workflow Manager. The EM would be responsible for launching Event-, Technology-, or Component-initiated workflows (and acting as a Mediator), and the WM would be responsible purely for executing requested workflows. I mean Workflow Interpreter in the sense of a "workflow mediation and execution definition language" interpreter. I mean Event Manager as a Workflow Interpreter specialized to mediate and drive workflows, through a Workflow Manager, using "Common Event Representation". I mean Workflow Manager as a Workflow Interpreter specialized to drive Technologies and Components, through a Component Manager, using "Common Data Representation".
What do you think? ... Or am I babbling here?
Gary
gniemcew wrote:
Under what circumstances, generally speaking, should some "workflow-less" data actually persist inside the Data Hub?
Configuration information might be one example. Data that a workflow needs to look at--information regarding establishing a connection to a database, timeout values, etc.
Also, I usually represent GUI data in a separate data store that's GUI-presentation independent. A tree list is a great example, because I usually represent a tree's data in a flat representation rather than the native parent-child relationship, since the flat representation is easier to persist to the database.
So, I have a flat representation that is manipulated by various GUI events--insert a leaf, move a child, delete a leaf, etc. There's a translator that figures out how to convert the flat representation to the tree view. But the flat representation persists in the data hub after the various workflows complete.
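A minimal C++ sketch of the idea, with all names invented for illustration--the real translator is more involved:

```cpp
#include <map>
#include <string>
#include <vector>

// Illustrative only: one flat row per node, with a parent id of -1
// marking a root -- the shape that persists easily to a database table.
struct FlatNode {
    int id;
    int parent;       // -1 for a root node
    std::string name;
};

// Translator-style helper: depth of a node, found by walking its parents.
int Depth(const std::vector<FlatNode>& rows, int id) {
    std::map<int, int> parentOf;
    for (const auto& r : rows) parentOf[r.id] = r.parent;
    int depth = 0;
    for (int cur = parentOf[id]; cur != -1; cur = parentOf[cur]) ++depth;
    return depth;
}

// Translator-style helper: direct children of a node, in row order.
std::vector<int> ChildrenOf(const std::vector<FlatNode>& rows, int parent) {
    std::vector<int> out;
    for (const auto& r : rows)
        if (r.parent == parent) out.push_back(r.id);
    return out;
}
```

The GUI events (insert, move, delete) only ever touch the flat rows; the tree view is rebuilt from helpers like these.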
gniemcew wrote:
wouldn't it be advantageous (component-model-wise) to define the Data Hub as a "passive" accumulation/aggregation/etc. Component that is driven by the Workflow Manager exclusively?
I'm not sure.
gniemcew wrote:
do you think it would be advantageous (component-model-wise) to eliminate Technologies' capability to communicate with a Data Hub directly,
Hmmm. I think this is a viable alternative, but I'd be reluctant to implement it because I think it muddies the waters between data management and workflow management. I'm not sure the additional layer is of any advantage. Keep in mind that the data hub is supposed to be a repository for publicly known data formats, disconnected from workflows. I'm not sure if that addresses your question, though.
gniemcew wrote:
It seems to me that it might be simpler to work out some kind of a Workflow Interpreter, and specialize it into two independent Components in the framework: an Event Manager and a Workflow Manager.
You are absolutely correct. In fact, I typically extend the implementation to also include a message manager and a state manager. The workflow manager should be strictly, as you said, a "'workflow mediation and execution definition language' interpreter".
I never did continue after the last installment of these articles as I ended up reworking the whole concept into what became MyXaml (see sig).
You might be interested in looking at the documentation page, especially the MVC Pattern, how workflows and state machines are defined in XML, and, finally, this article on a declarative approach to an application framework.
Thanks for the great questions!
Marc
MyXaml
Advanced Unit Testing
Marc,
Your AAL articles make numerous references to the workflow manager. When will you post a design description of this component?
David Tasker
Soon. To be honest, I'm having a bit of trouble disentangling the concept of a workflow manager from the scripting/parsing side of things, so I've been holding back until I get the concept and design fleshed out.
Marc
Latest AAL Article
My blog
Join my forum!
I really enjoyed this article. I will head over to SourceForge to see how the project is coming.
In the meantime, I am curious to hear your comments on the book "Software Development on a Leash" (ISBN 1893115917, http://www.amazon.com/exec/obidos/tg/detail/-/1893115917/102-5776258-1108912). I have mixed feelings about this book. From an architectural view, it seems to go overboard with the concept of horizontal interfaces (which, no doubt, are a good idea when correctly applied and are in alignment with all the principles you discuss in your article above).
I think (hope) your article (and project) could end up being a more rational approach to the goals expressed in "Software Development on a Leash".
I would like to hear more discussion comparing/contrasting the AAL and the vMach framework from "Software Development on a Leash". At the very least, a review of this book might lead to discussions that could benefit the development of the AAL.
The author describes separating the application from the architecture, and while I've done this before in software on several levels, I never considered removing the application completely from the compiled software and driving application behavior from what the author calls "external structural and behavioral metadata". Using the same binary program to support multiple applications without recompiling a single line of code? Perhaps without recompiling on each and every application release, which makes perfect sense. However, he avoids rebuilds even for complex screen-level changes (even cosmetic ones). Not too shabby.
Sounds like what I've achieved with the MFC version, but my approach is different--I don't use metadata and object builders. I must confess I haven't read the book, so my response here is based only on the reviews. I'll have to get a copy of it, though; it looks interesting.
Regarding your comment in the 4th article--several people so far have pointed me to the SharpDevelop project. I really need to contact these people!
Thanks for the feedback and the references--it's really helpful to network with other people.
As to the SourceForge site, it's an area I've completely ignored after the initial burn of reading through all the Unix-style documentation and finally managing to get something up and running on CVS. If you'll pardon the metaphor, it's a satellite in LEO awaiting the next burn to the transfer orbit.
I do have an MSI installation with the latest and greatest, though, that I could email you if you were interested. Know anyone who can help me set up the CVS stuff on SourceForge?
Thanks!
Marc
Help! I'm an AI running around in someone's f*cked up universe simulator. Sensitivity and ethnic diversity means celebrating difference, not hiding from it. - Christian Graus Every line of code is a liability - Taka Muraoka Microsoft deliberately adds arbitrary layers of complexity to make it difficult to deliver Windows features on non-Windows platforms--Microsoft's "Halloween files"
Marc,
I posted my initial comment after reading just the first installment of this series. (I also read a couple of your other related articles -- was one the 4th of this series? I thought there were just 3 in this series.) Anyway, I have now read the 2nd installment. These are great articles. What interests me are 1) the architectural approach, and 2) the implementation decisions. It's all very interesting and relevant to modern application development -- I can't wait to read the 3rd installment later tonight. BTW, I personally prefer your writing style over that of the author of "Software Development on a Leash." If you read the book, I'm sure you'll see it contains a lot of hype. But anything about separating the application from the architecture is interesting reading for me.
Regarding SharpDevelop, I think you should open a dialog with them. Their Add-In tree is pretty impressive.
I have to make some architectural decisions for an upcoming application. I'm 100% in favor of the AAL and component-oriented concepts, but I am concerned about performance in C#. The SharpDevelop UI is quite a bit slower than common Windows apps (Excel, Word, etc.) and the other apps I've written in C#. I love the flexibility and code-maintenance benefits, but I personally want my app to be more responsive. This is a real dilemma...
MtnBiknGuy wrote:
I thought there were just 3 in this series
Here's the 4th in the series:
http://www.codeproject.com/cs/menu/AAL-4.asp
It's hiding under a different section. It references the other three. You've seen it already though, as one of your messages was posted on it.
MtnBiknGuy wrote:
but I am concerned about performance in C#
I agree. The next installment illustrates using XML to specify GUIs and has a demonstration app that coordinates some data with the GUIs. One of the GUI controls is a tree-view directory structure, which I lifted from someone else's code on CP. There's all this Shell stuff that gets the tree structure and icons, and it's slow as molasses. I haven't looked yet at what can be optimized in the .NET usage--everything the author wrote basically goes through a managed interface to shell32.dll.
Well, if you want a huge code base, I'm happy to send you the AAL implementation in C++/MFC--it's a lot more developed (it's been in use for about 5 years and I'm still actively using it), but of course I've made some different design decisions in the C#/.NET version, such as using XML for a lot of the specification stuff.
Marc
Marc, I would certainly like to experiment with the C++/MFC version of AAL. Looking at it will be educational and informative for me. Thanks!
I've worked alongside the author of SOAL on two major projects, and it does pay off if you approach it in the context of .NET and true inheritance. The SOAL approach was born in the original VS 6 environment, but the .NET form gives it a lot more power. For example, I can define a SOALtreeview as a simple user control and have it inherit the MS TreeView, the Infragistics TreeView, or the ComponentOne TreeView, as examples. This creates an extended user control and allows me to provide additional "harnessing" of the underpinning control's API within the custom control's implementation (the adapter, so to speak). If I expose common functions such as Load, Fill, Layout, etc., an implementor can use the common functions rather than the specific API, which allows the components to be interchanged or swapped out without affecting the behavioral code external to the control. It also provides a means to define interactive relationships such as cross-component communication, observation, and other abstracted behaviors, because the adapters can implicitly know how to talk to one another and bind the otherwise disparate controls into a common framework.
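A rough sketch of that adapter idea, reduced to plain C++ (the real SOAL controls are .NET user controls inheriting vendor controls; the vendor classes here are invented stand-ins):

```cpp
#include <string>
#include <vector>

// Illustrative only: one common adapter interface (Load/Count) in front
// of interchangeable vendor controls.
class TreeAdapter {
public:
    virtual ~TreeAdapter() = default;
    virtual void Load(const std::vector<std::string>& items) = 0;
    virtual size_t Count() const = 0;
};

class VendorTreeA : public TreeAdapter {   // stands in for e.g. the MS TreeView
public:
    void Load(const std::vector<std::string>& items) override { items_ = items; }
    size_t Count() const override { return items_.size(); }
private:
    std::vector<std::string> items_;
};

class VendorTreeB : public TreeAdapter {   // stands in for a third-party TreeView
public:
    void Load(const std::vector<std::string>& items) override { count_ = items.size(); }
    size_t Count() const override { return count_; }
private:
    size_t count_ = 0;
};

// Application code talks only to TreeAdapter, so the underlying control
// can be swapped without touching the behavioral code.
size_t LoadAndCount(TreeAdapter& tree, const std::vector<std::string>& items) {
    tree.Load(items);
    return tree.Count();
}
```

Code written against LoadAndCount never changes when the underlying control does--that is the swap-out property described above.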
In another example, I can create a typed data set that is actually an extended dataset built on the ADO.NET dataset. This extended dataset inherits the ADO.NET dataset, then further harnesses the various classes that commonly interact with it. In our case, we provided a dataset.Fill that mimics the xxxDataAdapter.Fill method. However, it has the capacity to determine automatically from the connection whether it needs to use an OleDbDataAdapter or a SqlDataAdapter, and it calls the appropriate Fill() function on the adapter. Why do this? It eliminates the application level's need to instantiate multiple resources to perform highly repetitive operations, and it consolidates those repetitive operations into a one-stop shop. I get tired of having to present parameters to a command object to support a stored proc, when I could call the extended dataset's Fill method and have it determine from metadata what the parameters should be (the Fill has a paramarray for the parms), dress up the command object, execute, and return the result. This is an example of consolidating structural and behavioral patterns to eliminate the entanglements at the highest application level.
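The dispatch at the heart of that Fill can be sketched like this (C++ for illustration; testing for a "Provider=" token is an invented stand-in for the real connection metadata inspection):

```cpp
#include <string>

// Illustrative only: the extended dataset's Fill picks an adapter from
// the connection string rather than making the caller instantiate one.
enum class AdapterKind { OleDb, Sql };

AdapterKind PickAdapter(const std::string& connection) {
    // OLE DB connection strings carry a "Provider=" token; treat its
    // absence as a direct SQL Server connection.
    return connection.find("Provider=") != std::string::npos
               ? AdapterKind::OleDb
               : AdapterKind::Sql;
}
```

The application calls one Fill; the branch on the adapter kind lives in exactly one place.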
It is impossible to describe how agile this approach makes our application logic, and how solid it makes the underpinning capability logic--more so each time we use it. I think the SOAL approach got obfuscated by the author's having to support both the original VS 6 version and .NET in the examples, since the book was published on the boundary of .NET's initial release. I would hope that he would consider a more concise second edition to expand on .NET's capabilities.
It's good to hear about your experiences. It is unfortunate, IMHO, that SOAL is not a better written book. The ideas and concepts are exciting. But I find I learn more about these concepts from sources such as Marc's articles and a couple good books on software architecture (that give a much deeper and simultaneously better balanced presentation of the concepts than SOAL does). Anyway, the SOAL concepts are interesting and it's good to hear your feedback.
Marc:
Back in the MS-DOS times, I was trying to "build" programs without programming, directly from the model I designed. There was no OO (and I needed it a lot).
Recently, I found some technologies very much related:
- UML diagrams
- e-Workflow
- model-driven applications (MDA)
- persistence frameworks
- software components (like COM+)
I would like to solve most of my programming tasks using a visual editor, as in any good workflow diagram.
I don't want to worry about the database design; that is the task of a persistence framework, able to receive dynamic requests for data and metadata updates.
I would love to use inheritance and to use classes to model my domain. So, my objects or documents will need a description similar to UML classes and their relations.
In addition, if my application is to run like an MDA, I will have to define actors, permissions, scenes, actions--the concepts that allow the core engine to turn my design into a real application.
So, I found your articles very interesting, because the use of XML, its transformations, and its visualizations can complement the idea of "no programming" (or programming in a simpler way).
Today I was surfing the net and found something related to this topic: http://ki.cs.tu-berlin.de/~stauch/Diplom/ (Implementation of Workspaces). I found that it supports the idea of a "workflow programming paradigm".
Thanks,
David Izada Rodríguez
Santiago
Chile
OOPS, where is the simplicity?
In the olden days of COBOL programming, we had only 2 layers (in fact, we didn't have layers at all, because we never thought of them). We had libraries and programs which called these library-based functions. That was the extent of code reuse. There were no patterns, no objects, no relational databases.
We had big systems running even then, as is proved by the mainframe COBOL systems which survive even today in large corporations. There were no lost pointers or memory drains, and the cost to the customer was limited to the Unix box and the consultants' fees. Nowadays we have to pay for the development tools used by the consultant (or higher fees in lieu of them), an RDBMS package, and expensive maintenance contracts, because the programmer who debugs a modern system has to know the 'architecture' used by the developer. (When I use the word 'architecture' with some of my colleagues, they look at me as if I have traded professions; in the golden olden days, people who built stylish buildings did architecture--programmers designed systems.) There are now layers upon layers of software, placing the user on top of a suite of software layers, each as fluffy and rich as a good pillow.
I sometimes wonder what leap-frog we have achieved from the customer's perspective. The end user of the IT system, a functional manager, wants information either in a report or on screen, and he continues to get it the same way. The screen may be a little more colorful and attractive, but when I am fighting to keep my inventory under control and my profits high enough to pay dividends, the color of the screen is my least priority. I want an IT system which can be a black box, which just makes things easier for me, because it is intelligent and fast and maintainable. Period. I have a strange feeling that the inventory and financial systems we developed in the eighties were as intelligent, fast, and maintainable as the systems developed currently, if not more so.
Though I agree that adding a lot of computer-science concepts like OO, design patterns, UML, and use cases makes the process more scientific, I feel the complexity added in terms of layering and multiple approaches has made software maintenance an esoteric art form. Tell you what: if Maruti India (which is a Suzuki version in India) can claim to have a service station everywhere, even on a remote mountain top, it is because the mechanic can go under the hood without any lights on, tell which component is where without even seeing it, and even make out its state by its sound. That was achieved by simple standardization. If I were let loose today in an OO-centric, n-tiered system and asked to fix a bug, I would delve into the documentation and code to see what the design approach was before I even set a finger on the code. If, in my olden days, I was let loose on a system to debug, all I had to do was figure out which directory had the latest version of the software and data--I was then almost as comfortable as a Maruti mechanic.
Software is complex by nature. No two pieces of software are the same--just as no two grains of sand are the same. Instead of making it simpler, why do we make it more complex by adding so much technology to it?
Umm... have you lately tried to sell software that was written back in your old COBOL days?
Users have become much more demanding, and not just in the number of colors.
"Back in the old days" you could say "this application requires a 24-needle Epson printer, does not run on DR-DOS, and you'll spend about an hour fiddling with config.sys until everything runs fine." Try that today!
Systems now need to interface with components written by someone else.
Software is now rarely written by a single person; it's written by a team, which forces a qualitative jump in complexity on us.
I understand your point, and yes, instead of adding complexity we sometimes should use the simpler approach--but there are reasons for the complexity!
Italian is a beautiful language. amare means to love, and amara bitter.
sighist | Agile Programming | doxygen
You've introduced some interesting ideas here. I'd love to see a series of articles on this stuff.
Michael
Fat bottomed girls
You make the rockin' world go round -- Queen
I'd love to see a series of articles on this stuff.
That's definitely in the works. I just finished the text, code, and drawings for Part II. I'll probably post it tomorrow some time.
Marc
I think there are some good ideas here, but I do not see the approach as outside object-oriented programming. This is just a specific design concept which would and should be implemented as classes. I have used some of the same approaches and implemented them as classes in an object-oriented program.
Ron Marshall
rmarshall@tritechsoft.com
but I do not see the approach as outside object oriented programming.
Hi Ron,
You are correct--the approach is NOT outside OOP. I hadn't intended to imply that it should be--merely that OOP is mis-applied in many cases.
This is just a specific design concept which would and should be implemented as classes.
Yes, that is absolutely correct. However, I seem to get a lot of blank stares when I try to explain this concept to most people. Mind you, I fully think that it is my inability to get the concept across. Of course, what I've noticed is that once people start using this design concept, they can't explain it either! (but they really like it)
I have used some of the same approaches and implemented them as classes in an object oriented program.
Excellent!
Thank you for the feedback. I'd like to hear more from you as I post more of the guts of this thing.
Marc
Marc, I will try to elaborate a little. I have used a concept I call Info classes. A group of Info classes would be somewhat similar to your data hub. The concept of the Info class originated from separating all the external data of a class into a separate class. This seems like a natural structure, and I thought of it as an organic-cell analogy. People do a somewhat similar thing by feeding a filled-in structure to a class. And some would argue that an Info class should be a structure, because all the data would most likely be public. However, the Info class initializes all the data to reasonable values and has error checking, data conversion, and persistence handling. The process class is set by a copy of, or pointer to, an Info class. Thus several process classes can look at the same data at the same time. Several process classes can use the same Info class. Info classes can use single or multiple inheritance. A process class only needs to use a subset of an Info class. This concept has worked well in computer-aided design applications where many processes use subsets of the same data.
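A minimal C++ sketch of the split (the CircleInfo example and all of its names are invented for illustration):

```cpp
#include <memory>

// Illustrative only: the external data lives in its own Info class,
// with defaults and error checking; process classes hold a copy or a
// (shared) pointer to it and compute with it.
class CircleInfo {
public:
    CircleInfo() = default;        // every field starts at a reasonable default
    bool SetRadius(double r) {     // validation lives with the data
        if (r <= 0.0) return false;
        radius_ = r;
        return true;
    }
    double Radius() const { return radius_; }
private:
    double radius_ = 1.0;
};

constexpr double kPi = 3.14159265358979;

// Two independent process classes looking at the same Info object.
class AreaProcess {
public:
    explicit AreaProcess(std::shared_ptr<const CircleInfo> info) : info_(std::move(info)) {}
    double Run() const { return kPi * info_->Radius() * info_->Radius(); }
private:
    std::shared_ptr<const CircleInfo> info_;
};

class PerimeterProcess {
public:
    explicit PerimeterProcess(std::shared_ptr<const CircleInfo> info) : info_(std::move(info)) {}
    double Run() const { return 2.0 * kPi * info_->Radius(); }
private:
    std::shared_ptr<const CircleInfo> info_;
};
```

Both processes see the same data at the same time, and neither knows or cares that the other exists.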
I have found that when writing Multi-Document Interface (MDI) MFC applications, it is best to put a minimal amount of code in the view and document classes. This minimizes coupling and maximizes reuse. Most of the code is handled by what I call manager classes, somewhat similar to the process manager. The manager classes are members of the application class, which I suppose has a function similar to the component manager.
Ron
Hi Ron,
Thanks for the elaboration.
This seems like a natural structure and I thought of it as an organic cell analogy.
It seems like a lot of good programming practices can be derived from looking at processes in nature. Funny, also, because I wrote an article called Organic Programming:
http://www.codeproject.com/cpp/Organic_Programming.asp[^]
Thus several process classes can look at the same data at the same time.
Very nice! I'm curious about a few things:
1. Do you have to deal with processes overwriting data in the Info class, and if so do you have a built-in semaphore mechanism?
2. Do you allow multithreaded processes, where several processes are looking at the same data simultaneously?
3. Do you incorporate workflows? I can see an architecture where one process creates an Info class that another process or processes consume, creating new Info classes, etc.
4. Do you get into dependency problems with the Info class? Say, process 1 uses information that process 2 generates, but process 2 is waiting on info from process 1?
Anyways, it sounds like a good approach. You should write an article about it!
BTW--interesting company--specializing in tax software for insurance companies. Wow. What's your role there?
Marc
1. Do you have to deal with processes overwriting data in the Info class, and if so do you have a built-in semaphore mechanism?
2. Do you allow multithreaded processes, where several processes are looking at the same data simultaneously?
3. Do you incorporate workflows? I can see an architecture where one process creates an Info class that another process or processes consume, creating new Info classes, etc.
4. Do you get into dependency problems with the Info class. Say, process 1 uses ...(continued)
I was pointing out that the Info class gives you a lot of choices. Normally the process object would get a copy of an Info object. A process class could be designed to work with an Info object copy or an Info object pointer. You would want to use a pointer to maintain concurrency in clearly defined situations. I would try to avoid the dependency problems you can get into by using pointers to Info objects. In practice I have mostly used copies of Info objects. In a multithreaded application I used an Info object to communicate data between a GUI thread and a worker thread; Info object updates were protected by a critical section. In a graphical computer-aided design program, copying is a workflow issue and has to be managed carefully. I used the manager class to do this. The manager class is a way to split up process expertise.
Furthermore, you can have a master Info class that presents more than one interface Info class to different process classes. A single process class only has to know its Info interface class, but all the data conversion can be concentrated in the master Info class.
I found the article on the Linda language interesting. I see some similarities, although Linda is a much more abstract approach.
http://iamwww.unibe.ch/~scg/Research/ComponentModels/linda.html
I am currently working on a C++ COM component used by Visual Basic apps. The COM component provides the GUI and database interface.
Hi Ron,
I very much like your concept of an Info object and a process class. If you don't mind, I'd like to incorporate this concept in some form in the AAL, giving due credit, of course, to you for bringing your design to my attention! Would that be OK with you?
Marc
Marc, that would be fine with me.
Ron