I have built a system integration between system A and system B. System A syncs each new record to system B via web services whenever a record is inserted into system A's local database. When an exception or failure causes the web service to go down, system A cannot sync the record to system B because the service exposed by system B cannot be reached. Is there any best practice for handling a real-life scenario like this? A record created in system A must not be duplicated in system B. What if I schedule a job that checks for records that failed to sync to system B and, once the web service is back online, triggers a patching operation that pushes those records from system A to system B?
A record created in system A must not be duplicated in system B
If that is a hard requirement then you cannot create the record on A unless B is available. Basically you would create it on B first and only then create it on A.
If, however, there is a way to ensure that the record on B can be made unique, possibly in a post-processing step, then you could proceed with a queuing strategy, which would involve the following:
- Determine an algorithmic approach to making the record on B unique. This might or might not include some manual post-processing intervention.
- Create a data store on A suitable for storing the data needed to create the record on B.
- Create a timer that periodically checks for queued records; if any exist, it attempts to create the record on B. If it succeeds, it removes the record from the queue on A. If it fails (B is down), it waits until the next time the timer fires.
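The queuing strategy above can be sketched roughly like this (a minimal sketch in Python, using SQLite as a stand-in for A's queue store; `create_record_on_b` is a hypothetical wrapper around the real web-service call, and all table and function names are my own):

```python
import sqlite3

def create_record_on_b(payload):
    # Hypothetical wrapper around the real web-service call to system B;
    # it should raise an exception when B is unreachable.
    raise NotImplementedError("replace with the real web-service call")

def process_queue(conn, create_on_b=create_record_on_b):
    """Try to push every queued record to system B; failures stay queued."""
    rows = conn.execute(
        "SELECT id, payload FROM sync_queue ORDER BY id"
    ).fetchall()
    for row_id, payload in rows:
        try:
            create_on_b(payload)
        except Exception:
            break  # B is down: stop and wait for the next timer tick
        # Only after B accepted the record is it removed from the queue,
        # so a crash between the two steps can at worst cause a retry.
        conn.execute("DELETE FROM sync_queue WHERE id = ?", (row_id,))
        conn.commit()
```

Note that the delete-after-success ordering means a retry could re-send a record B already accepted, which is exactly why the first bullet (making the record on B unique, i.e. the call idempotent) matters.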
You can have a field in system A that records whether the record has been synced to system B. Something like:
SyncStatus - store the status here
IsSynced - true or false
This field should be updated last: only once the record in system A has actually been synced to system B do you set it.
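The "update the flag last" ordering can be sketched like this (Python with SQLite for illustration; `push_to_b` stands in for the real web-service call, and the table and column names are assumptions):

```python
import sqlite3

def sync_pending(conn, push_to_b):
    """Push unsynced rows to system B, flipping IsSynced only on success."""
    rows = conn.execute(
        "SELECT Id, Data FROM Records WHERE IsSynced = 0"
    ).fetchall()
    for row_id, data in rows:
        try:
            push_to_b(data)          # may raise if B's service is down
        except Exception:
            continue                 # leave IsSynced = 0; retry next run
        # The flag is updated last, only after B confirmed the record
        conn.execute("UPDATE Records SET IsSynced = 1 WHERE Id = ?", (row_id,))
        conn.commit()
```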
Let's get a little more advanced: you can always roll back the update process if the web service goes down in the middle of it, i.e. hold the sync/insert process. Have a look at Entity Framework's rollback (transaction) functionality.
I would use whatever "high availability" solutions are available for your database server - attempting to write your own is going to be a major undertaking. (If you can't afford any downtime at all then you need to look at clustering.)
I thought the purpose of Entity Framework was to relieve you of the duty of writing queries (whether in SQL or LINQ). I have worked with frameworks that let you manipulate entities without worrying about queries; I think that is the case with NHibernate, but it is not the case with Entity Framework.
Okay, I have a need for a pretty generic Directory Monitor utility.
It will run as a GUI and as a Service. The GUI is for configuring and testing.
The service will run on the active configured items (multiple).
I am stuck on a design implementation issue. It is NOT that important to me,
and that creates the problem. The usages of this "DirMon" are typically to
detect a file has been updated/saved, and to force it into source control.
Furthermore, we use a syntax that normally does not care about the individual
file, it handles the whole directory structure.
Think: Management Folders Sync'ed with other managers through version control.
I face 2 challenges. The first is that I want this to be "better" than that,
because I am so close to a VERY USEFUL end-user-type tool. The second is that
ReadDirectoryChanges() and GetQueuedCompletionStatus() have 3 issues:
1) When overwriting an existing file, I get 2 notifications for that file
2) They can (and will) become lossy if overwhelmed with changes, especially if I
cannot keep up with them.
3) I cannot have a new process start while an existing one is finishing
My quick hack was to simply add a timer between the notification and me firing
the event. This is far from a great idea.
After thinking about it, I realized this is a real problem. I am probably missing a
well accepted Design Pattern...
So, that is my question... How would you design this last piece, so that as the
various events fire, that I could keep track of various changes and make sure that
they were dealt with?
[I am willing to add a Flag for the user to choose if they NEED individual or summary
events, so I can know. If summary, I will use my current approach where I just keep
resetting the timer until things calm down. But if they need/want individual events,
what to do?]
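The "keep resetting the timer until things calm down" approach described above is essentially a debounce. A minimal sketch (Python, using threading.Timer; the class and parameter names are my own):

```python
import threading

class Debouncer:
    """Coalesce a burst of change notifications into a single event.

    Each call to notify() restarts the countdown; the callback fires
    only after `quiet_seconds` pass with no further notifications.
    """

    def __init__(self, quiet_seconds, callback):
        self.quiet_seconds = quiet_seconds
        self.callback = callback
        self._timer = None
        self._lock = threading.Lock()

    def notify(self):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # restart the countdown
            self._timer = threading.Timer(self.quiet_seconds, self.callback)
            self._timer.daemon = True
            self._timer.start()
```

This handles the double-notification on overwrite (both land inside the quiet window and collapse into one callback), but by design it cannot deliver per-file events, which is why the summary-vs-individual flag question arises.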
Monitoring a folder, with or without its sub-folders, is a pretty common task. Many developers have experienced the oddities of System.IO.FileSystemWatcher, and you may find some tips & tricks or articles on its multiple or missing events. Or did you write your own FileSystemWatcher? Note that too short a polling interval may cost too many resources.
add a Flag for the user to choose
Are you sure any end user - who is not a developer, you told us - is interested in that? I.e. do you expect them to understand your underlying technical solution? I doubt that. They just need to know if there are files to be synced. By the way, why don't you do that automatically?
Okay, I am doing this in Delphi (not .NET), so no System.IO... stuff..
Even if I end up configuring that flag for them myself, setting it up on their machine, I am good with it.
Bernhard Hiller wrote:
They just need to know if there are files to be synced. By the way, why don't you do that automatically?
That is what I am building for them. It will be configured once and run as a service. Whenever they save, it will do the appropriate sync, which for different clients could be Mercurial, SVN, etc. I am NOT giving them a tool to see what has changed (that is useless; they know what changed, in general). They just forget to jump into another tool and commit. In some cases, the action is triggered by reports generated into a folder.
Since 99% of the use cases don't care about specific files, I will simply ignore that case. I will let it run, and see if a Version 2.0 is needed to handle more.
I think, sometimes, I get in my own way, trying to cover EVERY possible use case...
They can (and will) become lossy if overwhelmed with changes
People are updating these files and each person has their own files - why would you lose a notification?
Kirk 10389821 wrote:
that I could keep track of various changes
When I edit I like to save intermittently just in case the power goes out. How are you going to deal with that behavior especially if the file is large?
Kirk 10389821 wrote:
Management Folders Sync'ed with other managers through version control.
And your comments all seem to be about putting the files into the service, but what about getting them out?
As a user, it is going to annoy me to no end if my system starts 'pausing' every 5 seconds every time I stop typing for just a millisecond. And it is probably absolutely useless to have only a piece of some update added to version control.
So a timer seems like a much better solution. Every 5 minutes, see if a file has changed more than 5 minutes ago but less than 10 minutes ago; if so, sync. And at the same time, check for updates to existing documents.
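That periodic check could look roughly like this (a sketch in Python; the 5-to-10-minute window comes from the suggestion above, and the function name is my own):

```python
import os
import time

def files_to_sync(root, now=None, window=(300, 600)):
    """Return files under `root` last modified between 5 and 10 minutes ago.

    Run this from a timer every 5 minutes; files still being edited
    (modified less than 5 minutes ago) are skipped until they settle.
    """
    now = time.time() if now is None else now
    lo, hi = window
    hits = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            age = now - os.path.getmtime(path)
            if lo <= age < hi:
                hits.append(path)
    return hits
```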
The business layer can be placed in two locations: 1. on the client side and 2. on the server side.
The client-side business layer does all client-side logic before sending data to the service (assuming the application has both a client and a service), and the server-side business layer deals with logic that runs after retrieving data from the DB or before saving to the DB.
A layer is nothing but code/a project that performs certain actions. Also remember that a business layer must not contain anything UI-related; if it does, it is not a business layer. Those UI-related things have to be moved somewhere else, most probably to the presentation layer.
On the other hand, if there is only one client that deals with the DB, then there will be only one BL (business layer), which holds all the logic.
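The separation described above can be sketched in a few lines (Python; the functions and the discount rule are purely illustrative, not from the original post):

```python
# Presentation layer: formatting for display only, no business rules.
def render_price(amount_cents):
    return f"${amount_cents / 100:.2f}"

# Business layer: rules only, no UI concerns. Lives client-side
# (validated before sending to the service) or server-side
# (applied before saving to the DB).
def apply_discount(amount_cents, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return amount_cents * (100 - percent) // 100
```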
In the 1970s-80s the 4-bit TMS1000 was used in lots of games (the Speak & Spell) and consumer products (microwaves, etc.), including a model railroad controller called the Hornby Zero One.
I would like to upgrade the controller's code slightly by adding some more locomotive addresses. It has 16; I need to go to, say, 64.
Apparently there is a way to download the existing code through a test port; does anyone know about this?
What I would like to do is download the code, amend it for the larger number of addresses, and then upload it onto a modern 8-bit chip.
Is this reasonably simple? Is there someone out there who has done this?
Off-hand, it doesn't sound easy, and it might help if you could provide links to what information you have found so far.
E.g. "Apparently there is a way to download the existing code thru a test port" where did you find that information?
You'll never get very far if all you do is follow instructions.
There are various implementations, and yes, some will be compatible with .NET. Whether ASTA's implementation works with .NET 4 is something that you should ask them. From the Google result it appears that there is a COM version.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]