A DataSet with multiple tables, especially tables that share columns and, more specifically, tables in a parent-child relationship, poses a real challenge to update, and this is especially true if you are using Enterprise Library. A good example is a DataSet holding Orders and OrderDetails tables, where Orders has an auto-generated primary key. For updating multiple tables, the Enterprise Library June 2005 release lets you obtain a DataAdapter that accepts a single table as a parameter, so you have all the granularity you need.
Now let's consider a scenario. Suppose we have a purchase order management system and we want our suppliers to deliver items in installments, so a given order can have many consignments. After normalization we end up with two consignment tables: one containing the general details of a consignment and the other containing the details of the items consigned.
The screenshot below shows the part of our database schema that will hold this data:
Figure 1. Database schema
Now let's look at the stored procedure we have for inserting Consignments (the parent):

CREATE PROCEDURE [dbo].[sp_ConsignmentsInsert] (
    ...
)
AS
SET NOCOUNT ON

INSERT INTO [Consignments] (
    ...
)
VALUES ( ... )

SELECT ... from [Consignments] where [ConsignmentNo] = SCOPE_IDENTITY()

Figure 2. Stored procedure for inserting Consignments.
For ConsignedItems (the child) we have the following procedure for inserting data:

CREATE PROCEDURE [dbo].[sp_ConsignedItemsInsert] (
    ...
)
AS
SET NOCOUNT ON

INSERT INTO [ConsignedItems] (
    ...
)
VALUES ( ... )

SELECT ... from [ConsignedItems] where
    [ConsignmentNo] = @ConsignmentNo AND [ItemCode] = @ItemCode

Figure 3. Stored procedure for inserting ConsignedItems.
First of all, let's look at some features of the code snippets included in this article:

- ConsignmentsDB and ConsignedItemsDB are table gateways whose Update methods are passed the table and a transaction. This is important if you want all the changes to run inside one transaction.
- I avoided using Enterprise Services for managing transactions and hence had to include transaction objects in my business object, even though that is not good practice. ADO.NET 2.0 introduces something better, but I find Enterprise Services cumbersome to use. (Sorry for this.)
- Consignments and ConsignedItems are tables in an offline DataSet. My business object inherits from a typed DataSet, ConsignmentsDataSet, so that I have direct access to the tables and to methods such as Merge(). I have included the source files for you to go through and see what I mean. Sorry that I couldn't include a demo, but I hope this information is sufficient.

Remember to add/update the parent first and then the child, and to delete the child first and then the parent. Here Consignments is the parent table of ConsignedItems. Because we don't want to update the DataSet directly, as this would lose all other unsubmitted changes in the DataSet, we fetch those changes separately into another table using <Table>.GetChanges(). The row state determines which rows are fetched from the table.
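To make the row-state filtering concrete, here is a minimal sketch (the column values are purely illustrative) of pulling only the inserted and modified parent rows out of the table without touching the DataSet itself:

```csharp
using System;
using System.Data;

static class GetChangesSketch
{
    // Returns only the inserted/modified rows of a stand-in Consignments table.
    public static DataTable FetchParentChanges()
    {
        DataTable consignments = new DataTable("Consignments");
        consignments.Columns.Add("ConsignmentNo", typeof(int));
        consignments.Columns.Add("Remarks", typeof(string));
        consignments.Rows.Add(1, "already saved");
        consignments.AcceptChanges();                // existing row: Unchanged

        consignments.Rows.Add(2, "new row");         // row state: Added
        consignments.Rows[0]["Remarks"] = "edited";  // row state: Modified

        // Fetch only inserts and updates; the original table is left intact.
        return consignments.GetChanges(DataRowState.Added | DataRowState.Modified);
    }

    static void Main()
    {
        Console.WriteLine(FetchParentChanges().Rows.Count); // 2
    }
}
```

The Unchanged row stays behind, so nothing the user has not yet submitted is disturbed.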
Now here we start. First, fetch the changes from the parent table (inserts and updates):

Figure 4. Fetching changes from the parent table.

The business object exposes a method, SaveChanges(), shown in the figure below, that submits the changes in the DataSet to the database for persistence:
public void SaveChanges()
Figure 5. The SaveChanges method in the ConsignmentsBO, a business object.
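The full body of SaveChanges is in the download; as a rough sketch only, assuming the ConsignmentsDB and ConsignedItemsDB gateways from this article and a connectionString field on the business object (both names taken from the text, error handling trimmed), the ordering it must follow looks like this:

```csharp
// Sketch: ConsignmentsDB/ConsignedItemsDB are the gateways discussed above;
// connectionString is assumed to be a field of the business object.
public void SaveChanges()
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        SqlTransaction trans = conn.BeginTransaction();
        try
        {
            // Parent first for inserts/updates...
            DataTable parentChanges =
                Consignments.GetChanges(DataRowState.Added | DataRowState.Modified);
            if (parentChanges != null)
            {
                ConsignmentsDB.Update(parentChanges, trans);
                // Pull the re-fetched identity keys back in via the RowID.
                ReconcileHelper.Reconcile(Consignments, parentChanges, "RowID");
            }

            // ...then the child.
            DataTable childChanges =
                ConsignedItems.GetChanges(DataRowState.Added | DataRowState.Modified);
            if (childChanges != null)
                ConsignedItemsDB.Update(childChanges, trans);

            // Deletes go the other way round: child first, then parent.
            DataTable childDeletes = ConsignedItems.GetChanges(DataRowState.Deleted);
            if (childDeletes != null)
                ConsignedItemsDB.Update(childDeletes, trans);
            DataTable parentDeletes = Consignments.GetChanges(DataRowState.Deleted);
            if (parentDeletes != null)
                ConsignmentsDB.Update(parentDeletes, trans);

            trans.Commit();
            AcceptChanges();
        }
        catch
        {
            trans.Rollback();
            throw;
        }
    }
}
```

Everything runs on one transaction, which is exactly why the gateways take the transaction as a parameter.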
Note: we have used transactions because we want the whole batch of changes to succeed as a whole or fail as a whole, never partially. All of this is straightforward when the Consignments table has no identity column as its primary key. But what if that is not the case? Then we have to manage the problem of an identity column as a primary key.
The idea is very simple but promising, although some may find it shocking since it is not a built-in solution in the .NET Framework. The issue in the above scenario occurs when we want to merge the original DataSet with the rows that have been updated. The simplest way is to introduce another column in the DataTable, say RowID, that is itself an auto-incrementing identity column, so that every row carries a unique row ID. I often ask myself why Microsoft has not introduced something like this; it would be of great use to developers, and even in SQL Server it would be of great benefit.
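ADO.NET already lets you build such a client-side identity column with DataColumn.AutoIncrement. Here is a small sketch (table and column names illustrative) of tagging every row with a unique RowID:

```csharp
using System;
using System.Data;

static class RowIdSketch
{
    // Builds a table whose RowID column numbers every new row automatically.
    public static DataTable BuildTableWithRowId()
    {
        DataTable table = new DataTable("Consignments");

        // The extra client-side identity column proposed above.
        DataColumn rowId = new DataColumn("RowID", typeof(long));
        rowId.AutoIncrement = true;
        rowId.AutoIncrementSeed = 1;
        rowId.AutoIncrementStep = 1;
        rowId.ReadOnly = true;
        table.Columns.Add(rowId);

        table.Columns.Add("Remarks", typeof(string));

        table.Rows.Add(null, "first");   // RowID assigned automatically
        table.Rows.Add(null, "second");
        return table;
    }

    static void Main()
    {
        DataTable t = BuildTableWithRowId();
        Console.WriteLine(t.Rows[0]["RowID"]); // 1
        Console.WriteLine(t.Rows[1]["RowID"]); // 2
    }
}
```

Because GetChanges copies the RowID along with the rest of the row, the value survives the round trip and can be used as a match key afterwards.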
Now, after fetching the newly added rows from the DataSet and passing them to the DataAdapter, which propagates the changes to the database and re-fetches the newly generated identity primary key as our stored procedures are geared to do (via SCOPE_IDENTITY), we need an efficient way to reconcile the result with the original table in the DataSet. This is where the RowID plays its part: we use it to match each updated row with the corresponding row in the original DataSet.
Here is a helper to do that:
public class ReconcileHelper
{
    public static void Reconcile(DataTable source, DataTable updated, string commonKey)
    {
        foreach (DataRow ur in updated.Rows)
        {
            DataRow[] dr = source.Select(commonKey + "=" + ur[commonKey].ToString());
            if (dr != null && dr.Length > 0)
                dr[0].ItemArray = ur.ItemArray; // copy re-fetched values (identity PK) back
        }
    }
}
Figure 6. Full implementation of the ReconcileHelper class, which reconciles identity primary keys after the re-fetch from the database (see the stored procedures in Figures 2 and 3).
In this method:

- source is the table in the DataSet that holds the rows that were added by the user.
- updated is the table holding only the added rows that were passed to the data adapter for propagation into the database.
- commonKey is the name of a key used instead of the primary key to identify rows, because the identity primary key is now different for the two row sets (the original DataSet row and the updated row).
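To see the reconciliation in action on two tiny in-memory tables, here is a self-contained sketch of the same matching logic keyed on RowID (the value 1042 simply stands in for a database-generated key):

```csharp
using System;
using System.Data;

static class ReconcileDemo
{
    // Demonstrates RowID-based reconciliation on two small in-memory tables.
    public static DataTable Run()
    {
        // Source: the user's row, with a client-side RowID and a placeholder
        // primary key (the real identity is not known yet).
        DataTable source = new DataTable("Consignments");
        source.Columns.Add("RowID", typeof(long));
        source.Columns.Add("ConsignmentNo", typeof(int));
        source.Rows.Add(1L, 0);

        // Updated: the same row after the adapter round trip, now carrying the
        // identity value the stored procedure re-fetched via SCOPE_IDENTITY.
        DataTable updated = source.Clone();
        updated.Rows.Add(1L, 1042);

        // The same matching logic as ReconcileHelper.Reconcile, keyed on RowID.
        foreach (DataRow ur in updated.Rows)
        {
            DataRow[] match = source.Select("RowID=" + ur["RowID"]);
            if (match.Length > 0)
                match[0].ItemArray = ur.ItemArray;
        }
        return source;
    }

    static void Main()
    {
        Console.WriteLine(Run().Rows[0]["ConsignmentNo"]); // 1042
    }
}
```

After reconciliation the source row carries the real identity key, so child rows can safely reference it.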
OK, now we come to the actual updating method in the table gateway:
public static void Update(DataTable table, IDbTransaction trans)
{
    Database myDatabase = DatabaseFactory.CreateDatabase();
    DBCommandWrapper insertCommand = ...  // wraps sp_ConsignmentsInsert (Figure 2)
    DBCommandWrapper updateCommand = ...  // wraps the update stored procedure
    DBCommandWrapper deleteCommand = ...  // wraps the delete stored procedure
    IDbDataAdapter dbAdapter = ...        // adapter wired to the three commands
    ...
}
Figure 7. The Update method in the TableGateway.
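If you are not using Enterprise Library, the same per-table gateway can be sketched with plain ADO.NET. The procedure name comes from Figure 2; the @Remarks parameter and Remarks column are purely illustrative stand-ins for your real columns:

```csharp
using System.Data;
using System.Data.SqlClient;

static class PlainAdoGateway
{
    // A plain ADO.NET equivalent of the table gateway's Update method.
    // Parameter/column names are illustrative; adjust them to your schema.
    public static void Update(DataTable table, SqlTransaction trans)
    {
        SqlCommand insert = new SqlCommand("sp_ConsignmentsInsert",
                                           trans.Connection, trans);
        insert.CommandType = CommandType.StoredProcedure;
        insert.Parameters.Add("@Remarks", SqlDbType.NVarChar, 200, "Remarks");

        // The procedure SELECTs the row back (SCOPE_IDENTITY), so tell the
        // adapter to pour that first returned record into the source row.
        insert.UpdatedRowSource = UpdateRowSource.FirstReturnedRecord;

        SqlDataAdapter adapter = new SqlDataAdapter();
        adapter.InsertCommand = insert;
        // adapter.UpdateCommand / adapter.DeleteCommand are wired the same way.

        adapter.Update(table);
    }
}
```

UpdateRowSource.FirstReturnedRecord is what makes the re-fetch of the identity key land back in the DataTable row that was just inserted.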
I find this self-explanatory. If you don't understand the snippets, please go through the collection of classes in the download; they will help you explore the whole solution. In the end, I apologize for not including everything you might expect, e.g. all of the stored procedures mentioned in this article; I thought they were unnecessary and wanted to keep the article as short as possible. Lastly, I welcome your contributions toward a better solution.