ADO.NET 2.0 Features

This article summarises the new and updated features of ADO.NET 2.0, which ships with .NET Framework 2.0.

Introduction

This article summarizes the new and updated features of ADO.NET 2.0, which ships with .NET Framework 2.0. In the articles that follow, I will discuss these features in more detail with code samples.

Below is a list of the new and updated features in ADO.NET 2.0.

1. Bulk Copy Operation

Introduction

Bulk copying of data from one data source to another data source is a new feature added to ADO.NET 2.0. Bulk copy classes provide the fastest way to transfer a set of data from one source to another.

Each ADO.NET data provider includes bulk copy classes. For example, in the SQL Server data provider, the bulk copy operation is handled by the SqlBulkCopy class, which is described in Figure 1. As you can see from Figure 1, the source data can be supplied in one of four forms - a DataReader, a DataSet, a DataTable, or XML.

Figure 1. Bulk Copy operation in ADO.NET 2.0.

Using the bulk copy operation, you can transfer data between two tables on the same SQL Server, between two different SQL Servers, or even between two different types of database servers.

Filling Data from the Source

The first step in bulk copying data from one data source to another is to read the data from the source database. This source data can be loaded into a DataSet, a DataTable, or a DataReader.

C#
// Select data from the Products table ('source' is an open SqlConnection
// to the source database)
SqlCommand cmd = new SqlCommand("SELECT * FROM Products", source);
// Execute the reader
SqlDataReader reader = cmd.ExecuteReader();

Creating SqlBulkCopy Object

In ADO.NET 2.0, each data provider has a bulk copy class that provides the bulk copy functionality. For example, the SQL Server data provider has the SqlBulkCopy class.

The SqlBulkCopy constructor takes a connection string or a SqlConnection object as its first parameter, which identifies the destination data source. After creating the object, you set the DestinationTableName property to the name of the table you want to copy data into.

C#
// Create SqlBulkCopy
SqlBulkCopy bulkData = new SqlBulkCopy(destination);
// Set destination table name
bulkData.DestinationTableName = "BulkDataTable";

Copying Data to the Destination

The SqlBulkCopy class provides the WriteToServer method, which writes data from a DataReader, a DataTable, or an array of DataRows to the destination data source.

C#
bulkData.WriteToServer(reader);

In this code, I fill a DataReader from the source data source and pass it to WriteToServer. You can instead fill a DataTable - for example, from a DataSet or an XML document - and pass the DataTable as the input parameter of the WriteToServer method.

Closing SqlBulkCopy Object

The Close method of SqlBulkCopy closes the bulk copy operation.

C#
bulkData.Close();

Complete Source Code

The following listing shows the complete source code.

C#
// Create source connection
SqlConnection source = new SqlConnection(connectionString);
// Create destination connection
SqlConnection destination = new SqlConnection(connectionString);

// Clean up the destination table. The destination database must already
// contain a table named BulkDataTable with the same schema as the
// source (Products) table - see the note and the setup sketch below.

SqlCommand cmd = new SqlCommand("DELETE FROM BulkDataTable", destination);
// Open source and destination connections.
source.Open();
destination.Open();
cmd.ExecuteNonQuery();
// Select data from Products table
cmd = new SqlCommand("SELECT * FROM Products", source);
// Execute reader
SqlDataReader reader = cmd.ExecuteReader();
// Create SqlBulkCopy
SqlBulkCopy bulkData = new SqlBulkCopy(destination);
// Set destination table name
bulkData.DestinationTableName = "BulkDataTable";
// Write data
bulkData.WriteToServer(reader);
// Close objects
bulkData.Close();
destination.Close();
source.Close();

Note

Before executing this code, make sure your database has a table named BulkDataTable with the same schema as the Products table.
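
If the destination table does not exist yet, one quick way to create it is to copy the schema of the Products table with a SELECT ... INTO statement before running the code above. This is only a sketch and assumes both tables live in the same Northwind database.

C#
// One-time setup: create BulkDataTable with the same schema as Products
// (the WHERE 1 = 0 clause copies the structure but no rows)
SqlCommand createCmd = new SqlCommand(
    "IF OBJECT_ID('BulkDataTable') IS NULL " +
    "SELECT * INTO BulkDataTable FROM Products WHERE 1 = 0", destination);
destination.Open();
createCmd.ExecuteNonQuery();
destination.Close();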

2. Batch Update

Batch updates can provide a huge improvement in performance by making a single round trip to the server for a batch of updates, instead of one trip per row, provided the database server supports the feature. The UpdateBatchSize property specifies the number of rows to be sent to the server in each batch.

Batch Updates in ADO.NET 2.0 for Improved Performance

When you updated a database using the DataAdapter in .NET 1.1, each command was sent to the database one at a time. This caused a lot of roundtrips to the database.

ADO.NET 2.0 has introduced the concept of Batch Updates, which allows you to designate the number of commands sent to the database at a given time. If used correctly, this can increase the performance of your data access layer by reducing the number of roundtrips to the database.

DataAdapter.UpdateBatchSize Property

The DataAdapter has an UpdateBatchSize property that allows you to set the number of commands that will be sent to the database with each request.

  • UpdateBatchSize = 1 disables batch updates.
  • UpdateBatchSize = X, where X > 1, sends X statements to the database at a time.
  • UpdateBatchSize = 0 sends the maximum number of statements at a time allowed by the server.

Command.UpdatedRowSource Property

When using batch mode, the UpdatedRowSource property of the command can only be set to UpdateRowSource.None or UpdateRowSource.OutputParameters.

Batch Updates Tutorial Using Northwind

You can test out batch updates on the Northwind Database by simulating an update to the Categories Table.

First, get the data from the Categories Table. The code below gets the information and places it in an untyped DataSet:

C#
SqlConnection connection = new SqlConnection("...");
SqlDataAdapter adapter = 
    new SqlDataAdapter("SELECT * FROM Categories",connection);
DataSet ds = new DataSet();
adapter.Fill(ds);

Simulate modification of the CategoryName of each category so there is something to update:

C#
foreach (DataRow dr in ds.Tables[0].Rows)
{
    // Re-assigning the value marks the row as Modified so it gets updated
    string categoryName = dr["CategoryName"].ToString();
    dr["CategoryName"] = categoryName;
}

Construct an update command for the SqlDataAdapter to update the data and assign it to the adapter's UpdateCommand property:

C#
SqlCommand command = new SqlCommand();
command.Connection = connection;
command.CommandText = "UPDATE Categories SET CategoryName = @CategoryName " +
                      "WHERE CategoryID = @CategoryID";
command.Parameters.Add(new SqlParameter("@CategoryID",
                 SqlDbType.Int)).SourceColumn = "CategoryID";

command.Parameters.Add(new SqlParameter("@CategoryName",
                 SqlDbType.NVarChar, 15)).SourceColumn = "CategoryName";
adapter.UpdateCommand = command;

Set UpdateBatchSize and UpdatedRowSource to the appropriate values. The Categories Table contains only eight records, and we have changed them all. I will set UpdateBatchSize to 2 for the sake of testing.

C#
adapter.UpdateBatchSize = 2;
command.UpdatedRowSource = UpdateRowSource.None;

Execute the Update Process

C#
adapter.Update(ds);

Hooking into the RowUpdating and RowUpdated Events of the DataAdapter

I hook into the RowUpdating and RowUpdated events of the DataAdapter (before calling adapter.Update) as shown below:

C#
// Counter fields declared at class level
private int _countUpdating;
private int _countUpdated;

// Attach the handlers before calling adapter.Update(ds)
adapter.RowUpdating +=
    new SqlRowUpdatingEventHandler(adapter_RowUpdating);
adapter.RowUpdated +=
    new SqlRowUpdatedEventHandler(adapter_RowUpdated);

private void adapter_RowUpdating(object sender, SqlRowUpdatingEventArgs e)
{
    _countUpdating++;
}

private void adapter_RowUpdated(object sender, SqlRowUpdatedEventArgs e)
{
    _countUpdated++;
}

Without batch updates, both events fire eight times, once for each row being updated.

However, with batch updates, RowUpdated is called only once per batch (8 rows / 2 rows per batch = 4 times), while RowUpdating is still called the usual eight times.

Conclusion

Batch updates can improve the performance of your data access layer by reducing the number of roundtrips to the database.

3. Data Paging

The command object now has a new execute method called ExecutePageReader. This method takes three parameters: CommandBehavior, startIndex, and pageSize. So if you want to get rows 101-200, you simply call this method with a start index of 101 and a page size of 100.

4. Connection Details

You can now get more details about a connection by setting the connection's StatisticsEnabled property to true. The connection object provides two new methods, RetrieveStatistics and ResetStatistics. The RetrieveStatistics method returns a Hashtable filled with information about the connection, such as the amount of data transferred, cursor details, buffer information, and transactions.
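
Here is a minimal sketch of gathering connection statistics with SqlClient (it assumes the System.Collections and System.Data.SqlClient namespaces are imported; the connection string and query are placeholders):

C#
SqlConnection connection = new SqlConnection("...");  // your connection string
connection.StatisticsEnabled = true;                  // start collecting statistics
connection.Open();

SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Products", connection);
cmd.ExecuteScalar();

// RetrieveStatistics returns an IDictionary of counter name/value pairs
IDictionary stats = connection.RetrieveStatistics();
foreach (DictionaryEntry entry in stats)
{
    Console.WriteLine("{0} = {1}", entry.Key, entry.Value);
}

connection.ResetStatistics();  // reset the counters to zero
connection.Close();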

5. DataSet.RemotingFormat Property

When DataSet.RemotingFormat is set to SerializationFormat.Binary, the DataSet is serialized in binary format instead of the XML-tagged format, which significantly improves the performance of serialization and deserialization.
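
As a short sketch (it assumes the System.IO and System.Runtime.Serialization.Formatters.Binary namespaces are imported, the DataSet has already been filled, and the file name is a placeholder), binary serialization with the BinaryFormatter looks like this:

C#
// 'ds' is a DataSet that has already been filled from a data adapter
ds.RemotingFormat = SerializationFormat.Binary;   // default is SerializationFormat.Xml

BinaryFormatter formatter = new BinaryFormatter();
using (FileStream stream = new FileStream("dataset.bin", FileMode.Create))
{
    // The DataSet is now written as a compact binary payload
    formatter.Serialize(stream, ds);
}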

6. DataTable's Load and Save Methods

In previous versions of ADO.NET, only the DataSet could load data from a source such as an XML document and save its data to persistent media. Now the DataTable also supports these Load and Save operations directly.

You can also load a DataReader into a DataTable by using the Load method.
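
Here is a minimal sketch (assuming an open SqlConnection named connection and the Northwind Products table; the file name is a placeholder) that loads a DataTable from a SqlDataReader and then persists and reloads it as XML:

C#
DataTable table = new DataTable("Products");

SqlCommand cmd = new SqlCommand("SELECT * FROM Products", connection);
using (SqlDataReader reader = cmd.ExecuteReader())
{
    table.Load(reader);   // fill the DataTable directly from the DataReader
}

// Save the table, including its schema, to an XML file
table.WriteXml("Products.xml", XmlWriteMode.WriteSchema);

// Load it back later into a fresh DataTable
DataTable restored = new DataTable();
restored.ReadXml("Products.xml");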

7. New Data Controls

In the Toolbox, you will see these new controls: DataGridView, DataConnector, and DataNavigator (see Figure 1 below). Using these controls, you can provide navigation (paging) support for the data in data-bound controls.

Figure 1. Data bound controls.
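
As a brief illustration, here is a Windows Forms sketch; note that in the final .NET 2.0 release the DataConnector and DataNavigator controls appear under the names BindingSource and BindingNavigator, and the control names below are placeholders for whatever you drop onto your form:

C#
// Controls dropped onto the form from the Toolbox:
//   dataGridView1, bindingSource1, bindingNavigator1
bindingSource1.DataSource = ds.Tables["Categories"];  // the data to display
dataGridView1.DataSource = bindingSource1;            // grid is bound through the source
bindingNavigator1.BindingSource = bindingSource1;     // navigator moves the current record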

8. DbProviderFactories Class

This class provides a list of the data providers available on a machine. You can use this class and its members to find the data provider best suited to your database when writing a database-independent application.
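
For example, here is a minimal sketch (assuming the System.Data.Common namespace is imported) that lists the registered providers and then obtains a factory by its invariant name:

C#
// List every ADO.NET data provider registered on this machine
DataTable providers = DbProviderFactories.GetFactoryClasses();
foreach (DataRow row in providers.Rows)
{
    Console.WriteLine("{0}  ({1})", row["Name"], row["InvariantName"]);
}

// Get the factory for a specific provider by its invariant name
DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.SqlClient");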

9. Customized Data Provider

By providing factory classes, ADO.NET now extends its support to custom data providers. You no longer have to write provider-dependent code: you program against the provider base classes and let the connection string (and provider name) do the trick for you.
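
A minimal sketch of such provider-independent code follows, under the assumption that the provider name and connection string come from configuration (placeholder values are shown here):

C#
string providerName = "System.Data.SqlClient";  // would normally be read from config
string connectionString = "...";                // placeholder connection string

DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

using (DbConnection connection = factory.CreateConnection())
{
    connection.ConnectionString = connectionString;
    connection.Open();

    DbCommand command = connection.CreateCommand();
    command.CommandText = "SELECT COUNT(*) FROM Products";
    object rowCount = command.ExecuteScalar();
    Console.WriteLine("Products: {0}", rowCount);
}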

10. DataReader's New Execute Methods

The command object now supports more execute methods. Besides the existing ExecuteNonQuery, ExecuteReader, ExecuteScalar, and ExecuteXmlReader methods, the new execute methods are ExecutePageReader, ExecuteResultSet, and ExecuteRow. Figure 2 shows all of the execute methods supported by the command object in ADO.NET 2.0.

Figure 2. Command's Execute methods.

Summary

ADO.NET 2.0 provides many new and improved features that help developers improve performance and reduce the amount of code they need to write.
