Posted 18 May 2012



Retrieving failed records after an SqlBulkCopy exception

Updated 19 Jun 2012 · CPOL
How to get a list of data rows that caused an exception (or several of them) in a SqlBulkCopy operation


Let me start by saying that the idea I used in this article is not originally mine. However, since I had only heard of it and was never able to find any actual examples of its implementation, I wrote the code to handle it myself.

With that out of the way - here's what this is about: Anyone who's worked with .NET's SqlBulkCopy class knows how fast and powerful it is. It beats other mechanisms for pumping large quantities of data into a SQL Server database by huge factors, and one of the reasons it is so fast is that it performs minimal logging.

The lack of logging definitely speeds things up, but when you are pumping hundreds of thousands of rows and suddenly have a failure on one of them because of a constraint, you're stuck. All the SqlException will tell you is that something went wrong with a given constraint (you'll at least get the constraint's name), but that's about it. You then have to go back to your source, run separate SELECT statements against it (or search manually), and find the culprit rows on your own.

On top of that, the process can become long and iterative if your data contains several potential failures, because SqlBulkCopy stops as soon as the first failure is hit. Once you correct that one, you need to rerun the load to find the second error, and so on.

The approach described in this article has the following advantages: 

  • Reports all possible errors that the SqlBulkCopy would encounter
  • Reports all culprit data rows, along with the exception each row causes
  • The entire thing is run in a transaction that is rolled back at the end, so no changes are committed.

... and disadvantages:

  • For extremely large amounts of data it might take a couple of minutes.
  • This solution is reactive; i.e., the errors are not returned as part of the exception raised by your SqlBulkCopy.WriteToServer() call. Instead, this helper method is executed after the exception is raised to try and capture all possible errors along with their related data. This means that in the case of an exception, your process will take longer than the bulk copy alone.
  • You cannot reuse the same DataReader object from the failed SqlBulkCopy, as readers are forward-only streams that cannot be reset. You'll need to create a new reader of the same type (e.g., re-issue the original SqlCommand, recreate the reader from the same DataTable, etc.).
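For instance, if the original load was fed from a DataTable, getting a fresh reader for the second pass is a single call. The sketch below illustrates this (the names `sourceTable`, `connectionString`, and `"MyTable"` are placeholders, and `GetBulkCopyFailedData` is the helper method presented later in this article):

```csharp
// Sketch: recreating a reader for the diagnostic pass.
// sourceTable, connectionString and "MyTable" are placeholder names.
DataTable sourceTable = new DataTable();
// ... fill sourceTable with the data to load ...

try
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "MyTable";
            // this reader is spent once WriteToServer has consumed it
            bulkCopy.WriteToServer(sourceTable.CreateDataReader());
        }
    }
}
catch (SqlException)
{
    // create a brand new reader over the same data for the second pass
    string errors = GetBulkCopyFailedData(
        connectionString, "MyTable", sourceTable.CreateDataReader());
}
```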


The main idea is quite simple. Rerun the bulk copy, but only process one row at a time. As the rows are processed, capture the individual exception (if any) that copying each row raises and append both the message and the row's data to a cumulative error log, but don't stop copying the data to the server. When all is said and done, your final error message is a nice log showing all the issues and the data that caused them. From that point it's easy to go back to the source, find those records, fix the issues, and then reissue the bulk copy.

Using the code

It's important to note that not all failures on a bulk copy happen because of data. You might have connectivity issues, authentication failures, timeouts, etc. None of these cases would be explained by your data, so there's no point in calling this helper method if this is your case. You need to take this into account when calling the helper method, and only call it for specific types of exceptions (the sample code below takes care of this).

Also consider that the exception you're catching may not necessarily be the one raised by SQL Server; it could be contained within an inner exception. So if you plan on calling the helper method only when a data-related issue occurred, the exception (and all its inner exceptions) needs to be inspected. The sample code below takes care of this, even though here the exception comes directly from the server; in your case, you might be handling it at a higher level after it has been wrapped in other exceptions.

Test bulk copy method

TestMethod() below is a simple method that sets up a bulk copy operation and encloses it in a try/catch block. It is this bulk copy that supposedly fails because of some data issue, so within the catch block we check the exception (and all inner exceptions) for a message containing the word "constraint" (which is apparently the only way to detect a constraint failure, as all exceptions from SQL Server are of type SqlException). If such an exception message is found, we call GetBulkCopyFailedData() to get the failed rows. This latter method would ideally reside in a separate helper-type class.

Granted, this checking could have been done within the helper, but I was trying to keep it generic enough that it would report all exceptions rather than assume what the caller wanted to filter out.

private void TestMethod()
{
   SqlConnection connection = null;
   SqlBulkCopy bulkCopy = null;
   DataTable dataTable = new DataTable();

   // load some sample data into the DataTable
   IDataReader reader = dataTable.CreateDataReader();

   try
   {
      connection = new SqlConnection("connection string goes here ...");
      connection.Open();
      bulkCopy = new SqlBulkCopy(connection);
      bulkCopy.DestinationTableName = "Destination table name";
      bulkCopy.WriteToServer(reader);
   }
   catch (Exception exception)
   {
      // loop through all inner exceptions to see if any relate to a constraint failure
      bool dataExceptionFound = false;
      Exception tmpException = exception;
      while (tmpException != null)
      {
         if (tmpException is SqlException
            && tmpException.Message.Contains("constraint"))
         {
            dataExceptionFound = true;
            break;
         }
         tmpException = tmpException.InnerException;
      }

      if (!dataExceptionFound)
      {
         throw;
      }

      // call the helper method to document the errors and invalid data
      string errorMessage = GetBulkCopyFailedData(
         connection.ConnectionString,
         bulkCopy.DestinationTableName,
         dataTable.CreateDataReader());
      throw new Exception(errorMessage, exception);
   }
   finally
   {
      if (connection != null && connection.State == ConnectionState.Open)
      {
         connection.Close();
      }
   }
}
Documenting the errors and faulty data rows

GetBulkCopyFailedData() then opens a new connection to the database, creates a transaction, and begins bulk copying the data one row at a time. It does so by reading through the supplied DataReader and copying each row into an empty DataTable. The DataTable is then bulk copied into the destination database, and any exceptions resulting from this are caught, documented (along with the DataRow that caused it), and the cycle then repeats itself with the next row.

At the end of the DataReader we rollback the transaction and return the complete error message. Fixing the problems in the data source should now be a breeze.

/// <summary>
/// Build an error message with the failed records and their related exceptions.
/// </summary>
/// <param name="connectionString">Connection string to the destination database.</param>
/// <param name="tableName">Table name into which the data will be bulk copied.</param>
/// <param name="dataReader">DataReader to bulk copy.</param>
/// <returns>Error message with failed constraints and invalid data rows.</returns>
public static string GetBulkCopyFailedData(
   string connectionString,
   string tableName,
   IDataReader dataReader)
{
   StringBuilder errorMessage = new StringBuilder("Bulk copy failures:" + Environment.NewLine);
   SqlConnection connection = null;
   SqlTransaction transaction = null;
   SqlBulkCopy bulkCopy = null;
   DataTable tmpDataTable = new DataTable();

   try
   {
      connection = new SqlConnection(connectionString);
      connection.Open();
      transaction = connection.BeginTransaction();
      bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.CheckConstraints, transaction);
      bulkCopy.DestinationTableName = tableName;

      // create a DataTable with the layout of the source data
      DataTable dataSchema = dataReader.GetSchemaTable();
      foreach (DataRow row in dataSchema.Rows)
      {
         tmpDataTable.Columns.Add(new DataColumn(
            row["ColumnName"].ToString(),
            (Type)row["DataType"]));
      }

      // create an object array to hold the data being transferred into tmpDataTable
      // in the loop below
      object[] values = new object[dataReader.FieldCount];

      // loop through the source data
      while (dataReader.Read())
      {
         // clear the temp DataTable from which the single-record bulk copy will be done
         tmpDataTable.Rows.Clear();

         // get the data for the current source row
         dataReader.GetValues(values);

         // load the values into the temp DataTable
         tmpDataTable.LoadDataRow(values, true);

         // perform the bulk copy of the one row
         try
         {
            bulkCopy.WriteToServer(tmpDataTable);
         }
         catch (Exception ex)
         {
            // an exception was raised with the bulk copy of the current row.
            // The row that caused it is the only one in the temp DataTable,
            // so document it and add it to the error message.
            DataRow faultyDataRow = tmpDataTable.Rows[0];
            errorMessage.AppendFormat("Error: {0}{1}", ex.Message, Environment.NewLine);
            errorMessage.AppendFormat("Row data: {0}", Environment.NewLine);
            foreach (DataColumn column in tmpDataTable.Columns)
            {
               errorMessage.AppendFormat(
                  "\tColumn {0} - [{1}]{2}",
                  column.ColumnName,
                  faultyDataRow[column.ColumnName],
                  Environment.NewLine);
            }
         }
      }
   }
   catch (Exception ex)
   {
      throw new Exception(
         "Unable to document SqlBulkCopy errors. See inner exceptions for details.",
         ex);
   }
   finally
   {
      // roll back so no changes are committed
      if (transaction != null)
      {
         transaction.Rollback();
      }
      if (connection != null && connection.State != ConnectionState.Closed)
      {
         connection.Close();
      }
   }

   return errorMessage.ToString();
}


I've certainly wasted more than enough time trying to figure out what was wrong with my data because the bulk copy operation wouldn't tell me, so I hope this saves someone else that time as well.

As always - comments, questions and suggestions are always welcome. And please don't forget to vote! 


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

David Catriel
Team Leader
Canada Canada
A developer that's been tinkering with computers since he first laid eyes on his buddy's Atari in the mid 80's and messed around with GWBasic and Logo. He now divides his time among his wife, kids, and evil mistress (a term lovingly [ahem...] given to his computer by the wife ...).
