Hi Guys,
I'm trying to insert multiple rows into a database. Rather than looping through all records and inserting them one at a time, I'm hoping to be more efficient and do a bulk insert (i.e. one statement to send all data in one go). Do you know if there's a way to do this?
FYI: I'm hoping to use the DAAB (Data Access Application Block) provided by EntLib (MS Enterprise Library 4.1). I can't take advantage of the SqlBulkCopy class as my destination DB is Sybase ASE 12.5, though my source is SQL Server 2008. Ideally I'd like the solution to be non-DB-specific in case we need to reuse it for communication with other systems' DBs.
Example / PoC code so far is included below. Here I'm assuming that the source system has a stored procedure which returns the data in the same format as it appears in the table to which it will be uploaded in the destination system (i.e. column names, types and order are the same).
using System.Data;
using Microsoft.Practices.EnterpriseLibrary.Data;

class Program
{
    const CommandType COMMAND_TYPE = CommandType.StoredProcedure;
    const string COMMAND_TEXT = "MyStoredProc";
    const string SOURCE_TABLE = "Table";
    const string DEST_TABLE = "Customers";
    const string SOURCE_DB = "Source";
    const string DEST_DB = "Destination";

    static void Main(string[] args)
    {
        Database source = DatabaseFactory.CreateDatabase(SOURCE_DB);
        Database destination = DatabaseFactory.CreateDatabase(DEST_DB);

        // Pull everything from the source in one call...
        using (DataSet ds = source.ExecuteDataSet(COMMAND_TYPE, COMMAND_TEXT))
        {
            // ...then (ideally) push it to the destination in one call.
            // UploadLoadDataSet is the method I'm hoping exists (or can be
            // written) - it's not part of the actual DAAB Database API.
            destination.UploadLoadDataSet(DEST_TABLE, ds, SOURCE_TABLE);
        }
    }
}
The reason I'm hoping to minimise the number of calls being made is that this application will be running in the UK, pulling data from a DB here, but sending it over the WAN to a Sybase DB in France. There will be a few million rows of data. Each individual call to the DB would add another round trip, so would severely impact performance. My backup plan is to use scripts to perform BCP exports & imports on each side, but that makes catching exceptions a little harder and seems a little hacky.
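For illustration, if a true single-statement upload isn't possible, the fallback I have in mind is batching: concatenating several parameterised INSERT statements into one command text so each round trip carries N rows instead of one. This is only a sketch - the table name, batch size and "@" parameter prefix are assumptions (the prefix in particular depends on the ADO.NET provider used for Sybase), and it only uses DAAB calls I believe exist (GetSqlStringCommand, AddInParameter, ExecuteNonQuery):

```csharp
using System;
using System.Data;
using System.Data.Common;
using System.Linq;
using System.Text;
using Microsoft.Practices.EnterpriseLibrary.Data;

class BatchUploader
{
    // Send rows in batches: one round trip per batch instead of per row.
    public static void Upload(Database destination, DataTable table, int batchSize)
    {
        string[] columnNames = table.Columns.Cast<DataColumn>()
            .Select(c => c.ColumnName).ToArray();

        for (int start = 0; start < table.Rows.Count; start += batchSize)
        {
            int count = Math.Min(batchSize, table.Rows.Count - start);
            var sql = new StringBuilder();

            // Command text is built up below, so start with a placeholder.
            using (DbCommand cmd = destination.GetSqlStringCommand(" "))
            {
                for (int i = 0; i < count; i++)
                {
                    DataRow row = table.Rows[start + i];

                    // Unique parameter names per row within the batch.
                    string[] paramNames = columnNames
                        .Select(c => "@" + c + i).ToArray();

                    sql.AppendFormat("INSERT INTO Customers ({0}) VALUES ({1}) ",
                        string.Join(", ", columnNames),
                        string.Join(", ", paramNames));

                    for (int c = 0; c < columnNames.Length; c++)
                        destination.AddInParameter(cmd, paramNames[c],
                            DbType.Object, row[c]);
                }

                cmd.CommandText = sql.ToString();
                destination.ExecuteNonQuery(cmd); // one round trip for N rows
            }
        }
    }
}
```

With a few million rows I'd still expect this to be much slower than BCP, but it stays inside ADO.NET/DAAB so exceptions and transactions are easier to handle than with external scripts.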
Thanks in advance for your help,
JB