Hi Guys,

I'm trying to insert multiple rows into a database. Rather than looping through all records and inserting them one at a time, I'm hoping to be more efficient and do a bulk insert (i.e. one statement to send all data in one go). Do you know if there's a way to do this?
FYI: I'm hoping to use the DAAB (Data Access Application Block) provided by EntLib (MS Enterprise Library 4.1). I can't take advantage of SqlBulkCopy as my destination DB is Sybase ASE 12.5, though my source is SQL Server 2008. Ideally I'd like the solution to be non-DB-specific in case we need to reuse it to communicate with other systems' DBs.

Example / PoC code so far is included below. Here I'm assuming that the source system has a stored procedure which returns the data in the same format as it appears in the table to which it will be uploaded in the destination system (i.e. column names, types and order are the same).

C#
using System.Data;
using Microsoft.Practices.EnterpriseLibrary.Data;

class Program
{
    const CommandType COMMAND_TYPE = CommandType.StoredProcedure;
    const string COMMAND_TEXT = "MyStoredProc"; //pulls back data in the format expected by the destination table
    const string SOURCE_TABLE = "Table";        //tables returned by stored procs are always called Table
    const string DEST_TABLE = "Customers";      //generic example table name
    const string SOURCE_DB = "Source";
    const string DEST_DB = "Destination";

    static void Main(string[] args)
    {
        Database source = DatabaseFactory.CreateDatabase(SOURCE_DB);
        Database destination = DatabaseFactory.CreateDatabase(DEST_DB);
        using (DataSet ds = source.ExecuteDataSet(COMMAND_TYPE, COMMAND_TEXT))
        {
            //begin made-up code to illustrate the API I'm after
            destination.UploadDataSet(DEST_TABLE, ds, SOURCE_TABLE);
            //end made-up code
        }
    }
}


The reason I'm hoping to minimise the number of calls is that this application will be running in the UK, pulling data from a DB here but sending it over the WAN to a Sybase DB in France. There will be a few million rows of data, and each individual call to the DB adds another round trip, so chatty access would severely impact performance. My backup plan is to use scripts to perform BCP exports and imports on each side, but that makes catching exceptions a little harder and seems a little hacky.

Thanks in advance for your help,

JB

What about sp_executesql (if available in Sybase)? You could combine all the insert statements into a single batch separated by ';' and call sp_executesql or something similar in Sybase. I'd guess every database has something along these lines.

Also, I'm not sure this is the best way to do what you need. Anyway, my 2 cents!
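A minimal sketch of that idea, assuming you already have the individual INSERT statements as strings (the statement text, class name, and batch size below are all illustrative; real code would build the INSERTs with proper escaping or parameters):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class InsertBatcher
{
    // Combine individual INSERT statements into ';'-separated batches so
    // each batch can be sent to the server in one call (e.g. via exec()
    // on Sybase, or sp_executesql on SQL Server).
    public static IEnumerable<string> Combine(IReadOnlyList<string> statements, int batchSize)
    {
        for (int i = 0; i < statements.Count; i += batchSize)
            yield return string.Join("; ", statements.Skip(i).Take(batchSize));
    }
}
```

Each batch still costs one round trip, so the batch size trades the number of round trips against the size of each statement the server has to parse.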
 
Comments
JohnLBevan 5-May-11 6:06am    
Thanks Karthik. Unfortunately sp_executesql does not exist in Sybase, though exec() does. I had a play based on this idea and came up with some code to dynamically generate SQL to insert the data in batches, using single statements in the form:
insert destinationTable (col1, col2) select r1c1val, r1c2val union select r2c1val, r2c2val;
This works well for sending 100 records at a time, and gives a significant boost over sending the records one at a time (over 10 times faster). However, when I tried to send 1000 records at a time I received an error back from Sybase due to insufficient stack space, so clearly this approach uses up a lot of resources on the DB side.
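That batch generation can be sketched roughly like this (a simplified illustration: the `BuildBatch` name and `Customers` table are made up, and every value is emitted as a quoted string literal; production code would handle column types and parameterisation properly):

```csharp
using System;
using System.Data;
using System.Linq;
using System.Text;

static class BatchInsertBuilder
{
    // Build one "insert ... select ... union all select ..." statement
    // covering rows [start, start + batchSize) of the source table.
    public static string BuildBatch(DataTable table, int start, int batchSize, string destTable)
    {
        string cols = string.Join(", ",
            table.Columns.Cast<DataColumn>().Select(c => c.ColumnName));
        var sb = new StringBuilder($"insert {destTable} ({cols}) ");
        int end = Math.Min(start + batchSize, table.Rows.Count);
        for (int i = start; i < end; i++)
        {
            if (i > start) sb.Append(" union all ");
            // Quote every value as a string literal, doubling embedded quotes.
            string vals = string.Join(", ",
                table.Rows[i].ItemArray.Select(v => $"'{v.ToString().Replace("'", "''")}'"));
            sb.Append($"select {vals}");
        }
        return sb.ToString();
    }
}
```

Keeping the batch size modest (around the 100 rows that worked above) sidesteps the stack-space error seen at 1000 rows per statement.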
Would it be possible to simply transfer the table as a binary file using FTP or TCP/IP, and then let the remote machine do all the database work itself?

If you want some kind of recovery mechanism (in the event the transfer is interrupted), you could send smaller files and at least be able to restart the transfer from a known point.
 
Comments
JohnLBevan 5-May-11 6:10am    
Hey John, thanks for the suggestion. I like this idea, but feel it will be a little too complex. My aim is to keep the solution simple to make it easier to support and to resolve errors. The Sybase machine is hosted on Unix, so juggling binary files between Windows and that OS scares me; if something went wrong, I wouldn't know where to start looking for the issue. It's a good suggestion for anyone more comfortable with those technologies, though.
If you were going from SQL Server to SQL Server, you could use SQL Server Replication.
 
Comments
JohnLBevan 5-May-11 6:13am    
Thanks Silim. Sadly I have no control over the DB used in France, and there would be a huge refactoring exercise if they did change over. On a similar theme, we have played with the idea of setting up Sybase as a linked server, to allow a stored proc in SQL Server to send the data directly to Sybase. The issue there is that we only have x86 providers for Sybase, and our SQL Server install is x64, so it wouldn't be able to see the DSN.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


