This article identifies the places in an insert query where a database developer or administrator needs to pay attention, depending on the number of records, so that insert performance improves.
It also covers which settings to configure and which operations to use, and when to use them, to improve insert performance.
Guidelines for Insert Queries
Before writing an insert query, consider the following points, which depend on the number of records being inserted.
- Prefer a table-valued parameter (TVP) when the number of records is less than 1,000 rows.
- Use the Insert...Select format to insert records when the "Select" query does not have many joins.
- Use the TABLOCK hint on the table into which the insert script will run.
- Preferably use sp_executesql to run dynamic SQL inside a procedure that contains the insert script; sp_executesql avoids recompilation of the query and reuses the cached plan.
- Use a Common Table Expression (CTE) when the "Select" query contains many joins between tables.
- Set the recovery model to Bulk_Logged or Simple for minimal logging of the insert operation; this improves performance. Afterwards, reset the recovery model to "Full" or whatever it was before.
- When using a bulk insert query, use the IGNORE_CONSTRAINTS and TABLOCK hints to optimize it. IGNORE_CONSTRAINTS skips constraint checking during the bulk insert operation.
- Use the partition switch facility, as shown below, if you are working with partitioned tables on Enterprise Edition.
**** Remember: the partition switch query below will move data from source to target.
The target table should have no indexes and should be identical in structure to the source table. If no partitions have been created on these tables, running the partition switch query produces a warning message ("There is no partition created on these tables"), but it is only a warning and the data will still move to the target table.
Strictly speaking, the data itself does not move to the target table; instead, the partition of one table is reassigned to the other table, which is why the operation is very fast.
ALTER TABLE test_source SWITCH PARTITION 1 TO test_target PARTITION 1
-- Note: OPENROWSET(BULK ...) also requires a format file (or a SINGLE_BLOB/CLOB option) describing the data file
INSERT INTO Tab1 WITH (IGNORE_CONSTRAINTS, TABLOCK) (Col1, Col2, Col3)
SELECT * FROM OPENROWSET(BULK 'Data_File_path.dat', FORMATFILE = 'Format_File_path.xml') AS src
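The TVP and sp_executesql guidelines above can be combined in one pattern: load the rows into a table-valued parameter and pass it to a parameterized dynamic insert, so the plan is cached and reused instead of being recompiled for each set of literal values. The type, table, and column names below are hypothetical placeholders, a minimal sketch rather than a prescribed schema.

```sql
-- Hypothetical table type matching the target table's columns
CREATE TYPE dbo.RowsTvp AS TABLE (Col1 INT, Col2 VARCHAR(100));
GO

-- Fill the TVP (in practice, often populated from the application layer)
DECLARE @rows dbo.RowsTvp;
INSERT INTO @rows VALUES (1, 'A'), (2, 'B');

-- Parameterized dynamic SQL: sp_executesql reuses the cached plan
EXEC sp_executesql
    N'INSERT INTO dbo.Tab1 WITH (TABLOCK) (Col1, Col2)
      SELECT Col1, Col2 FROM @tvp;',
    N'@tvp dbo.RowsTvp READONLY',
    @tvp = @rows;
```

The TVP is passed READONLY (required for table-valued parameters), and the TABLOCK hint follows the bulleted guideline for insert targets.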
Scenario 1 - If there are many joins between tables, use the method below to insert records:
CREATE TABLE Test_Insert
(
    Col1 INTEGER IDENTITY(1,1),
    Col2 VARCHAR(100),
    CONSTRAINT pk_col1 PRIMARY KEY (Col1)
)
WITH CTE_Insert AS
(
    SELECT a.col1, b.col2, c.col3
    FROM test_insert a
    INNER JOIN table1 b [joining condition]
    INNER JOIN table2 c [join condition]
)
INSERT INTO Test_Insert WITH (TABLOCK)
SELECT col1, col2, col3 FROM CTE_Insert WHERE [condition if any]
Scenario 2 - When there are fewer than 1,000 records and few joins:
Any insert method will perform in roughly the same time, but prefer a plain Insert...Select without a CTE.
Insert into tab4 (col1,col2,col3)
Select col1,col2,col3 From Tab1 [inner/left/right join tab2 condition]
Scenario 3 - If you have the option of using Bulk Insert (this article assumes Enterprise Edition), go ahead with the Bulk Insert method.
Keep in mind that BULK INSERT loads records from a file, so a data file must exist that can be used to insert records into the table.
Moreover, under minimal logging the individual rows of a bulk insert operation are not written to the transaction log; only the operation itself (and its page allocations) is logged.
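Putting Scenario 3 together with the recovery-model guideline, a minimal sketch looks like the following. The database name, table name, file path, and terminators are hypothetical placeholders; adjust them to match your environment and data file.

```sql
-- Assumed database/table/file names. BULK_LOGGED enables minimal logging
-- for the bulk load; switch back to the previous model afterwards.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

BULK INSERT dbo.Tab1
FROM 'C:\data\Data_File_path.dat'
WITH (
    TABLOCK,                 -- table-level lock, required for minimal logging
    FIELDTERMINATOR = ',',   -- adjust to match the data file layout
    ROWTERMINATOR = '\n',
    BATCHSIZE = 100000       -- commit in batches to keep each transaction small
);

ALTER DATABASE MyDb SET RECOVERY FULL;  -- restore the earlier recovery model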