I have two tables in SQL Server 2008. One is tempTable (28 columns), which contains 1.7 million rows and has an identity column named RowID. The other is BaseTable (52 columns). I now have to insert all the rows from tempTable into BaseTable with modified values. I did it like the following ...
SQL
declare @i int
select @i = MAX(RowID) from tempTable

declare @a varchar(10), @b datetime, @c char(10), @d bit, @e int = 0 -- ... and so on

while (@i > 0)   -- count down from the highest RowID to 1
BEGIN
    select @a = Col1, @b = Col2, @c = Col3, @d = Col4, @e = Col5 -- ... and so on
    from tempTable where RowID = @i

    set @a += 'Hello'
    set @b = datediff(...)
    if (@e > 10000)
    begin
        set @d = 1
    end
    set @e = cast(Col9 as int) * 1.1
    -- ... and so on (thus I have to edit all the values)

    insert into BaseTable (Col2, Col3, Col6, ......)
    values (@c, @a, @d, @b, @e, ........)

    set @i -= 1
END

But the insert takes a long time.
Is there any other way to update tempTable and then insert into BaseTable more efficiently?
Thanks in advance.

If you are using tempTable only for temporary storage, try using an in-memory temporary table instead of tempTable. It might give you better performance.
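As a rough sketch of that suggestion, assuming the link refers to a table variable and using placeholder column names rather than your real 28-column schema:

SQL
-- Hypothetical sketch: a table variable as the staging area instead of a persisted tempTable.
-- Column names and types are taken from the loop above; substitute your actual columns.
declare @staging table
(
    RowID int identity(1,1) primary key,
    Col1  varchar(10),
    Col2  datetime,
    Col3  char(10),
    Col4  bit,
    Col5  int
    -- ... remaining columns
);

insert into @staging (Col1, Col2, Col3, Col4, Col5 /* , ... */)
select Col1, Col2, Col3, Col4, Col5 /* , ... */
from   tempTable;   -- or directly from whatever source currently fills tempTable

For a row count this large it is worth benchmarking this against a #temp table as well, since table variables carry no statistics.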
 
Take a look at Inserting Rows by Using INSERT and SELECT Subqueries, which should help you get rid of the loop. Also, as Abhinav suggested, try to use an in-memory temporary table instead of tempTable.
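As a rough illustration of what the set-based version might look like, with the expressions below reconstructed from the loop in the question rather than your real transformation rules:

SQL
-- One INSERT ... SELECT replaces the 1.7-million-iteration loop; SQL Server can then
-- process the whole batch at once instead of paying per-row overhead each time.
insert into BaseTable (Col2, Col3, Col6 /* , ... remaining target columns */)
select
    t.Col3,                                            -- was @c
    t.Col1 + 'Hello',                                  -- was @a  (set @a += 'Hello')
    case when t.Col5 > 10000 then 1 else t.Col4 end    -- was @d  (the if (@e > 10000) branch)
    /* ... one expression per remaining target column, e.g. cast(t.Col9 as int) * 1.1 ... */
from tempTable as t;

Any per-row IF/ELSE logic maps to a CASE expression, and the per-row SET arithmetic simply becomes an expression in the SELECT list.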
 