I work on SQL Server 2012.

I have a temp table that receives data from Excel, and based on the data in that Excel file I insert into a table inside a loop.

The temp table always holds a large amount of data, at least 5,000, 10,000, or 15,000 rows.

On every iteration I need to insert the next 5,000 rows from the temp table.

So I need the best solution for that with regard to speed and memory.

And if anything in my logic is not correct, please tell me.

My query is below:


What I have tried:

SQL
Create Table [dbo].PartsData
(
    BatchID nvarchar(50) primary key,
    RowNumber int,
    GivenPartNumber nvarchar(50),
    GivenManufacturer nvarchar(100)
)

--select * from [dbo].PartsData
Create Table [dbo].[Type_ValidationInPut]
(
    RowNumber int,
    GivenPartNumber nvarchar(50),
    GivenManufacturer nvarchar(100)
)

create table #Temp(
    DocumentPartID int identity(1,1),
    CompanyName VARCHAR(4000),
    [AffectedProduct] NVARCHAR(4000),
    [ReplacementPart] VARCHAR(4000),
    [ReplacementCompany] VARCHAR(4000),
    [Category] VARCHAR(4000),

    DocumentID int null,
    CompanyID VARCHAR(4000) null,
    PartID int null,
    ReplacementPartID int null,
    CategoryID int null,
    [Status] VARCHAR(4000) null
)



insert into #Temp
(
    CompanyName,
    [AffectedProduct],
    [ReplacementPart],
    [ReplacementCompany],
    [Category]
)
values
('Nokia','RF1550','RF1550','HTS','HTS'),
('IPHONE','TF1545','TF1545','Corning Incorporated','HTS2')



DECLARE @MaxValue int = (select Max(DocumentPartID) from #Temp)
DECLARE @Currentindex int = 0

DECLARE @Rows [dbo].[Type_ValidationInPut];

while @Currentindex < @MaxValue
begin
    DELETE @Rows

    INSERT INTO @Rows
    (
        RowNumber,
        GivenPartNumber,
        GivenManufacturer
    )
    select TOP 5000 DocumentPartID, isnull(AffectedProduct,''), isnull(CompanyName,'')
    FROM #Temp
    where (CategoryID = 517884 or CategoryID = 1110481)
      and (DocumentPartID > @Currentindex) and [Status] is null

    INSERT INTO @Rows
    (
        RowNumber,
        GivenPartNumber,
        GivenManufacturer
    )
    select TOP 5000 DocumentPartID, isnull(substring(ReplacementPart,0,70),''), isnull(ReplacementCompany,'')
    FROM #Temp
    where (DocumentPartID > @Currentindex) and [Status] is null and ReplacementPart is not null

    DECLARE @NewID nVARCHAR(4000) = newID()

    insert into [dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
    SELECT @NewID, 0, GivenPartNumber, GivenManufacturer from @Rows

    set @Currentindex = @Currentindex + 5000
    DELETE @Rows
end

The first thing that stood out to me was a poor choice of variable declaration.
I see NO valid reason for this, as @NewID will always be the same size; you would be better off with NCHAR(36) (a GUID formatted as a string is 36 characters), or better still the native UNIQUEIDENTIFIER type.
SQL
DECLARE @NewID nVARCHAR(4000) =newID()
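A minimal fix (my suggestion, not part of the original post) is to keep the value in its native type and let SQL Server convert it only where a string is actually required:
SQL
-- Store the GUID in its native 16-byte type instead of a 4000-character string
DECLARE @NewID uniqueidentifier = NEWID();

-- If PartsData.BatchID must stay nvarchar(50), the implicit conversion on
-- insert is harmless; NCHAR(36) also works when a string type is required.
INSERT INTO [dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
SELECT @NewID, 0, GivenPartNumber, GivenManufacturer FROM @Rows;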
Naturally to try and help you out I copied/pasted your code into SSMS and started with reformatting.

Your #Temp table itself is asking for memory problems. Did you know SQL Server has a row-size limit of 8,060 bytes? Once you cross that threshold it takes multiple pages of memory to hold each row, and as it sits right now your table can exceed 32,000 bytes per row.
Do you really need to define every non-integer column as 4,000 characters long?
SQL
create table #Temp(
   DocumentPartID      int identity(1,1),
   CompanyName         VARCHAR(4000),
   AffectedProduct     NVARCHAR(4000),
   ReplacementPart     VARCHAR(4000) , 
   ReplacementCompany  VARCHAR(4000) ,
   Category            VARCHAR(4000) ,
   
   DocumentID         int null,  
   CompanyID          VARCHAR(4000) null,
   PartID             int null,
   ReplacementPartID  int null,
   CategoryID         int null,
   [Status]           VARCHAR(4000) null
)
Once I get it all formatted nicely into something readable, I notice a few more things:

1. Either there is some code missing, or your first INSERT statement will never fire, as CategoryID is never populated.

2. The first and last statements in your loop both delete @Rows. That only needs to be done once per iteration.

If this routine is going to be used often, I would recommend creating an SSIS (ETL: Extract-Transform-Load) package for it. It will have a much lower impact on your SQL Server's resources, operate much faster, and can easily process billions of records.
 
 
Comments
ahmed_sa 13-Feb-20 1:35am    
OK, and what about the SELECT TOP 5000 inside the loop —
is that OK, or does it have an issue?
ahmed_sa 13-Feb-20 4:17am    
Can I write my query above without using a WHILE loop?
MadMyche 13-Feb-20 7:04am    
Of course it's possible, but that is on you.
You need to figure out what you really need for a design and fine-tune from there.
I have neither all of the information nor the time to do what may need to be done here.
ahmed_sa 13-Feb-20 7:10am    
I updated my full script, so can you please help me, if possible?
Is there any solution without using a WHILE loop?
Jörgen Andersson 13-Feb-20 7:58am    
They are varchars, so they don't actually take more space than needed.
That said, one should never define a column larger than necessary, as the limit is part of the sanity check on the data.
One way of doing batch inserts:
SQL
DECLARE @Batchsize INT = 5000;
DECLARE @RowCount INT = @Batchsize;
DECLARE @LastID INT;

WHILE @RowCount = @Batchsize
BEGIN
    SET @LastID = (
            SELECT ISNULL(max(MyTableID), - 1)
            FROM MyTable
            );

    INSERT INTO MyTable (MyTableID,OtherColumns)
    SELECT  TOP (@Batchsize) 
            MyTableID
           ,OtherColumns
    FROM    TheSourceTable
    WHERE   MyTableID > @LastID
    ORDER BY MyTableID;

    SET @RowCount = @@ROWCOUNT;
END;
Adjust as necessary.
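Applied to the tables in the question, a sketch might look like the following. Two assumptions are made here: the original PartsData declares BatchID alone as the primary key, which would reject a second row in the same batch, so a composite key on (BatchID, RowNumber) is assumed instead; and only the AffectedProduct/CompanyName branch of the original loop is shown.
SQL
DECLARE @Batchsize INT = 5000;
DECLARE @RowCount  INT = @Batchsize;
DECLARE @LastID    INT = 0;

WHILE @RowCount = @Batchsize
BEGIN
    DECLARE @NewID uniqueidentifier = NEWID();   -- one batch id per pass

    INSERT INTO dbo.PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
    SELECT TOP (@Batchsize)
           @NewID,
           DocumentPartID,                        -- carries the source key
           ISNULL(AffectedProduct, ''),
           ISNULL(CompanyName, '')
    FROM   #Temp
    WHERE  DocumentPartID > @LastID
    ORDER BY DocumentPartID;                      -- deterministic batches

    SET @RowCount = @@ROWCOUNT;

    -- RowNumber holds the source key, so the highest value just inserted
    -- tells us where the next batch starts
    SELECT @LastID = ISNULL(MAX(RowNumber), @LastID)
    FROM   dbo.PartsData
    WHERE  BatchID = @NewID;
END;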
 
 
Going on from Solution 1, and utilising an ETL process, you can use the Import Data task in the database:

right-click on the DB,

go to Tasks, and then choose Import Data. A Microsoft wizard appears; follow it through to Excel, and the Excel file can be mapped to a table.

See the link below:

Import and Export Data with the SQL Server Import and Export Wizard - SQL Server Integration Services (SSIS) | Microsoft Docs[^]
 
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
