Hello Friends,

I am currently facing a DataTable loading issue. My query executes quickly but returns 50,000 records, and when I trace my code, it shows that filling the DataTable is what takes the time. I want to store the data in a DataTable and export it to Excel, but the load is taking too long. Please share a solution if anyone has one.

Note: I am fetching this much data because the client needs it in Excel.

Thanks,
Chintan Rana

What I have tried:

My code is below:
-----------------
C#
SqlCommand cmd = new SqlCommand();
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = "sp_web_reports";
cmd.Parameters.AddWithValue("@mode", 50);
cmd.Parameters.AddWithValue("@From_Date", Convert.ToDateTime(txt_from_date.Text));
cmd.Parameters.AddWithValue("@TO_Date", Convert.ToDateTime(txt_to_date.Text));
cmd.Parameters.AddWithValue("@GroupCode", drp_vehicle_group.SelectedValue);
conn.Open();
cmd.Connection = conn;
cmd.CommandTimeout = 0;
SqlDataReader reader = cmd.ExecuteReader();
DataTable dt = new Good_Issue_Vehicle_Report.dt_GoodIssueVehicleDataTable();
dt.Load(reader);   // <----- Taking too much time
conn.Close();      // the connection must stay open until the reader has been consumed
if (dt.Rows.Count > 0)
{
    // Export code to Excel
}
Comments
OriginalGriff 9-Jul-18 5:53am    
Show us the relevant code fragments - we have no idea what your code is doing, or what you consider "fast" or "slow".
Member 13165845 9-Jul-18 6:28am    
Done, please check my question above again.
Kornfeld Eliyahu Peter 9-Jul-18 5:56am    
Are you sure you need 50,000 records in a data structure that was mainly created to connect to a UI? How exactly will the end user handle that much data?
Member 13165845 9-Jul-18 6:29am    
Yes, I need this many records, and in the future the count will increase. Please guide me on how to manage it. My main purpose is to export this data to Excel for my client.
Er. Puneet Goel 9-Jul-18 6:32am    
Please answer this: is your SP 'sp_web_reports' using a dynamic query? Just provide us with some information.

Also, please tell us how long the same query takes in SSMS.

There is not too much you can do here as long as you use a DataTable, which is a disconnected structure (meaning it loads all the data from SQL in one loop)...
As it seems your final goal is to export to Excel (CSV?), you could use a DataReader (you already have one), loop over it, and export the data record by record... In that case you may gain a huge performance improvement...
Also, consider NOT bringing anything unnecessary back from SQL - it takes up more memory!
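For illustration, a minimal sketch of that record-by-record idea, reusing the command from the question (the output path is a placeholder, and the naive comma join would need proper CSV quoting/escaping before real use):

C#
// Stream rows from the reader straight into a CSV file instead of
// buffering all 50,000 of them in a DataTable first.
using (SqlDataReader reader = cmd.ExecuteReader())
using (System.IO.StreamWriter writer = new System.IO.StreamWriter(@"C:\temp\report.csv"))
{
    // Header row taken from the reader's own schema
    string[] names = new string[reader.FieldCount];
    for (int i = 0; i < reader.FieldCount; i++)
        names[i] = reader.GetName(i);
    writer.WriteLine(string.Join(",", names));

    // Only one record is held in memory at a time
    object[] values = new object[reader.FieldCount];
    while (reader.Read())
    {
        reader.GetValues(values);
        writer.WriteLine(string.Join(",", values)); // add quoting/escaping as needed
    }
}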
Try it this way:

C#
SqlConnection  conn      = null;
SqlCommand     cmd       = null;
SqlDataAdapter adapter   = null;
DataSet        dataset   = null;
DataTable      dataTable = null;

try
{
    using (conn = new SqlConnection("connection string"))
    {
        conn.Open();
        using (cmd = new SqlCommand("sp_web_reports", conn) { CommandType = CommandType.StoredProcedure })
        {
            cmd.Parameters.AddWithValue("@mode",      50);
            cmd.Parameters.AddWithValue("@From_Date", Convert.ToDateTime(txt_from_date.Text));
            cmd.Parameters.AddWithValue("@TO_Date",   Convert.ToDateTime(txt_to_date.Text));
            cmd.Parameters.AddWithValue("@GroupCode", drp_vehicle_group.SelectedValue);
            using (adapter = new SqlDataAdapter(cmd))
            {
                dataset = new DataSet();
                adapter.Fill(dataset);
                dataTable = dataset.Tables[0];
            }
        }
    }
}
catch (Exception ex)
{
    // handle the exception
}

// if you get here without an exception, you can work with your dataTable object.


The using statement has the benefit of cleaning up the adapter, command, and connection objects without writing code to specifically handle that. Furthermore, you should wrap ALL ADO code with try/catch blocks and handle the exceptions appropriately. Defining the various objects before the first using statement allows you to examine them in the debugger in the event an exception is raised.

If you want to export to Excel, all you have to do is call dataTable.WriteXml("filename"), and Excel 2007+ can read it.
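For reference, an illustrative call (the table name and file path are mine, not from the original post; note that a standalone DataTable must have a TableName set before WriteXml will serialize it, while a table pulled from a DataSet already has one):

C#
// Serialize the filled table to XML; Excel 2007+ can open the file.
dataTable.TableName = "GoodIssueVehicleReport"; // optional: a friendlier element name
dataTable.WriteXml(@"C:\temp\report.xml");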
Comments
Kornfeld Eliyahu Peter 9-Jul-18 7:29am    
Why do you think it will perform better?
As far as I can see (from the code), both DbDataAdapter.Fill and DataTable.Load do the same thing - use a reader...
#realJSOP 9-Jul-18 7:40am    
Just throwin' darts. I have no problems with lag doing it this way. Even if it doesn't work any *better* than his original code, at least he has better code that works at least as well. The 1-vote is not appropriate.
Kornfeld Eliyahu Peter 9-Jul-18 7:43am    
No doubt your code is much more readable... I was just curious - you may have a secret here... Too bad... Maybe OP will try and confirm - or not :-)
#realJSOP 9-Jul-18 7:47am    
He would be remiss in not trying it. He has nothing to lose. I've only been a programmer for 40 years, so what do I know? Furthermore, we still haven't seen his stored proc, and that's probably where the problem is.
[no name] 9-Jul-18 19:29pm    
Why would your solution alone deserve a downvote?
As far as I can see:
1) You forgot to say how long 'long' is.
2) The way your stored procedure joins the two tables has an impact on execution time, but you didn't show that procedure.
3) The database structure and indexes matter too, but they are a mystery as well.

The bad news: there is no secret, super-fast magic command that will give you an instant result for 50k records.
Only a correct database design and a correct query will minimize the run time; every mistake will dramatically degrade performance.
Have you reviewed the indexing on the tables in question?
Does your SQL statement have a "select *" in it?

c# - DataTable Load very slow - Stack Overflow[^]
c# - SQL Query that runs fine in SSMS runs very slow in ASP.NET - Stack Overflow[^]
The DataTable is inherently slow to begin with. There's not much you can do to make it go faster.

The problem may be that your query isn't optimized. We don't know, because we can't see it in what you posted.

But you also said the 50,000 records you're dealing with could grow. Using a DataTable imposes a practical limit on the number of records you can handle: you're loading the entire result set into memory at once, and the machine's RAM is finite. Loading that many records into memory is a really bad idea.

Since you're exporting the data to Excel, don't use a DataTable. Use a DataReader instead. That holds only one record in memory at a time. Load a record, export it to your Excel sheet, load the next one, and repeat until done. Then you have no memory limits to deal with.
You should be using Excel's "Data" feature to import directly into Excel from SQL Server.

Too many intermediate steps otherwise.

Or export from SQL Server as CSV; open in Excel.

etc.
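For instance, a hedged sketch using the bcp utility that ships with SQL Server (the server name, database, columns, and view are placeholders):

bcp "SELECT Col1, Col2 FROM MyDatabase.dbo.MyReportView" queryout report.csv -c -t, -S MyServer -T

Here -c writes character data, -t, sets a comma as the field terminator, and -T uses Windows authentication; the resulting CSV opens directly in Excel.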