Hi,
I am fetching 25 lakh (2,500,000) records from MySQL; the table has 5 columns.
The fetched records are exported to a CSV file.
For 6 or 7 lakh records the fetch and CSV export work fine,
but for more than 20 lakh records it throws a System.OutOfMemoryException.
I am using this code to export the records to CSV:
public static void ExportDataTableToCSV(DataTable dt, string FileName, string Heading, HttpResponse objResponse)
{
    System.Text.StringBuilder sb = new System.Text.StringBuilder();
    sb.AppendLine(Heading);

    // Header row from the column names.
    string[] columnNames = dt.Columns.Cast<DataColumn>()
                             .Select(column => column.ColumnName)
                             .ToArray();
    sb.AppendLine(string.Join(",", columnNames));

    // One CSV line per row; commas inside values are replaced with spaces.
    foreach (DataRow row in dt.Rows)
    {
        string[] fields = row.ItemArray
                             .Select(field => field.ToString().Replace(',', ' '))
                             .ToArray();
        sb.AppendLine(string.Join(",", fields));
    }

    // Headers must be set before the body is written.
    objResponse.ContentType = "text/csv";
    objResponse.AddHeader("content-disposition", "attachment; filename=" + FileName + ".csv");
    objResponse.Write(sb.ToString());
    objResponse.Flush();
    objResponse.End();
}
My issue is that I am not able to fetch a huge number of records, like 25 lakh, from MySQL.
Please suggest any other techniques for exporting the fetched records.
Is there an option to fill the data using JSON or some other technique? Please suggest some answers.
I am exporting about 90 MB of data from the database.
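One direction I am considering is a minimal sketch (assuming the MySql.Data connector and classic ASP.NET; the names CsvExporter, StreamQueryToCsv, and the 10,000-row flush interval are my own choices, not from the code above): stream rows straight from a MySqlDataReader to the response instead of buffering the whole result in a DataTable and a StringBuilder, so memory use stays flat regardless of row count.

```csharp
using System;
using System.Web;
using MySql.Data.MySqlClient;

public static class CsvExporter
{
    public static void StreamQueryToCsv(string connectionString, string query,
                                        string fileName, string heading,
                                        HttpResponse response)
    {
        // Set the headers before any body content is written.
        response.ContentType = "text/csv";
        response.AddHeader("content-disposition",
                           "attachment; filename=" + fileName + ".csv");
        response.BufferOutput = false; // send output as it is produced

        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(query, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                var writer = response.Output;
                writer.WriteLine(heading);

                // Header row from the result-set schema.
                var names = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    names[i] = reader.GetName(i);
                writer.WriteLine(string.Join(",", names));

                // One CSV line per row, written directly to the response;
                // commas inside values are replaced with spaces, as before.
                long rowCount = 0;
                while (reader.Read())
                {
                    var fields = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = Convert.ToString(reader.GetValue(i))
                                           .Replace(',', ' ');
                    writer.WriteLine(string.Join(",", fields));

                    // Flush periodically so no large buffer builds up.
                    if (++rowCount % 10000 == 0)
                        response.Flush();
                }
            }
        }
        response.Flush();
        response.End();
    }
}
```

With this shape, only one row is in memory at a time, so 25 lakh rows (about 90 MB of CSV) should not hit an OutOfMemoryException.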