I fill a DataSet from a SQL query, and the DataSet contains a large DataTable.

DataSet dataSet1 = new DataSet();
SqlDataAdapter ndaGlobalClass = new SqlDataAdapter(Query, cn);
ndaGlobalClass.SelectCommand.CommandTimeout = 0; // disable the command timeout for the long-running query
cn.Open();
ndaGlobalClass.Fill(dataSet1);
cn.Close();
string s = JsonConvert.SerializeObject(dataSet1.Tables[0]); // OutOfMemoryException is thrown here


The query returns a large DataTable, and when I convert it to JSON by serializing, a System.OutOfMemoryException is thrown. How can I fix this problem? I need to serialize the large DataTable, and there is no circular-reference issue.
Posted
Comments
OriginalGriff 4-Apr-15 6:06am    
How big is the data table?

1 solution

If you are using a large data table and you are getting out of memory issues, it may well be that the size of the JSON string is just too big for .NET - there is a limit of 2GB on any single object in .NET, and since JSON is a text-based serialization a large table could well exceed that even if the "raw" data table is considerably less than that.

Try an experiment: find out how many rows the table holds, and modify your query to return only half that - SELECT TOP nnn should do it. Then see if you can convert that to JSON and, if so, how big the resulting string is. That should give you an idea of whether this is just getting a bit silly size-wise, and whether you might be better off finding a different way to transfer the data! :laugh:
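For example, something along these lines - a rough sketch only, where the connection string and the table name "YourLargeTable" are placeholders I've made up, and I'm assuming the same Json.NET JsonConvert call from your question:

using System;
using System.Data;
using System.Data.SqlClient;
using Newtonsoft.Json;

class JsonSizeExperiment
{
    static void Main()
    {
        // Hypothetical connection string and table name - substitute your own.
        string connectionString = "...";

        using (SqlConnection cn = new SqlConnection(connectionString))
        {
            cn.Open();

            // Find out how many rows the table holds...
            int totalRows;
            using (SqlCommand count = new SqlCommand("SELECT COUNT(*) FROM YourLargeTable", cn))
            {
                totalRows = (int)count.ExecuteScalar();
            }

            // ...then pull back only half of them.
            DataTable half = new DataTable();
            string halfQuery = "SELECT TOP " + (totalRows / 2) + " * FROM YourLargeTable";
            using (SqlDataAdapter da = new SqlDataAdapter(halfQuery, cn))
            {
                da.SelectCommand.CommandTimeout = 0;
                da.Fill(half);
            }

            // Serialize the half-sized table and see how big the string gets.
            string json = JsonConvert.SerializeObject(half);
            // Each char costs two bytes in memory, so around a billion chars already hits the 2GB object limit.
            Console.WriteLine("Rows: {0:N0}, JSON length: {1:N0} chars", half.Rows.Count, json.Length);
        }
    }
}

If even half the rows give you a string that is hundreds of millions of characters long, the full table is never going to fit in a single object.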
 
Comments
TS11 5-Apr-15 11:52am    
I have already experimented; the table has about 1,850,000 rows, and I need to pass the whole DataTable as the web service response. Yes, when the row count is low it works fine, but I am not able to figure out any different way to pass this as a web response :-(
OriginalGriff 5-Apr-15 12:01pm    
Nearly 2 million rows? And JSON conversion? I'd guess you are running past the 2GB limit.
How big was the JSON string when you tried with "only" the first 1,000,000 rows?

And seriously? You're passing 2 million rows of data around as a single lump? I'd break that into smaller sections - far, far smaller - and reassemble if needed. I have difficulty believing that there is anything using JSON that actually needs that much data in one go! :laugh:
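Just to illustrate what I mean by "smaller sections" - a sketch only, assuming SQL Server 2012 or later (for OFFSET/FETCH), Json.NET, and a made-up table "YourLargeTable" with a sortable key column "Id". The point is that each page is serialized straight to a stream, so the complete multi-gigabyte JSON string never has to exist in memory at once:

using System.Data;
using System.Data.SqlClient;
using System.IO;
using Newtonsoft.Json;

class ChunkedExport
{
    const int PageSize = 50000; // tune to taste

    static void ExportAsJsonPages(string connectionString, string outputPath)
    {
        using (SqlConnection cn = new SqlConnection(connectionString))
        using (StreamWriter writer = new StreamWriter(outputPath))
        using (JsonTextWriter json = new JsonTextWriter(writer))
        {
            cn.Open();
            JsonSerializer serializer = new JsonSerializer();
            json.WriteStartArray();

            for (int offset = 0; ; offset += PageSize)
            {
                // Pull back one page of rows at a time.
                DataTable page = new DataTable();
                using (SqlDataAdapter da = new SqlDataAdapter(
                    "SELECT * FROM YourLargeTable ORDER BY Id " +
                    "OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY", cn))
                {
                    da.SelectCommand.Parameters.AddWithValue("@offset", offset);
                    da.SelectCommand.Parameters.AddWithValue("@pageSize", PageSize);
                    da.SelectCommand.CommandTimeout = 0;
                    da.Fill(page);
                }

                if (page.Rows.Count == 0)
                    break;

                // Write this page straight to the stream instead of building one giant string.
                serializer.Serialize(json, page);
            }

            json.WriteEndArray();
        }
    }
}

The output is a JSON array of page-sized arrays, so whoever consumes it has to reassemble the pages - but the server never holds more than one page in memory at a time. The same idea works if you write to the web response stream instead of a file.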
Member 11069345 9-Apr-15 7:58am    
I broke it into smaller sections - split the single table into multiple tables, converted each table into a JSON string, and added them to a list. I can see the list when I run the web service, but when I call it from AJAX the memory error occurs again.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


