I used to read files by calling File.ReadAllBytes, but I got an OutOfMemoryException when I read a large file.

So I tried FileStream and succeeded in reading a 700 MB file into a byte array.

C#
byte[] FileByte = null;
byte[] ResByte = null;

private void ReadFile(string FilePath)
{
    using (FileStream fs = new FileStream(FilePath, FileMode.Open, FileAccess.Read))
    {
        int length = (int)fs.Length;
        this.FileByte = new byte[length];
        int count;
        int sum = 0;

        // Keep reading until the whole file has been copied into FileByte.
        while ((count = fs.Read(this.FileByte, sum, length - sum)) > 0)
            sum += count;
    }
}

But when I tried the code below, I got another OutOfMemoryException.
C#
this.ResByte = new byte[this.FileByte.Length];


The length of FileByte is about 700,000,000, and I'm sure a .NET array can hold up to 2 GB of bytes. Moreover, the first byte array loaded successfully. So how come the second byte array throws an OutOfMemoryException?

An explanation will be helpful. :)

1 solution

Check the actual size of your file: any .NET object is restricted to 2 GB, yes, so you can't create a byte array larger than that, but a 700 MB array is not a problem under normal circumstances.

If the file is definitely 700 MB, then you need to look at how much memory your machine has, how much of it is available, and how much free space is on your HDD.

Try your code with a smaller file - one megabyte or similar - and see what happens. It may be that you are allocating more memory than can be provided.
I just tried your code with a 699 MB file and a 1.4 GB file: it worked fine with the 699 MB one, but failed with the 1.4 GB one, even when I tried this:
C#
FileInfo fi = new FileInfo(strFile);
FileByte = new byte[fi.Length];
I.e. without the stream involved.
It may be worth checking what target you are building for: x86 will have a smaller "object size" limit than 64-bit applications.
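If you want to confirm at run time how the process is actually being loaded, a quick check (assuming .NET 4.0 or later for Environment.Is64BitProcess) is:
C#
// Prints whether this process is running as 64-bit and the pointer size in bits.
Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
Console.WriteLine("Pointer size: " + (IntPtr.Size * 8) + " bits");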

I would suggest that you need to look at doing this in chunks, rather than reading it as a single massive lump anyway.
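As a rough sketch of what I mean (this assumes the usual System.IO using; ProcessChunk is just a placeholder for whatever you do with each block, and the buffer size is arbitrary):
C#
// Process a file in fixed-size chunks so the whole file never has to fit in memory.
private void ReadFileInChunks(string filePath)
{
    const int chunkSize = 64 * 1024; // 64 KB per read; tune as needed
    byte[] buffer = new byte[chunkSize];

    using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
    {
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Only the first 'read' bytes of the buffer are valid on this pass.
            ProcessChunk(buffer, read);
        }
    }
}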
Comments
Midnight Ahri 31-May-14 9:33am    
Woah, I checked my free memory in Task Manager.
1.8 GB was free, and after the first byte array loaded, 1.0 GB was left.
And then an exception was thrown.

I have a question: I'm doing file encryption and it's already done, except for large files.
The steps are: read the file bytes into an array, then during encryption write the bytes one by one into another array until it's complete, and finally begin the write process. In this case I'm sure I have to load those bytes into memory, and I have no choice except to limit the file size, or else I'd have to start the writing process while encryption is still in progress. Is that possible?
OriginalGriff 31-May-14 10:01am    
I do encryption in blocks (partly because it means you don't have an upper file size limit at all, partly because it removes the unencrypted data from memory as you work your way through it, partly because it gets the encryption started faster, and partly because it generates "block"-sized output files which give less useful info to the "bad guys").

Most (if not all) of the encryption methods in .NET use streams for input and output anyway, so there is no real need to read the file into a buffer at all (because they use blocks as well!)
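For example, a minimal stream-to-stream sketch using the built-in AES classes (System.IO and System.Security.Cryptography usings assumed, .NET 4.0+ for Stream.CopyTo; the method name is just illustrative and key/IV handling is simplified):
C#
// Encrypts inputPath to outputPath block by block; the whole file is never buffered.
static void EncryptFile(string inputPath, string outputPath, byte[] key, byte[] iv)
{
    using (Aes aes = Aes.Create())
    {
        aes.Key = key;
        aes.IV = iv;

        using (FileStream input = File.OpenRead(inputPath))
        using (FileStream output = File.Create(outputPath))
        using (CryptoStream crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            // CopyTo pushes the data through the encryptor in small blocks.
            input.CopyTo(crypto);
        }
    }
}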

Are you writing your own encryption methods?
Midnight Ahri 31-May-14 10:24am    
Yes I am writing my own encryption methods.
You mean pass the file stream directly to the .NET encryption methods?
All right then, I'll find a way to do it in blocks too.
Thank you very much for the information. :)
