I need to append binary data to a file efficiently, because this happens at a very high rate (appending ~10 KB every 10-500 milliseconds). I am using a BinaryWriter to append, but I wonder whether, and how, I can prevent the file from becoming fragmented after appending many chunks of bytes to it.
Is there a way to pre-allocate the file's overall size while still appending only at the end of the existing data?

I'm looking for a cross-platform solution with .NET Core (C#).

What I have tried:

I have this code:
private static void AppendData(string filename, byte[] data)
{
    using (var fileStream = new FileStream(filename, FileMode.Append, FileAccess.Write, FileShare.None))
    using (var bw = new BinaryWriter(fileStream))
    {
        bw.Write(data);
    }
}
Updated 15-Jun-21 2:30am
Garth J Lancaster 15-Jun-21 7:46am
Can you use a circular queue to hold updates, combine 'n' updates into a larger buffer, and then write that?
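A minimal sketch of that idea, assuming a `ConcurrentQueue<byte[]>` as the buffer and a caller-driven `Flush` (the class name and flushing policy are illustrative, not from the original post):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Producers enqueue small chunks; a single writer periodically drains
// everything queued so far into one large append, reducing the number
// of tiny writes hitting the disk.
class BufferedAppender
{
    private readonly ConcurrentQueue<byte[]> _queue = new ConcurrentQueue<byte[]>();
    private readonly string _filename;

    public BufferedAppender(string filename) => _filename = filename;

    public void Enqueue(byte[] chunk) => _queue.Enqueue(chunk);

    // Call at intervals (e.g. from a timer) to combine queued chunks
    // into a single append.
    public void Flush()
    {
        using (var combined = new MemoryStream())
        {
            while (_queue.TryDequeue(out var chunk))
                combined.Write(chunk, 0, chunk.Length);

            if (combined.Length == 0) return;

            using (var fs = new FileStream(_filename, FileMode.Append, FileAccess.Write, FileShare.None))
                combined.WriteTo(fs);
        }
    }
}
```

How often you call `Flush` trades latency for fewer, larger writes.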
Gerry Schmitz 15-Jun-21 13:55pm
You could look into compressing your data in memory first. Depending on the data, the savings can be significant. I do this with my "data resource" DLLs.

1 solution

Not without adding a fair amount of complication: you'd have to write a good chunk of "blank" data to your file, remember where that started, and then, when you want to write more, check whether there is enough space:
If so, seek to that point, write the data, and advance your "next" pointer.
If not, write your data plus another good chunk of "blank" data, and repeat.
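The steps above can be sketched as follows, using `FileStream.SetLength` to reserve "blank" space in large blocks. The class name, the 1 MB block size, and the in-memory `_next` pointer are all illustrative; in a real system the pointer must be persisted somewhere to survive restarts:

```csharp
using System;
using System.IO;

// Pre-allocates the file in large blocks and tracks where real data ends,
// so each append is a seek-and-write into already-reserved space.
class PreallocatedAppender : IDisposable
{
    private const long BlockSize = 1024 * 1024;   // arbitrary 1 MB reservation unit
    private readonly FileStream _fs;
    private long _next;   // the "next" pointer: where the next write begins

    public PreallocatedAppender(string filename)
    {
        _fs = new FileStream(filename, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None);
        _next = 0;
        _fs.SetLength(BlockSize);   // reserve the first block up front
    }

    public void Append(byte[] data)
    {
        // Grow the reservation in whole blocks when space runs out.
        while (_next + data.Length > _fs.Length)
            _fs.SetLength(_fs.Length + BlockSize);

        _fs.Seek(_next, SeekOrigin.Begin);
        _fs.Write(data, 0, data.Length);
        _next += data.Length;
    }

    // Where real data ends; everything beyond this is "blank" padding.
    public long DataLength => _next;

    public void Dispose() => _fs.Dispose();
}
```

Whether `SetLength` actually reduces fragmentation depends on the file system; it reserves the logical size, and the exclusive `FileShare.None` lock is what guards against the concurrent-edit problem mentioned below.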

And ... if anyone else edits the file, it all falls apart and you get data integrity problems.

I'd be tempted to keep each chunk in a separate file, and then "block them together" with a different thread or process at intervals.
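A sketch of that chunk-file approach; the file-naming scheme (a zero-padded sequence number, so lexical order equals write order) and the merge pass are illustrative choices, not a prescribed design:

```csharp
using System;
using System.IO;
using System.Linq;

// Each append becomes its own small file; a later pass concatenates
// them into the target file and deletes them.
static void WriteChunk(string dir, long sequence, byte[] data) =>
    File.WriteAllBytes(Path.Combine(dir, $"chunk-{sequence:D10}.bin"), data);

static void MergeChunks(string dir, string target)
{
    using (var output = new FileStream(target, FileMode.Append, FileAccess.Write, FileShare.None))
    {
        // D10 zero-padding makes lexical sort match numeric sequence order.
        foreach (var file in Directory.GetFiles(dir, "chunk-*.bin").OrderBy(f => f))
        {
            var bytes = File.ReadAllBytes(file);
            output.Write(bytes, 0, bytes.Length);
            File.Delete(file);   // safe only after the bytes are in the target
        }
    }
}
```

Run `MergeChunks` on a separate thread or process at intervals, as suggested; the writer and the merger never touch the same file at the same time as long as the merger only picks up completed chunk files.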

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
