Hi,
I have written a Windows service in C#.NET that keeps running for days at a time.
I access thousands of files in a folder using the code below:

string[] files = Directory.GetFiles(folder, "*.*" );

The folder contains 20,000 or more zipped files. Memory usage rises to about 5,000 KB every time this part of the code runs.

To keep memory from growing, how can I enumerate the folder's contents without actually loading all the file names into memory at once?

Thanks
Comments
BillWoodruff 3-Mar-14 23:27pm    
I think we need more detail about what you actually do when you access the folder. Using Directory.GetFiles is just going to return an array of strings; the question is: what are you doing with that array once you have it?

Are you releasing your reference to the array after you have used it?

1 solution

Review your architecture. That is the general rule for memory leaks in .NET. Random leaks are much less likely; most leaks are due to wrong design/architecture. Please see my past answers:
Best way to get rid of a public static List Causing an Out of Memory[^],
Memory management in MDI forms[^],
Memory leak in WPF DataBinding[^],
deferring varirable inside the loop can cuase memory leak?[^],
Garbage collectotion takes care of all the memory management[^].

For further ideas, I would need to know what you are going to do with 20,000 files. I cannot believe anyone needs to use them all at the same time, or even see them all. Hence, you need some kind of virtualization. For example, in a UI you might use paging: searching for files and loading file names only when the next portion is required. However, to me, the whole idea of having all these files looks quite questionable. Without knowing your purpose, I cannot suggest an alternative, but I am almost sure alternatives exist.
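One concrete way to avoid materializing all 20,000 names at once is Directory.EnumerateFiles (available since .NET 4.0), which returns a lazy sequence instead of an array. Here is a minimal sketch; the folder path is hypothetical:

```csharp
using System;
using System.IO;

class LazyEnumeration
{
    static void Main()
    {
        string folder = @"C:\data\zips"; // hypothetical path

        // EnumerateFiles returns a lazy IEnumerable<string>: entries are
        // fetched from the file system as you iterate, instead of being
        // materialized into one 20,000-element array as GetFiles does.
        foreach (string file in Directory.EnumerateFiles(folder, "*.zip"))
        {
            Console.WriteLine(file); // process one file name at a time
        }
    }
}
```

With GetFiles you pay for the whole array up front; with EnumerateFiles you hold only the current entry, so memory stays flat regardless of how many files the folder contains.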

—SA
 
Comments
Sergey Alexandrovich Kryukov 4-Mar-14 10:05am    
It does not say much. Why can't it be done in small chunks? Are there any criteria that require the full set of files, all 20,000 or so?
—SA
DepiyaReddy 4-Mar-14 16:25pm    
No such criteria to access all 20,000; like you said, I will be doing it in small chunks.
Sergey Alexandrovich Kryukov 4-Mar-14 16:29pm    
Then you can do it in small chunks... Will you accept the answer formally then (green "Accept" button)?
—SA
DepiyaReddy 4-Mar-14 18:37pm    
Can you give an example of how I can do it in chunks?
Sergey Alexandrovich Kryukov 4-Mar-14 21:35pm    
As I said, it makes little sense without knowing your purpose. First of all, I would generate the files in chunks, maybe in different directories. I hope you understand how that could eliminate having too much data at a time. But I don't want to waste time on speculation unless you share all the ultimate goals of it. I would rather think about eliminating the mass creation of files altogether.
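One possible shape for the chunked processing discussed here (a sketch only; the chunk size, *.zip filter, and HandleChunk placeholder are assumptions, not the poster's actual code):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class ChunkedProcessing
{
    // Walk the directory lazily and hand the file names to HandleChunk
    // in fixed-size batches, so at most chunkSize names are held in
    // memory at any one time.
    static void ProcessInChunks(string folder, int chunkSize)
    {
        var chunk = new List<string>(chunkSize);
        foreach (string file in Directory.EnumerateFiles(folder, "*.zip"))
        {
            chunk.Add(file);
            if (chunk.Count == chunkSize)
            {
                HandleChunk(chunk);
                chunk.Clear(); // drop the names before filling the next batch
            }
        }
        if (chunk.Count > 0)
            HandleChunk(chunk); // leftover partial batch
    }

    static void HandleChunk(List<string> files)
    {
        // Placeholder: unzip, parse, or otherwise process each file here.
        Console.WriteLine("Processing " + files.Count + " files");
    }

    static void Main()
    {
        ProcessInChunks(@"C:\data\zips", 500); // hypothetical path and size
    }
}
```

Because Directory.EnumerateFiles is lazy and the list is cleared between batches, peak memory depends on chunkSize, not on the 20,000-file total.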
—SA

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)



CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900