Duplicate Files Finder

eRRaTuM, 15 Dec 2008, CPOL
A utility to find any duplicate file in your hard drives using MD5 hashing.


[Screenshot: Search results]

[Screenshot: File deleted]

Once a year, I take on the terrific job of cleaning up the files I have created or downloaded onto my drives. The last time I tried, it was such a tedious task that I decided to semi-automate it. I needed a free utility that could find duplicate files, but none of those I found matched my needs, so I decided to write my own.


A CRC calculation method is available here; instead, I use the MD5 hashing provided by the standard libraries. I added an event to the MD5 computation so as to report hashing progress: a separate thread reads the stream position while the MD5 method is reading the same stream.
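The idea can be sketched as follows. This is a minimal illustration of the technique described above, not the article's exact class (the name `HashWithProgress` is mine): a background thread polls `Stream.Position` while `MD5.ComputeHash` consumes the same stream.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading;

static class HashWithProgress
{
    // Computes the MD5 of a stream while a watcher thread reports
    // progress by polling the stream's Position property.
    public static byte[] Compute(Stream stream, Action<double> onProgress)
    {
        long total = Math.Max(1, stream.Length); // avoid division by zero
        bool done = false;
        var watcher = new Thread(() =>
        {
            while (!done)
            {
                onProgress((double)stream.Position / total);
                Thread.Sleep(100); // poll ten times a second
            }
        }) { IsBackground = true };
        watcher.Start();
        try
        {
            using (var md5 = MD5.Create())
                return md5.ComputeHash(stream); // reads the stream to its end
        }
        finally
        {
            done = true;
            watcher.Join();
            onProgress(1.0); // final notification
        }
    }
}
```

For a 10 GB file this turns a silent multi-minute hash into one that can drive a progress bar.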

Using the code

The utility uses two main classes, DirectoryCrawler and Comparers; their use is straightforward :). Notice that instead of iterating through a list list.Count × list.Count times, DuplicateFinder first populates a Hashtable with <size, count> pairs. Once it is populated, all files whose size occurs only once are dropped from consideration, which is very much faster:

int len = filesToCompare.Length;
List<long> alIdx = new List<long>();
System.Collections.Hashtable HLengths = new System.Collections.Hashtable();
// Count how many files share each length.
foreach (FileInfo fileInfo in filesToCompare)
{
    if (!HLengths.Contains(fileInfo.Length))
        HLengths.Add(fileInfo.Length, 1);
    else
        HLengths[fileInfo.Length] = (int)HLengths[fileInfo.Length] + 1;
}
// A length that occurs only once cannot belong to a duplicate; mark it.
foreach (DictionaryEntry hash in HLengths)
{
    if ((int)hash.Value == 1)
    {
        alIdx.Add((long)hash.Key);
        setText(stsMain, string.Format("Will remove file with size {0}", hash.Key));
    }
}
// Keep only files whose length occurs more than once.
FileInfo[] fiZ = new FileInfo[len - alIdx.Count];
int j = 0;
for (int i = 0; i < len; i++)
    if (!alIdx.Contains(filesToCompare[i].Length))
        fiZ[j++] = filesToCompare[i];
return fiZ;

Points of interest

  • (Done) Optimize file moving; the UI may be unresponsive while moving big files :(
  • (Useless, my MD5 is better ^_^) Add an option to choose between CRC32 and MD5 hashing.
  • Maybe use an XML configuration file. At this time, moving duplicate files to D:\DuplicateFiles (which is hard coded, viva Microsoft!) and skipping that folder during scanning is sufficient for me.
  • Don't forget that your posts make POIs.
  • (Done) Code an event-enabled MD5 hashing class that reports hashing progress; imagine hashing a 10 GB file!


  • v0.2
    • Optimized duplicate retrieval (duplicate sizes and duplicate hashes).
    • Added Move to Recycle Bin.
    • Added a file-size criterion.
    • Files-to-delete info is updated on every check/uncheck in the list view.
    • Added colors and fonts to the UI.
    • Debug-enabled sources (#if DEBUG synchronous, #else threaded).
    • Used List<FileInfo> and List<string[]> instead of ArrayLists.
    • MD5 hashing is used instead of CRC32 (thanks, supercat9).
    • Added a Skip Source Folder option.
    • Added Drop SubFolder.
    • Some optimizations...
  • v0.1
    • First release. Waiting for bug reports :)


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Chief Technology Officer
Morocco Morocco
During his studies, eRRaTuM discovered C/C++ and appreciated it.
When he met Oracle products in his job, he fell in love.
He uses C# .NET and MS SQL.

He created a "F.R.I.E.N.D.S"-like soap movie, melting all of the above.
He then went back to university; after taking courses in artificial vision and imagery, he finished his studies with a successful license plate recognition project.


Comments and Discussions

Question: Caching? (GregSawin, 1-Sep-08)
Answer by eRRaTuM, 15-Sep-08:
GregSawin wrote:
Could the hashes be saved in some cache file so they don't have to be re-computed every time

Of course; just like minimalist antiviruses, the hash could be saved in an NTFS stream when available, or somewhere else if not. But you would need to be notified when the file content changes, and then recalculate the hash...

It certainly IS feasible as described above... but is it really the goal of the app? ;)
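A minimal sketch of such a cache (my illustration, not the app's code; it is in-memory rather than stored in an NTFS stream, and it invalidates by size and last-write time instead of change notification):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

class HashCache
{
    // Cache entry: the hash plus the file metadata it was computed against.
    private readonly Dictionary<string, Tuple<DateTime, long, byte[]>> cache
        = new Dictionary<string, Tuple<DateTime, long, byte[]>>();

    public byte[] GetMd5(string path)
    {
        var fi = new FileInfo(path);
        Tuple<DateTime, long, byte[]> entry;
        // Reuse the cached hash only if size and timestamp are unchanged.
        if (cache.TryGetValue(path, out entry)
            && entry.Item1 == fi.LastWriteTimeUtc
            && entry.Item2 == fi.Length)
            return entry.Item3;

        byte[] hash;
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
            hash = md5.ComputeHash(stream);
        cache[path] = Tuple.Create(fi.LastWriteTimeUtc, fi.Length, hash);
        return hash;
    }
}
```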

GregSawin wrote:
it could be sped up by computing the hashes for multiple files at the same time

Do you mean using a hashing thread pool? Could you explain more?

:: YOU make history ::


Last Updated 15 Dec 2008
Article Copyright 2008 by eRRaTuM
Everything else Copyright © CodeProject, 1999-2015