I have a folder containing 1000+ document folders; each doc folder has 1 or more versions of the document (can be excel, word, html types).

How can I use PowerShell to copy only out the most current version from the sub-folders into a single folder so that I can perform PowerGREP searches?

After I get the initial folder of files; is there an easy way to maintain it?

Any help is greatly appreciated!

What I have tried:

Searched Google & looked at this thread on Code Project: File copy using Microsoft Powershell
Updated 3-Dec-18 2:02am
Mohibur Rashid 27-Nov-18 0:14am    
What is the rule to identify the latest file?
Member 14068043 27-Nov-18 8:32am    
The rules are these:
-if the folder is empty, continue (this means a new doc is being created)
-if the folder has 1 file in it, copy it out. I don't have a begin date.
-if the folder has multiple files, copy out the one that was last saved.
-if the folder is labeled "Archived" - skip that folder & its sub-folders.
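The rules above could be sketched in PowerShell roughly like this. This is only a sketch under assumptions: the root and destination paths are placeholders, and "Archived" folders are assumed to be identifiable by name at any depth.

```powershell
# Sketch only: copy the newest file out of each doc folder,
# skipping any folder named "Archived" and everything under it.
$source      = 'C:\Docs'       # assumed root of the 1000+ doc folders
$destination = 'C:\DocsFlat'   # assumed single output folder

Get-ChildItem -LiteralPath $source -Directory -Force |
    Where-Object { $_.Name -ne 'Archived' } |
    ForEach-Object {
        # All files under this doc folder, excluding "Archived" sub-folders
        Get-ChildItem -LiteralPath $_.FullName -File -Recurse -Force |
            Where-Object { $_.DirectoryName -notmatch '\\Archived(\\|$)' } |
            Sort-Object -Property LastWriteTime -Descending |
            Select-Object -First 1   # empty folders simply yield nothing
    } |
    Copy-Item -Destination $destination
```

A folder with a single file is handled by the same "newest file" selection, so it needs no special case.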

There is no consistency in the naming of the files. Some current files have a timestamp appended to the end of the filename, but most do not; that convention is only a recent addition.

I could manually pull a copy of the current files (cringing), then weekly use Windows Search to generate a list of new file names to pull with PowerShell? I would have to manually remove the old ones.

But would that be easier to figure out? Thank you SOOO much for taking a look at this :-)
Mohibur Rashid 27-Nov-18 17:50pm    
Have you considered comparing the timestamp of the last change?
Member 14068043 27-Nov-18 18:46pm    
Can that be used if the file name itself is different?
Mohibur Rashid 27-Nov-18 20:07pm    
That would be very difficult if the file names differ, especially in the case of binary files. Have you thought about a file-management system (I know your current thought is PowerShell) where the user uploads the latest file under a key? Then it wouldn't matter what the file name is, or in some cases even if the file type changes.

1 solution

A co-worker found this helpful site: Topic: Return date of most recently modified file in each sub-dir |[^]

He wrote the following solution. It doesn't exclude the "Archived" folders, but it does traverse the sub-folders and copy out the most current file from each:

# Identify the folder you want to search
$projectsFolder = 'C:\Users\swidene\Documents\DELETE\ACE Enhancement'

# For each subfolder of the search folder, find its most recently
# modified file, then copy those files into the backup folder
Get-ChildItem -LiteralPath $projectsFolder -Directory -Force | ForEach-Object {
    # Get all files under this subfolder, sorted by last modification
    # date in descending order, and return only the newest one
    Get-ChildItem -LiteralPath $_.FullName -File -Recurse -Force |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1
} | Copy-Item -Destination 'C:\Users\swidene\Documents\DELETE\Backup'
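To maintain the flat folder afterwards, one hedged approach (the paths are the same assumed examples as above) is to re-run the same scan on a schedule but only copy a file when the destination has no copy yet, or when the source file is newer:

```powershell
$projectsFolder = 'C:\Users\swidene\Documents\DELETE\ACE Enhancement'
$backup         = 'C:\Users\swidene\Documents\DELETE\Backup'

Get-ChildItem -LiteralPath $projectsFolder -Directory -Force | ForEach-Object {
    # Newest file in this doc folder, as before
    Get-ChildItem -LiteralPath $_.FullName -File -Recurse -Force |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1
} | ForEach-Object {
    $existing = Join-Path $backup $_.Name
    # Copy only if the flat folder has no copy yet, or this one is newer
    if (-not (Test-Path -LiteralPath $existing) -or
        $_.LastWriteTime -gt (Get-Item -LiteralPath $existing).LastWriteTime) {
        Copy-Item -LiteralPath $_.FullName -Destination $backup
    }
}
```

Files whose source document was deleted or archived would still linger in the flat folder, so an occasional manual clean-up (as the questioner anticipated) is still needed.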

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
