Good day all,

I've been working on a project at work that transfers a user's files from their old laptop or PC to their new one. WCF is used to transfer the files over a cross-over cable or another network connection. We've been running calibration tests to predict the estimated copy time from the number of files and the size of the data (also accounting for hardware).

Something we've noticed is that transfer speeds are much slower the first time the tool is run. After that, it's almost as though some caching has taken place in the background, because it gets through the same data (or different data) in a much shorter time. Even rebooting the machine doesn't remove the discrepancy; only rebuilding it from scratch does. Does anyone have any good thoughts on what might be occurring here, and how to avoid it? 99% of users will only ever run the tool once, so they will always see the slower speed. What I'd like to know is whether there is some cached memory I can release to reset the machine to a "first-run" state, or whether there is something else I can do.
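For what it's worth, one idea I've been toying with for the calibration runs (not something the tool does today, and the class and method names below are just mine for illustration) is to read the source files with unbuffered I/O so the Windows file cache is bypassed and every calibration run behaves like a cold, first-run read. The (FileOptions)0x20000000 cast is the Win32 FILE_FLAG_NO_BUFFERING flag, which FileStream hands through to CreateFile:

using System;
using System.Diagnostics;
using System.IO;

static class ColdReadCalibration
{
    // FILE_FLAG_NO_BUFFERING isn't exposed by the FileOptions enum, but the
    // raw value is accepted by FileStream and passed to CreateFile unchanged.
    const FileOptions FileFlagNoBuffering = (FileOptions)0x20000000;

    // Reads one file while bypassing the system file cache and returns MB/s.
    // With no-buffering I/O, read sizes must be a multiple of the volume
    // sector size (and strictly the buffer should be sector-aligned too),
    // so a 1 MB chunk is used as a safe multiple on typical volumes.
    public static double MeasureColdReadMbPerSec(string path)
    {
        byte[] buffer = new byte[1024 * 1024];
        long totalBytes = 0;
        Stopwatch sw = Stopwatch.StartNew();

        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                              FileShare.Read, 1, // minimal managed buffering
                                              FileFlagNoBuffering | FileOptions.SequentialScan))
        {
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                totalBytes += read;
        }

        sw.Stop();
        return (totalBytes / (1024.0 * 1024.0)) / sw.Elapsed.TotalSeconds;
    }
}

That would only make the measurements consistent, of course; it wouldn't make a user's genuine first run any faster, which is what I'm ultimately after.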



Thanks for your time, people,



Jib
Comments
Sergey Alexandrovich Kryukov 5-Jul-11 5:52am    
How come you think this needs a new project? What's wrong with existing software?
--SA
Jibrohni 6-Jul-11 4:02am    
We considered using the Windows utility, but writing our own tool allows us to customise it specifically for our business needs (it handles our particular file structures and file types differently). It also allows us to change it further down the line if we need to.
Sergey Alexandrovich Kryukov 6-Jul-11 15:07pm    
Thank you for the explanation, but I still don't see the benefit of your project. Perhaps this transfer is application-specific to a certain production environment? In the general case, the users should simply segregate their own files and transfer them all.
--SA
Jibrohni 6-Jul-11 17:14pm    
We're a global company with 50,000+ users world-wide. In an ideal world I'd agree with you, but sadly I suspect not even 1% of our users are capable of managing all of their data perfectly (or at all). They keep their main work files in their "Documents" folder, but many of them use bespoke applications that write necessary files elsewhere. We have a huge file exclusion list, along with folder exclusions, and we handle .pst files very differently from regular user files, re-attaching them after the data migration. It's written so that users don't have to be torn away from their day jobs to deal with this task.
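Just to illustrate the kind of filtering I mean, it boils down to something like the sketch below (the class name and list contents are made up for the example; the real exclusion lists are far longer and maintained separately):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class MigrationFilter
{
    // Illustrative exclusion lists only -- the real lists are much larger.
    static readonly HashSet<string> ExcludedExtensions =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { ".tmp", ".log" };

    static readonly string[] ExcludedFolders = { @"\AppData\Local\Temp\" };

    // .pst files are picked up separately and re-attached after the migration.
    public static bool IsPst(string path)
    {
        return string.Equals(Path.GetExtension(path), ".pst",
                             StringComparison.OrdinalIgnoreCase);
    }

    // Decides whether an ordinary user file should be copied across.
    public static bool ShouldCopy(string path)
    {
        if (IsPst(path))
            return false; // handled by the separate .pst route
        if (ExcludedExtensions.Contains(Path.GetExtension(path)))
            return false;
        if (ExcludedFolders.Any(f =>
                path.IndexOf(f, StringComparison.OrdinalIgnoreCase) >= 0))
            return false;
        return true;
    }
}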
Sergey Alexandrovich Kryukov 8-Jul-11 0:46am    
Thank you for the explanation.
--SA
