I am using the copy /b command to append two different files into one file. The problem I am experiencing is that the command does not work if the first file is larger than 1 GB. I ran into this myself and cannot find any references or help about it on the net. This is what happens; I am using the command like this:

copy /b file1+file2 outputfile

If file1 is larger than 1 GB, the command reports "1 file(s) copied" but the file is not copied. I have searched a lot, but nobody seems to have hit this issue before. Please, I need help with this: how can I solve it, or is there any software that can do the same work on larger files? If anyone needs other information to answer this question, please feel free to ask. Regards
Posted 28-Jun-13 9:07am
Edited 28-Jun-13 9:19am
Ron Beyer
Comments
Zoltán Zörgő at 28-Jun-13 14:26pm
   
It might happen. copy comes from the really old command.com internal command set. I can imagine there is a variable in the code that the Microsoft developers forgot to "update" to a 64-bit one; you might have found a bug. I suggest you report it to them.
But my question is: can't you use something else?
hinaheed at 29-Jun-13 1:00am
   
Yes, I can use something else, and I would be glad to switch, but only if I knew of something. If you know any alternative for doing the same thing, kindly share. :)
Zoltán Zörgő at 29-Jun-13 1:39am
   
Well, since this is a programming forum, I suggest you write your own code.
In C# you can start here: http://stackoverflow.com/questions/3556755/how-to-merge-efficiently-gigantic-files-with-c-sharp
If you don't want to use .NET and compile code, just use VBS: http://code.huypv.net/2013/05/vbs-merge-binary-files.html
You could even use PowerShell: http://stackoverflow.com/questions/1783554/fast-and-simple-binary-concatenate-files-in-powershell
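[Editor's note] A minimal C# sketch along the lines of the first link — file names are illustrative, not from the thread. Stream.CopyTo streams in chunks, so it is not limited by file size the way a single in-memory read would be:

```csharp
using System;
using System.IO;

class FileConcat
{
    // Append each input file, in order, to the output file (binary-safe;
    // CopyTo streams through an 81920-byte buffer by default, so files
    // larger than 1 GB are no problem).
    static void Concat(string outputPath, params string[] inputPaths)
    {
        using (var output = new FileStream(outputPath, FileMode.Create, FileAccess.Write))
        {
            foreach (var path in inputPaths)
            {
                using (var input = new FileStream(path, FileMode.Open, FileAccess.Read))
                {
                    input.CopyTo(output);
                }
            }
        }
    }

    static void Main()
    {
        File.WriteAllText("file1", "hello ");
        File.WriteAllText("file2", "world");
        Concat("outputfile", "file1", "file2");
        Console.WriteLine(File.ReadAllText("outputfile")); // prints "hello world"
    }
}
```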
hinaheed at 29-Jun-13 2:04am
   
Actually, yes, I have embedded this command into my C# application, and it would be more appropriate to do it directly in code instead. The first link looks helpful to me; I'll share my results after trying it. Thanks, regards, and stay blessed :)
hinaheed at 29-Jun-13 3:04am
   
I tried this code:

private void button1_Click(object sender, EventArgs e)
{
    // using-blocks make sure both streams are closed even if copying fails
    using (Stream input = new FileStream("C:\\test\\zip.zip", FileMode.Open))
    using (Stream output = new FileStream("C:\\test\\test.mp4", FileMode.Append))
    {
        CopyStreamToStream(output, input);
    }
}

void CopyStreamToStream(Stream dest, Stream src)
{
    // experiment with the best buffer size; 65536 often performs well
    byte[] buffer = new byte[65536];
    int bytesRead;

    // copy everything
    while ((bytesRead = src.Read(buffer, 0, buffer.Length)) > 0)
    {
        dest.Write(buffer, 0, bytesRead);
    }
}

It does exactly the same work as the copy /b command, but the same issue appears here too: if the output file is larger than 1 GB, I cannot open it with WinRAR to view the embedded file, although I can for files under 1 GB. I think the problem is not with the copy /b command or this code, because the file size grows exactly in proportion, so the problem may be in opening the files with WinRAR. It seems I can embed a file but cannot extract it later :( I'm stuck.
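[Editor's note] One way to confirm the hypothesis above — that the bytes are appended intact and only WinRAR balks — is to compare a hash of the appended zip with the tail of the output file. A small self-contained sketch (file names and sizes are illustrative stand-ins for the real >1 GB case):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

class TailCheck
{
    // Returns true if the last bytes of outputPath are byte-for-byte identical
    // to zipPath, i.e. the append itself did not corrupt anything.
    static bool TailMatches(string outputPath, string zipPath)
    {
        long zipLength = new FileInfo(zipPath).Length;
        byte[] tail = new byte[zipLength];
        using (var f = new FileStream(outputPath, FileMode.Open, FileAccess.Read))
        {
            f.Seek(-zipLength, SeekOrigin.End);   // jump to where the zip should start
            int read = 0;
            while (read < zipLength)
                read += f.Read(tail, read, (int)zipLength - read);
        }
        using (var sha = SHA256.Create())
            return sha.ComputeHash(tail).SequenceEqual(sha.ComputeHash(File.ReadAllBytes(zipPath)));
    }

    static void Main()
    {
        File.WriteAllBytes("big.bin", new byte[1000]);            // stands in for the large file
        File.WriteAllBytes("payload.zip", new byte[] { 1, 2, 3 }); // stands in for the zip
        using (var o = new FileStream("combined.bin", FileMode.Create))
            foreach (var p in new[] { "big.bin", "payload.zip" })
                using (var i = File.OpenRead(p)) i.CopyTo(o);

        Console.WriteLine(TailMatches("combined.bin", "payload.zip")); // prints "True"
    }
}
```

If this prints True for the real files, the copy is fine and the problem lies entirely in how the archive is opened afterwards.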

Solution 1

I think you are doing something wrong: I just tried it by copying a 1.01 GB file into a "spare" folder twice and renaming the copies to file1 and file2.
Then in a command prompt I changed to the appropriate folder and entered:
copy /b file1+file2 outputfile
It took a little time, but it worked fine: I ended up with the two 1.01 GB files and one 2.02 GB file.

So what am I doing that is different from you?

Solution 2

You are correct in what you are saying: yes, of course the file size increases properly. My actual problem is that I am taking a zip archive with some files in it and appending it to a file that is more than 1 GB, like:
copy /b file1(large file)+zipfile outputfile
If file1 is less than 1 GB, I can open outputfile using WinRAR and view the files that were previously in the zip archive.
But if file1 is greater than 1 GB and I try to open outputfile with WinRAR, it prompts "No archives found". That is my major problem.

Solution 3

Well, I figured out that the problem I was thinking of wasn't even the problem: there is no problem with the copy /b command and no size-limit issue. It was WinRAR. So I found a way to work without using WinRAR. Thanks for the replies :)
Kind Regards
Comments
Zoltán Zörgő at 29-Jun-13 11:32am
   
Then you have not shared a really important aspect with us.
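[Editor's note] The thread does not say how the author ended up "working without WinRAR", but one plausible approach can be sketched: if the embedding side records the original file's length, the appended archive can be copied out from that offset and opened with .NET's own System.IO.Compression (all names here are illustrative). Loading the tail into a MemoryStream is fine for small payloads as in this demo; for very large ones, copy to a temporary file instead:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class EmbeddedZip
{
    // Copy everything after prefixLength bytes out of combinedPath and open it
    // as a zip archive; prefixLength is the size of the original large file,
    // which the embedding side must record somewhere.
    static ZipArchive OpenEmbeddedZip(string combinedPath, long prefixLength)
    {
        var ms = new MemoryStream();
        using (var f = File.OpenRead(combinedPath))
        {
            f.Seek(prefixLength, SeekOrigin.Begin);
            f.CopyTo(ms);
        }
        ms.Position = 0;
        return new ZipArchive(ms, ZipArchiveMode.Read);
    }

    static void Main()
    {
        // Build a tiny zip, append it to some stand-in bytes, then read it back.
        using (var z = ZipFile.Open("inner.zip", ZipArchiveMode.Create))
        using (var w = new StreamWriter(z.CreateEntry("note.txt").Open()))
            w.Write("hello");

        File.WriteAllBytes("big.bin", new byte[500]); // stands in for the >1 GB file
        using (var o = File.Create("combined.bin"))
            foreach (var p in new[] { "big.bin", "inner.zip" })
                using (var i = File.OpenRead(p)) i.CopyTo(o);

        using (var archive = OpenEmbeddedZip("combined.bin", 500))
        using (var r = new StreamReader(archive.GetEntry("note.txt").Open()))
            Console.WriteLine(r.ReadLine()); // prints "hello"
    }
}
```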

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Last Updated 29 Jun 2013