
How can I merge two files into one file, given that each file contains one column of data, without reading the values into variables and writing them out one by one? The ordinary procedure is to read the contents of each file and write them into a third file, which takes too long when dealing with large data. I mean merging the two columns side by side in one file. Any creative ideas?

Example Inputs : file 1

1
2
3
4
5
6

Example Inputs : file 2

6
5
4
3
2
1

Desired Output : "resulted file"

1  6
2  5
3  4
4  3
5  2
6  1


What I have tried:

#include <stdio.h>
#include <string.h>

FILE *fptr, *fptr1, *fptr2;
char string1[2000], string2[2000];

fptr  = fopen("file1.txt", "r");
fptr1 = fopen("file2.txt", "r");
fptr2 = fopen("result.txt", "w");

while (fgets(string1, sizeof string1, fptr) &&
       fgets(string2, sizeof string2, fptr1)) {
    string1[strcspn(string1, "\n")] = '\0';  /* drop newline so both columns stay on one line */
    fputs(string1, fptr2);
    fputs("\t", fptr2);
    fputs(string2, fptr2);
}


note: the two files are the same in length
Updated 29-May-16 19:17pm
Maciej Los 29-May-16 17:25pm    
And what have you tried?
aymanshebl 29-May-16 17:28pm    
Well, I tried to scanf the values from file one and file two and write the data into another file, but I want a more effective, speedier way than what I am doing.
Sergey Alexandrovich Kryukov 29-May-16 17:40pm    
Who are you talking to? To reply to the question by Maciej Los, you have to reply to his comment, not to your own post; then he will get a notification of your comment.

Apparently, you did not understand the question "What have you tried?" Here is a more detailed formulation of that question:
What have you tried so far?

What makes you think that your approach was too slow? With such small files, you can hardly notice the time spent on this operation.

aymanshebl 29-May-16 17:47pm    
Take it easy, man.

Honestly, I replied to him, not to my own post.

I gave a small example; what I am actually doing is more than 50,000 reads in each file.

Your help will be appreciated.

Sergey Alexandrovich Kryukov 29-May-16 17:54pm    
Your manner of ignoring comments and leaving them unnoticed by the people you are talking to won't get you anywhere. Why not pay a bit of attention? This is really simple.
I repeat: technically, you replied to your own post.

I guess you will probably get some help when you write what you have tried. Chances are, you already did it all correctly. The whole idea of using all those files could perhaps be replaced with something more reasonable, but as long as you really need to do what you do, you can hardly improve it much.


1 solution

You've tagged your question 'c++', 'c' and 'MFC', yet your code is clearly C.

There are a couple of worries with this sort of question. You say there are more than 50,000 lines in each file; are we to assume they are all single-character lines like your examples (in which case, why have you defined 2000-character arrays)?

It could take a while and some experimentation to make it faster. And what do you define as 'fast' or 'slow'? (You must realise a lot of I/O is happening.)

# One approach (that I would use) may be to use C++, std::ifstream and std::list: read the files into memory and then output them using iterators and std::ofstream. The intent is that ifstream and ofstream provide some buffering, reducing the amount of blocking on disk I/O.

# As a 'step back' from the approach above, I wonder whether ifstream/ofstream are actually faster than fgets/fputs ... you could simply upgrade your code from C to C++ and try it.
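The first approach above might look like this. This is only a sketch; the function name merge_columns and the return-count convention are mine, not from the question:

```cpp
#include <fstream>
#include <list>
#include <string>

// Merge two single-column files side by side, tab-separated.
// Returns the number of merged lines written.
long merge_columns(const char *file1, const char *file2, const char *outfile) {
    std::ifstream in1(file1), in2(file2);
    std::list<std::string> col1, col2;
    std::string line;

    // Read each input fully into memory first, as described above.
    while (std::getline(in1, line)) col1.push_back(line);
    while (std::getline(in2, line)) col2.push_back(line);

    std::ofstream out(outfile);
    long count = 0;
    auto it1 = col1.begin();
    auto it2 = col2.begin();
    // The question says both files are the same length; stop at the
    // shorter one anyway, to be safe.
    for (; it1 != col1.end() && it2 != col2.end(); ++it1, ++it2, ++count)
        out << *it1 << '\t' << *it2 << '\n';
    return count;
}
```

You would call it as merge_columns("file1.txt", "file2.txt", "result.txt") with whatever your real file names are. Whether buffering through streams beats fgets/fputs on your data is exactly what you should measure rather than assume.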

There's no point in claiming 'faster' or 'slower' unless you're measuring, though, so you need to record start and end times for each of your attempts and compare them.
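One way to take those start/end measurements is a small helper around std::chrono::steady_clock. The name time_it is mine, not part of any library:

```cpp
#include <chrono>

// Run callable f once and return the elapsed wall-clock time in milliseconds.
template <typename F>
long long time_it(F f) {
    auto start = std::chrono::steady_clock::now();
    f();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();
}
```

Usage would be something like long long ms = time_it([]{ /* run one merge attempt here */ });, repeated for each variant you want to compare. steady_clock is preferred over system_clock for intervals because it never jumps backwards.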
aymanshebl 30-May-16 14:41pm    
thanks Garth

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

CodeProject, 20 Bay Street, 11th Floor Toronto, Ontario, Canada M5J 2N8 +1 (416) 849-8900