Hi All,
 
I am developing a file download utility using MFC. The utility uses the CHttpFile class to download the file in chunks of 5096 bytes. Sometimes the download fails because of a network outage. After network connectivity is restored, I want the utility to resume the download from where it was interrupted. The utility knows how many bytes it downloaded previously and calls the Seek() method to reposition the file pointer before resuming. The problem is that after calling Seek(), the next Read() call takes a very long time to return. The delay is longer if the previously downloaded portion is around 20 MB, and it grows proportionally with the file size. Any idea how to resolve this?
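In simplified form, the resume logic looks roughly like this (not my exact code; error handling is omitted and the URL is just a placeholder):

#include <afxinet.h>

void ResumeDownload(LONGLONG nBytesAlreadyRead)
{
    CInternetSession session(_T("FileDownloader"));

    // OpenURL returns a CHttpFile* for http:// URLs (placeholder URL)
    CHttpFile* pFile = (CHttpFile*)session.OpenURL(
        _T("http://example.com/bigfile.bin"), 1,
        INTERNET_FLAG_TRANSFER_BINARY | INTERNET_FLAG_RELOAD);

    // Skip the part that was downloaded before the outage
    pFile->Seek(nBytesAlreadyRead, CFile::begin);

    BYTE buf[5096];
    UINT nRead;
    // The first Read() after Seek() is the call that takes minutes
    while ((nRead = pFile->Read(buf, sizeof(buf))) > 0)
    {
        // append buf[0..nRead) to the local file and update the byte count in the registry
    }

    pFile->Close();
    delete pFile;
    session.Close();
}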
 
NOTE: When resuming the download, I open a new internet session. Could this be the culprit?
 
Thanks in advance...
Posted 20-Sep-12 4:54am by Sunil P V
Comments
JackDingler at 20-Sep-12 11:12am
   
It sounds like the problem is likely with the web server.
 
Run netmon or a similar utility to track down where the lag is occurring.
Sergey Alexandrovich Kryukov at 20-Sep-12 12:07pm
   
This is more likely than other reasons. Good advice.
--SA
pasztorpisti at 20-Sep-12 12:04pm
   
I have no experience with CHttpFile, but I've done file downloads over HTTP with my own socket code. I did the job by specifying a Range header in my HTTP request: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35.2
The Range header isn't supported by all HTTP servers, so your GET request might return the whole file even if you specified a range, but you can check the status code: the server returns 206 if the Range header was honoured, and 200 with the full content of the file otherwise.
Just guessing, but one possible problem is that the server doesn't support the Range header and your CHttpFile object simulates the seek by downloading (and discarding) everything up to that position.
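With CHttpFile it could look roughly like this (just an untested sketch based on the docs, since I've only done this with raw sockets; the host and path are placeholders):

#include <afxinet.h>

// Try to resume a download at byte offset nStart using an HTTP Range header.
// Returns the number of bytes read from nStart onwards (0 if the server ignored the range).
LONGLONG ReadFromOffset(LONGLONG nStart)
{
    CInternetSession session(_T("FileDownloader"));
    CHttpConnection* pConn = session.GetHttpConnection(_T("example.com"));    // placeholder host
    CHttpFile* pFile = pConn->OpenRequest(CHttpConnection::HTTP_VERB_GET,
                                          _T("/bigfile.bin"));                // placeholder path

    CString strRange;
    strRange.Format(_T("Range: bytes=%I64d-\r\n"), nStart);
    pFile->AddRequestHeaders(strRange);
    pFile->SendRequest();

    DWORD dwStatus = 0;
    pFile->QueryInfoStatusCode(dwStatus);

    LONGLONG nTotal = 0;
    if (dwStatus == HTTP_STATUS_PARTIAL_CONTENT)   // 206: the server honoured the range
    {
        BYTE buf[5096];
        UINT nRead;
        while ((nRead = pFile->Read(buf, sizeof(buf))) > 0)
        {
            // append buf[0..nRead) to the local file
            nTotal += nRead;
        }
    }
    // else: 200 means the range was ignored and the whole file would come back

    pFile->Close();
    delete pFile;
    delete pConn;
    return nTotal;
}

If the status comes back as 200, the only options are to discard the first nStart bytes or restart the download from scratch.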
sunilkpv at 21-Sep-12 1:14am
   
Well, sorry if I wasn't clear. These are my conditions:
- The file is hosted on a web server.
- The server doesn't have any logic to control the download parameters.
- The download client connects to the URL of the hosted file and reads 5096 bytes from it every second.
- If the network connection is lost, the client utility saves the total bytes read to the registry.
- Once the network connection is back, the client resumes downloading from the point where it lost the connection. The logic I have implemented for resuming is:
1) Get from the registry the number of bytes (say N) read previously.
2) Call CHttpFile::Seek() to seek to position N in the source file.
3) Keep calling CHttpFile::Read() to continue the download from position N onwards.

At step 3, CHttpFile::Read() takes a long time to read 5096 bytes. If N is around 20 MB, CHttpFile::Read() takes even longer, around 10 minutes.

This is the issue. How can this be resolved? Any pointers?
sunilkpv at 21-Sep-12 1:16am
   
Also, to mention, the file to be downloaded is around 50 MB.
