I have to work on an application, like a Website Downloader, which should allow the user to download a full website (all of the site's files) to a specified location on disk via its URL. Currently I am only able to get the HTML of the home page using the WebRequest and WebResponse classes. I have tried another solution using

WebClient Client = new WebClient();
Client.DownloadFile("http://www.examplesite.com/Default.aspx", "D:\\Downloader\\Default.aspx");

But here I have to save the file under the same name at a physical location, and I am still unable to download the CSS and JS being used on the page. I also want my application to include a built-in browser, so the downloaded files can be viewed in offline mode.

Please help me in this regard. Thanks in advance for your kind guidance.
Updated 17-Nov-13 7:54am
Comments
Richard MacCutchan 17-Nov-13 12:31pm    
There is a reason that you cannot do this.

1 solution

Strictly speaking, a "full website" is a concept that is not valid for all sites. Some pages are purely interactive or random, so even if they can somehow be copied, the copy could be different each time; in other cases, a page can only be generated as a result of the user's activity. Examples of such pages are not even rare; one typical case is games.

As to the rest of it, what you need are Web-scraping techniques:
http://en.wikipedia.org/wiki/Web_scraping
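The basic scraping approach can be sketched in C#: download the page's HTML, pull out the `href`/`src` attribute values, and fetch each referenced resource next to the saved page. This is a minimal sketch under stated assumptions, not production code: the `SavePage` name is mine, and the regex is a rough stand-in for a real HTML parser such as HtmlAgilityPack.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

class PageScraper
{
    // Downloads a page plus the CSS/JS/image resources it references into targetDir.
    public static void SavePage(string pageUrl, string targetDir)
    {
        Directory.CreateDirectory(targetDir);
        using (var client = new WebClient())
        {
            string html = client.DownloadString(pageUrl);
            File.WriteAllText(Path.Combine(targetDir, "index.html"), html);

            // Capture quoted href/src attribute values (rough; misses unquoted attributes).
            var matches = Regex.Matches(html,
                "(?:href|src)\\s*=\\s*[\"']([^\"']+)[\"']",
                RegexOptions.IgnoreCase);

            var baseUri = new Uri(pageUrl);
            foreach (Match m in matches)
            {
                // Resolve relative links against the page URL.
                var resourceUri = new Uri(baseUri, m.Groups[1].Value);
                string fileName = Path.GetFileName(resourceUri.LocalPath);
                if (string.IsNullOrEmpty(fileName)) continue;

                try { client.DownloadFile(resourceUri, Path.Combine(targetDir, fileName)); }
                catch (WebException) { /* skip resources that fail to download */ }
            }
        }
    }
}
```

Note that this only goes one level deep; a real downloader would also recurse into linked pages and respect the site's structure.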

Please see my past answers for further information:
get specific data from web page,
How to get the data from another site.
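For the offline-viewing part (e.g. in a WinForms WebBrowser control), the saved HTML's resource links also have to be rewritten to point at the local copies. A naive sketch, assuming every resource was saved flat under its bare file name; `RewriteToLocal` is an illustrative name, not a library API:

```csharp
using System.Text.RegularExpressions;

static class OfflineRewriter
{
    // Replaces each quoted href/src URL with just its trailing file name,
    // so the page resolves its resources from the same local folder.
    public static string RewriteToLocal(string html)
    {
        return Regex.Replace(html,
            "((?:href|src)\\s*=\\s*[\"'])([^\"']+)([\"'])",
            m =>
            {
                string url = m.Groups[2].Value;
                int slash = url.LastIndexOf('/');
                string local = slash >= 0 ? url.Substring(slash + 1) : url;
                return m.Groups[1].Value + local + m.Groups[3].Value;
            },
            RegexOptions.IgnoreCase);
    }
}
```

Run the rewritten HTML through the browser control with its working directory set to the download folder, and the CSS/JS should load offline.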

—SA
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


