Hello!

I want to develop a website downloader that can save an entire website to my local drive. I have managed to download the HTML of a page and save it as a .html file, but that is not enough. Can anyone give me an idea of how to complete this task?


Once you have the HTML, parse it for links, keep any links that point to the same site, download each of those pages, and repeat the same procedure for every page you download.
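A minimal sketch of that loop, using WebClient, a queue, and a naive regex to pull out href targets (the start URL, the regex, and the same-host check are illustrative; a real crawler should use an HTML parser and handle errors):

using System;
using System.Collections.Generic;
using System.Net;
using System.Text.RegularExpressions;

class Crawler
{
    static void Main()
    {
        var root = new Uri("http://example.com/");  // start page (placeholder)
        var visited = new HashSet&lt;string&gt;();
        var queue = new Queue&lt;Uri&gt;();
        queue.Enqueue(root);

        using (var client = new WebClient())
        {
            while (queue.Count > 0)
            {
                Uri page = queue.Dequeue();
                if (!visited.Add(page.AbsoluteUri)) continue;  // skip pages already seen

                string html = client.DownloadString(page);
                // ... save html to disk here ...

                // Naive href extraction; an HTML parser is more reliable.
                foreach (Match m in Regex.Matches(html, "href\\s*=\\s*\"([^\"]+)\""))
                {
                    Uri link;
                    if (Uri.TryCreate(page, m.Groups[1].Value, out link)
                        && link.Host == root.Host)  // stay on the same site
                        queue.Enqueue(link);
                }
            }
        }
    }
}

The HashSet prevents re-downloading a page that several pages link to, which is what keeps the loop from running forever on a site with circular links.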

But I am not able to download the CSS and JS files or the images of a page; I only get the inner HTML of the page. Here is my code:

// Requires: using System.IO; using System.Net;
string url = "http://google.com";
string strResult = "";

WebRequest objRequest = WebRequest.Create(url);

using (WebResponse objResponse = objRequest.GetResponse())
using (StreamReader sr = new StreamReader(objResponse.GetResponseStream()))
{
    strResult = sr.ReadToEnd();
}
// The using blocks close and clean up the response and the StreamReader

// Display results to a webpage

I want to save a web page to my local drive, just like the Save As option of a browser does.
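Save As works because the browser also fetches every resource the page references, not just the HTML. You can do the same: after downloading the HTML, scan it for src and href attributes and fetch each file with WebClient.DownloadFile. A rough sketch (the regex and the flat folder layout are illustrative assumptions, not a robust implementation):

using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

class PageSaver
{
    static void SavePage(string url, string folder)
    {
        Directory.CreateDirectory(folder);
        var baseUri = new Uri(url);

        using (var client = new WebClient())
        {
            string html = client.DownloadString(baseUri);
            File.WriteAllText(Path.Combine(folder, "index.html"), html);

            // src="..." covers images and scripts; href="..." covers stylesheets.
            foreach (Match m in Regex.Matches(html, "(?:src|href)\\s*=\\s*\"([^\"]+)\""))
            {
                Uri resource;
                if (!Uri.TryCreate(baseUri, m.Groups[1].Value, out resource))
                    continue;  // skip malformed or unsupported references

                string name = Path.GetFileName(resource.LocalPath);
                if (name.Length == 0) continue;  // skip links to folders or pages

                try { client.DownloadFile(resource, Path.Combine(folder, name)); }
                catch (WebException) { /* resource missing or blocked; skip it */ }
            }
        }
    }
}

Note that this saves everything into one folder, so the relative paths inside the saved HTML may need rewriting to point at the local copies, which is exactly what a browser's Save As does behind the scenes.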


You might want to look at the WebBrowser control and the HtmlDocument class. They will give you access to all the parts of the web page.
