Hello!

I want to develop a website downloader that can download an entire website to my local drive. I managed to download the HTML code of a page and save it as .html, but this is not enough. Can anyone give me an idea of how to complete my task?

Regards

All 3 Replies

Once you have the code, parse it for links and store any links that point to the same site. Then download each of those pages and repeat the same procedure for each one.
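A minimal sketch of that parse-and-follow step in C# (the regex-based link extraction is a simplification for illustration; a real crawler would be better off with an HTML parser such as HtmlAgilityPack, and the class and method names here are just assumptions):

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

class LinkExtractor
{
    // Pull href values out of anchor tags and resolve them against the page URL.
    // Only links on the same host are kept, so the crawl does not leave the site.
    public static List<string> ExtractSameSiteLinks(string html, Uri pageUri)
    {
        var links = new List<string>();
        var hrefPattern = new Regex(
            "<a[^>]+href\\s*=\\s*[\"']([^\"']+)[\"']",
            RegexOptions.IgnoreCase);

        foreach (Match m in hrefPattern.Matches(html))
        {
            Uri absolute;
            // Uri.TryCreate resolves relative links (e.g. "/about.html")
            // against the page that contained them.
            if (Uri.TryCreate(pageUri, m.Groups[1].Value, out absolute)
                && absolute.Host == pageUri.Host)
            {
                links.Add(absolute.AbsoluteUri);
            }
        }
        return links;
    }
}
```

To turn this into the full crawl, keep a `HashSet<string>` of URLs you have already downloaded, download each extracted link that is not in the set, and run the same extraction on every new page until the set stops growing.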

But I am not able to download the CSS, JS files, and images of a page; it just gives me the inner HTML of the page. Here is my code:

string url = "http://google.com";
string strResult = "";

// WebRequest.Create returns the right request type for the URL's scheme.
WebRequest objRequest = WebRequest.Create(url);

// Dispose both the response and the reader when done; no explicit Close() is needed.
using (WebResponse objResponse = objRequest.GetResponse())
using (StreamReader sr = new StreamReader(objResponse.GetResponseStream()))
{
    strResult = sr.ReadToEnd();
}

// Display the downloaded HTML
MessageBox.Show(strResult);

I want to save a webpage to my local drive, just like the Save As option in a browser.
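Save As works by scanning the HTML for the `src`/`href` attributes of resource tags and downloading each referenced file separately. A hedged sketch of that idea, building on the request code above (the regex is deliberately simplified, and the class names, folder layout, and flat file naming are assumptions, not how a browser actually does it):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

class ResourceDownloader
{
    // Find the URLs of images, scripts, and stylesheets referenced by the page.
    public static List<string> ExtractResourceUrls(string html, Uri pageUri)
    {
        var urls = new List<string>();
        // Matches src="..." on <img>/<script> tags and href="..." on <link> tags.
        var pattern = new Regex(
            "<(?:img|script)[^>]+src\\s*=\\s*[\"']([^\"']+)[\"']" +
            "|<link[^>]+href\\s*=\\s*[\"']([^\"']+)[\"']",
            RegexOptions.IgnoreCase);

        foreach (Match m in pattern.Matches(html))
        {
            string raw = m.Groups[1].Success ? m.Groups[1].Value : m.Groups[2].Value;
            Uri absolute;
            if (Uri.TryCreate(pageUri, raw, out absolute))
                urls.Add(absolute.AbsoluteUri);
        }
        return urls;
    }

    // Download every resource into one folder next to the saved .html file.
    public static void SaveResources(string html, Uri pageUri, string folder)
    {
        using (var client = new WebClient())
        {
            foreach (string url in ExtractResourceUrls(html, pageUri))
            {
                string fileName = Path.GetFileName(new Uri(url).LocalPath);
                if (fileName.Length > 0)
                    client.DownloadFile(url, Path.Combine(folder, fileName));
            }
        }
    }
}
```

For the saved copy to render offline you would also rewrite those `src`/`href` attributes in the HTML to point at the local file names before writing the .html file to disk.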

Regards

You might want to look at the WebBrowser control and the HtmlDocument class. They will give you access to all the parts of the webpage.
