Hi everyone,
I'm trying to build a web vulnerability scanner that will test for CGI/RFI/XSS vulnerabilities on a remote web server. Here are my baby steps for this project:
1. Discover pages on the remote site.
(e.g. www.google.com, www.google.com/x.html, google.com/y.html ...)
2. Post some scripts to each link and test the response.
3. Read the source code of the HTML files and search for certain syntax in them.
I need your help on the three topics above. Thank you so much.
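For steps 2 and 3, here is the rough idea I have in mind, sketched in Python (the payload string and function names are just placeholders, not a finished scanner): inject a marker script into a URL's query parameters, then check whether the response echoes it back unescaped.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

# Hypothetical probe payload; a real scanner would try many variations.
PAYLOAD = '<script>alert("xss-probe")</script>'

def inject_payload(url, payload=PAYLOAD):
    """Step 2: return a copy of `url` with every query parameter set to the payload."""
    parts = urlsplit(url)
    params = [(name, payload) for name, _ in parse_qsl(parts.query)]
    return urlunsplit(parts._replace(query=urlencode(params)))

def is_reflected(html, payload=PAYLOAD):
    """Step 3: the page echoed our script back unescaped, so it may be XSS-prone."""
    return payload in html
```

The idea is to fetch `inject_payload(url)` for each discovered link and run `is_reflected` over the response body.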

@mn_kthompson
Thanks for your link.
Do you know a way to discover internal links on a web site, like "/admin.php", "/x.htm", "/y.html"?

That doesn't seem super easy, and I doubt there is a built-in way to do it. I would try a combination of two things.

The first thing I would try is taking some common file names and giving them a try: a kind of smart brute-force attempt. Try looking for admin.php, login.php, etc.
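That brute-force idea could be sketched like this in Python (the wordlist here is a tiny sample I made up; real scanners ship lists with thousands of entries):

```python
import urllib.request
import urllib.error

# Sample wordlist of commonly exposed paths (illustrative only).
COMMON_PATHS = ["admin.php", "login.php", "index.php", "robots.txt", "phpinfo.php"]

def candidate_urls(base, paths=COMMON_PATHS):
    """Build the list of URLs to probe from a base address."""
    base = base.rstrip("/")
    return [f"{base}/{path}" for path in paths]

def exists(url, timeout=5):
    """Return True if the server answers with anything other than a 404."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except urllib.error.HTTPError as err:
        return err.code != 404
    except urllib.error.URLError:
        return False
```

You would then loop over `candidate_urls("http://target.example")` and keep the ones where `exists()` returns True.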

The other thing I would look at doing is checking the links that show up on each page. Pull in the HTML for the target site's main page, then extract every <a href=""> tag on the page. Check whether the anchor leads to a page in the same domain (we don't want to bang away on external links), and if so, run your XSS checks against it.
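Here is one way that link extraction might look using only Python's standard library (class and function names are my own):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag seen in the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def same_domain_links(html, base_url):
    """Return absolute URLs from the page that stay on base_url's host."""
    parser = LinkExtractor()
    parser.feed(html)
    base_host = urlsplit(base_url).netloc
    internal = []
    for href in parser.links:
        absolute = urljoin(base_url, href)  # resolve relative links
        if urlsplit(absolute).netloc == base_host:
            internal.append(absolute)
    return internal
```

Feeding each discovered page back through `same_domain_links` gives you a simple crawler to drive the XSS checks.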

If you're running this from a Linux platform, you might want to look at spawning an instance of wget to make your coding a bit easier. You could use this command to recursively download a web site, going two levels deep:
wget -r -l2 www.yoursite.com
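If you go that route, spawning wget from Python is just a `subprocess` call; a minimal sketch (assuming wget is on the PATH):

```python
import subprocess

def build_wget_cmd(url, depth=2):
    """Mirror the command above: wget -r -l2 <url>."""
    return ["wget", "-r", f"-l{depth}", url]

def mirror_site(url, depth=2):
    """Spawn wget and wait for it; downloaded files land under ./<hostname>/."""
    return subprocess.run(build_wget_cmd(url, depth), capture_output=True, text=True)
```

You can then walk the downloaded directory tree and run your checks against the saved HTML files offline.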

Hope that gives you some ideas to run with.
