We have the valid URLs:
www.daniweb.com/foo
www.daniweb.com/foo/
www.daniweb.com/foo/1
www.daniweb.com/foo/2
www.daniweb.com/foo/3
If I want to disallow them all in robots.txt, are both of these valid and will they do the same thing?
Disallow: /foo
Disallow: /foo/
Will the latter also block the URL www.daniweb.com/foo, or will that URL be treated as a page directly under the root directory rather than inside the foo directory? Conversely, will the former be interpreted as blocking only that single page and not the foo directory?
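One way to sanity-check this yourself is with Python's urllib.robotparser, which uses the same prefix-matching rule that the robots.txt standard describes (real crawlers may add their own extensions, so treat this as a sketch rather than a guarantee of how any particular bot behaves):

```python
from urllib.robotparser import RobotFileParser

# Rule set 1: Disallow: /foo (no trailing slash)
rules_foo = RobotFileParser()
rules_foo.parse([
    "User-agent: *",
    "Disallow: /foo",
])

# Rule set 2: Disallow: /foo/ (with trailing slash)
rules_foo_slash = RobotFileParser()
rules_foo_slash.parse([
    "User-agent: *",
    "Disallow: /foo/",
])

urls = [
    "https://www.daniweb.com/foo",
    "https://www.daniweb.com/foo/",
    "https://www.daniweb.com/foo/1",
]

# Both rules are plain prefix matches on the URL path.
for url in urls:
    print(
        url,
        "foo-rule allows:", rules_foo.can_fetch("*", url),
        "foo/-rule allows:", rules_foo_slash.can_fetch("*", url),
    )
```

Under prefix matching, `Disallow: /foo` blocks /foo, /foo/, and /foo/1 (and, note, also any sibling path like /foobar, since it is purely a string prefix), while `Disallow: /foo/` blocks /foo/ and /foo/1 but leaves the slash-less /foo fetchable. So the two rules are both valid but not equivalent.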