I want to disallow certain query strings with robots.txt, and I want to check that I am doing this correctly...
I am using:
Disallow: /browse.asp?cat=*-*
I want to check that this rule will allow these URLs to be indexed:
browse.asp?cat=123/1234/1234-1
browse.asp?cat=123/1234-1
While disallowing these URLs:
browse.asp?cat=1234-1
browse.asp?cat=1234-2
Will this rule work, or will the wildcard cause it to disallow all of the cat query strings?
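
For reference, this is a minimal Python sketch I have been using to approximate the matching, assuming Googlebot-style wildcard semantics where "*" matches any sequence of characters and rules are anchored at the start of the path (the trailing "$" anchor is not handled here). I am not certain it reflects real crawler behaviour, which is why I am asking:

import re

def rule_matches(rule, path):
    # Split the rule on "*", escape the literal pieces, and rejoin with ".*"
    # so each "*" matches any sequence of characters. re.match anchors the
    # pattern at the start of the path, like a robots.txt prefix rule.
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    return re.match(pattern, path) is not None

rule = "/browse.asp?cat=*-*"

# Test URLs taken from the question (leading "/" added, since robots.txt
# rules are matched against the URL path).
urls = [
    "/browse.asp?cat=123/1234/1234-1",
    "/browse.asp?cat=123/1234-1",
    "/browse.asp?cat=1234-1",
    "/browse.asp?cat=1234-2",
]

for url in urls:
    print(url, "->", "disallowed" if rule_matches(rule, url) else "allowed")
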
Thanks.