I want to disallow certain query strings with robots.txt and I want to check I am doing this correctly...

I am using:

Disallow: /browse.asp?cat=*-*

I want to check that this rule will allow these URLs to be indexed:


While disallowing these URLs:


Will this rule work? Or will the wildcard cause it to disallow all the cat query strings?
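For reference, a Disallow line only takes effect inside a record that begins with a User-agent line, so the complete file would look something like this (the User-agent value is an assumption; adjust it if you only want to target specific crawlers):

```text
User-agent: *
Disallow: /browse.asp?cat=*-*
```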



Wildcards and query strings aren't part of the original robots.txt spec, so behaviour depends entirely on the crawler: some, like Googlebot, document support for `*` in Disallow patterns, but a spec-only crawler will just compare the path as a literal prefix. Don't count on the wildcard working everywhere. Perhaps it can also be done with URL rewrites, but I'm unsure how to get that working (if at all).
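To illustrate the point: Python's `urllib.robotparser` implements only the original spec (literal prefix matching, no wildcard expansion), so you can see what a spec-only crawler would do with this rule. This is a sketch; `example.com` and the sample `cat` values are placeholders, not the asker's real URLs:

```python
import urllib.robotparser

# Rules file matching the one in the question.
robots_txt = """\
User-agent: *
Disallow: /browse.asp?cat=*-*
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A spec-only parser compares the * literally, so a normal
# hyphenated cat value is NOT blocked:
print(rp.can_fetch("*", "http://example.com/browse.asp?cat=1-2"))  # True

# Only a URL containing a literal "*-*" actually matches the rule:
print(rp.can_fetch("*", "http://example.com/browse.asp?cat=*-*"))  # False
```

A crawler that implements Google-style wildcards would instead block any `cat` value containing a hyphen, which is why you can't rely on one behaviour across all bots.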
