0

Hi,

I have a quick question. I have implemented robots.txt to disallow some of my pages. How long will it take to work? Will all my pages drop out of the index at once, or will it happen page by page? I'm confused.

Ideas are welcome

5 Contributors · 4 Replies · 5 Views · 10 Years Discussion Span · Last Post by Candy_ME
0

It depends on your site. Sites that are crawled more frequently will see results faster than sites crawled less frequently. If you want pages removed from the SERPs quickly, robots.txt is not your best option. Use a removal request instead.

0

When the search engine crawls your site, the first file it hits is robots.txt. If it sees a rule disallowing any files, it won't crawl them, and over time those pages should drop out of its index.
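To see how a well-behaved bot interprets those rules, here is a small sketch using Python's standard `urllib.robotparser` module (the domain and paths below are made-up examples, not from this thread):

```python
import urllib.robotparser

# Build a parser from the same rules you would put in robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant crawler checks each URL against the rules before fetching it
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # disallowed
print(rp.can_fetch("*", "https://example.com/index.html"))         # allowed
```

This only tells you whether a bot is allowed to crawl a URL; it doesn't remove anything from an index by itself.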

0

When the bot crawls, that's when it checks what is allowed and what isn't. Make sure you use the right syntax: a leading slash, and a trailing slash for directories, e.g.:

/images/

Something like that when disallowing the bot from crawling a path. =)
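Putting the advice above together, a minimal robots.txt might look like this (the paths here are just illustrative examples — substitute your own):

```
User-agent: *
Disallow: /images/
Disallow: /private/
```

Note that leading and trailing slashes matter: `Disallow: /images/` blocks the directory, while `Disallow: /images` would also match any path starting with "/images".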
