bellasteward
Recommended Answers
Robots.txt is used to guide search engines through your website.
If you want a specific folder or page not to be crawled by a particular search engine, robots.txt can be helpful. Robots.txt can control the access of any search engine throughout your complete website.
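As a rough sketch of what that looks like, a robots.txt file placed at the root of your site might block one folder for a specific crawler (the folder name here is just an example):

```
# Block Googlebot from the /private/ folder only
User-agent: Googlebot
Disallow: /private/

# All other crawlers may access everything
User-agent: *
Disallow:
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.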
Note that some search engines may not crawl certain sites properly unless a robots.txt file is present.
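For that reason, even if you have nothing to block, it can be worth serving a minimal robots.txt that simply allows everything:

```
# Allow all crawlers to access the entire site
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked; this also stops crawlers from logging 404 errors when they request /robots.txt.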
All 5 Replies
Replies by surindersharma, bellasteward, stephaniemcgrat, bellasteward, and MidiMagic.