What is the best strategy for writing a sitemap.xml and robots.txt?

All 3 Replies

You can use automated tools to create your sitemap; there are several available.

It completely depends on what type of website you have. Is it a static site with fewer than 50 pages? Or is it a dynamic site with millions of pages? If there are fewer than 50 pages, I strongly urge you to handwrite your sitemap file. If it's a dynamic site, then your best bet is to write a script that generates your sitemap (database access is a must for determining which pages to index on a dynamic site).
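
If you go the scripted route, here's a minimal sketch of the idea in Python. The `pages` list stands in for whatever database query returns your indexable URLs; the domain, table, and column names in the comment are hypothetical.

```python
from datetime import date
import xml.etree.ElementTree as ET

# Stand-in for a database query; in practice this might come from
# something like: SELECT url, updated_at FROM pages WHERE indexable = 1
# (table and column names are hypothetical).
pages = [
    {"loc": "https://www.example.com/", "lastmod": date(2024, 1, 15)},
    {"loc": "https://www.example.com/about", "lastmod": date(2024, 1, 10)},
]

# Build the <urlset> root with the standard sitemap namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page["loc"]
    ET.SubElement(url, "lastmod").text = page["lastmod"].isoformat()

# Write out sitemap.xml with an XML declaration.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Keep in mind the sitemap protocol caps a single file at 50,000 URLs (and 50MB uncompressed), so a site with millions of pages needs multiple sitemap files tied together by a sitemap index file.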

Once you have a strategy in place, then figure out what pages you want included in the sitemap, what pages to noindex, and what pages to exclude bots from crawling. This is something best left to experienced SEOs. If you're just getting started, we could help you with some ideas, or answer questions you may have, but I urge you to put a lot of thought into what goes here, because you could easily shoot yourself in the foot, and wind up deindexing all your pages from Google.
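
To make that concrete: robots.txt controls crawling, while noindex (a robots meta tag or an X-Robots-Tag header) controls indexing, and mixing the two up is the classic way to shoot yourself in the foot. A minimal robots.txt, with hypothetical paths, might look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: https://www.example.com/sitemap.xml
```

One gotcha worth knowing: if you block a page in robots.txt, Google can never crawl it, so it will never see a noindex tag on that page, and the URL can still linger in the index. If your goal is deindexing, let the page be crawled with a noindex first.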

That's why I'm strongly against using the automated tools that AndreRet suggests. I think hand-crafting your sitemap.xml and robots.txt files is super important, and no line in them should be taken lightly.

commented: You are totally correct, Dani. If the OP had given us more information and some sign of effort, I would have elaborated as well.

Just keep the pages you want indexed in Google in the sitemap and remove the rest. For robots.txt, I stick with a simple file unless there are unnecessary query pages getting indexed.
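
For the query-page case, Google and Bing both support wildcard patterns in robots.txt, so a rule like this (a sketch that blocks crawling of any URL containing a query string, site-wide) can help:

```
User-agent: *
Disallow: /*?
```

Note this only stops future crawling; pages already indexed won't drop out just because you block them, which loops back to Dani's point about planning these files carefully.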
