What is a spider? And how does a spider work?


You don't work a spider.
'spider', in this instance, is another name for a search engine robot, because they crawl the web following links.
Your task is to optimize your site to make access better for the spider when it gets there:
fix broken links
use a consistent file structure
use valid code
add a sitemap
Go to the Google Webmaster Tools page and read Webmaster 101.
Also sign up your site, verify ownership of it, and find out exactly what spiders see when they crawl your site.
That makes the 'optimize' bit above a lot easier.
These standard test beds may assist you:
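For the sitemap mentioned above, a minimal XML sitemap is just a file listing your pages. A sketch, with a placeholder URL and date (substitute your own), looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want spiders to find -->
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2011-01-01</lastmod>
  </url>
</urlset>
```

Save it as sitemap.xml at your site root and submit it through Google Webmaster Tools.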

Speed: http://analyze.websiteoptimization.com/authenticate.php?url=http://www.yoursite.com&/

HTML: http://validator.w3.org/check?uri=http%3A%2F%2Fwww.yoursite.com&charset=%28detect+automatically%29&doctype=Inline&group=0

CSS2: http://jigsaw.w3.org/css-validator/validator?uri=http%3A%2F%2Fwww.yoursite.com&profile=css21&usermedium=all&warning=1&lang=en
CSS3: http://jigsaw.w3.org/css-validator/validator?uri=http%3A%2F%2Fwww.yoursite.com&profile=css3&usermedium=all&warning=1&lang=en

Handheld: http://demo.opera-mini.net/demo.html?www.yoursite.com

Other browsers: http://www.browsershots.org
Many problems (if present) will show up.
Serious code errors reported by the W3C validator sites produce blank screens in Browsershots.

Valid code does not ensure the site will work in all browser/OS combinations.
Invalid code ensures the site will not work in all browser/OS combinations.

Not all layouts work in handheld devices; keep layouts strictly code-based.
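To make "crawl the web following links" concrete, here is a toy sketch of the traversal a spider performs: fetch a page, collect its links, queue the ones it hasn't seen, repeat. It is illustrative only (not how any real search engine works), uses just the Python standard library, and crawls a tiny made-up in-memory "web" instead of the network:

```python
# Toy sketch of a spider's link-following loop -- illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl: fetch a page, queue its unseen links, repeat.
    `fetch` is any callable returning HTML for a URL, so the traversal
    can be demonstrated without network access."""
    seen = {start_url}
    queue = deque([start_url])
    visited = []
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        visited.append(url)
        parser = LinkParser()
        parser.feed(fetch(url))
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return visited

# Tiny made-up in-memory "web" to demonstrate the traversal:
pages = {
    "http://example.com/":  '<a href="/a">A</a> <a href="/b">B</a>',
    "http://example.com/a": '<a href="/b">B</a>',
    "http://example.com/b": '',
}
print(crawl("http://example.com/", lambda u: pages.get(u, "")))
# -> ['http://example.com/', 'http://example.com/a', 'http://example.com/b']
```

A real spider would add politeness delays, respect robots.txt, and fetch over HTTP, but the follow-the-links core is the same.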

A spider is a program that accesses web pages; it's also called a webcrawler. The spider builds a list of words, encodes the data, builds an index, and stores the data for the user.
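The "build a list of words, build an index" step can be sketched as a toy inverted index: map each word to the set of pages containing it. This is illustrative only (the URLs and text are made up, and real search engines do far more):

```python
# Toy sketch of the indexing step after a spider fetches pages.
import re
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of URLs whose text contains it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            index[word].add(url)
    return index

# Made-up pages for demonstration:
docs = {
    "http://example.com/1": "Spiders crawl the web",
    "http://example.com/2": "The web is full of links",
}
index = build_index(docs)
print(sorted(index["web"]))  # both pages contain "web"
```

Looking a word up in the index is then instant, which is why search engines index pages ahead of time instead of crawling at query time.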

A spider is a bot: a program which follows links and, through those links, collects meaningful text and updates from the web.
It is used to index data with relevancy in the search engine's repository.

A spider crawls a website and can report the number of pages on the server, the number of pages indexed by Google, link popularity, Alexa rank, and a summary of search engine ranking stats. It is also known as a robot or bot.

It is just like bot software, and it is used in determining a website's PageRank.

Crawling, done by crawlers or spiders, is the process by which Googlebot discovers new and updated pages to be added to the Google index.
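You can also steer that discovery: a robots.txt file at your site root tells spiders like Googlebot which paths they may fetch. A minimal example (the path is a placeholder):

```
User-agent: *
Disallow: /private/
Sitemap: http://www.yoursite.com/sitemap.xml
```

`User-agent: *` applies the rule to all spiders; well-behaved crawlers check this file before fetching anything else.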

Spiders are used to feed pages to search engines. It's called a spider because it crawls over the Web. Another term for these programs is webcrawler.

A spider is a software program used by all the search engines; it crawls all the information about a website, such as content, links, title tags, and much more.


Spider is another term for the search engine crawler. It is a kind of software which follows links and content throughout the internet and stores them in the search engine's database.


Spiders are the crawlers of search engines; they are also called bots, and they index web pages.

A spider, in the SEO world, is basically an automated program that crawls and accesses sites in order to index them in the search engine's databases.

Be a part of the DaniWeb community

We're a friendly, industry-focused community of developers, IT pros, digital marketers, and technology enthusiasts meeting, networking, learning, and sharing knowledge.