I would like to maximize the PR on my forum display and thread display pages, without sacrificing PR to less important pages (for instance, the newthread.php page).

Now suppose there are 5 links on page A. Normally, page A's PR will be spread out among those 5 pages. But now suppose I added a robots.txt file which blocked indexing of two of the links. Would page A's PR then be spread less thin, among only 3 pages? Or would it be spread just as thin, with 2 of the pages entitled to a share of PR that they simply never use?
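As a toy illustration of the question, here is the classic simplified PageRank model, where each outbound link carries an equal share PR(A)/C(A) of the linking page's PR (a sketch of the textbook formula only, not how Google actually computes things):

```python
# Toy sketch of the simplified PageRank model: each outbound link
# carries an equal share of the linking page's PR.
def pr_share(page_pr: float, outlink_count: int) -> float:
    """PR passed along each link under the classic PR(A)/C(A) model."""
    return page_pr / outlink_count

# Page A with PR 1.0 and 5 links: each link carries 0.2.
five_link_share = pr_share(1.0, 5)

# If blocking 2 of the URLs really removed them from the calculation,
# each of the remaining 3 links would carry a larger share.
three_link_share = pr_share(1.0, 3)

print(five_link_share, three_link_share)  # 0.2 vs roughly 0.333
```

Whether robots.txt exclusions actually change the divisor in this way is exactly the open question in this thread.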

I hope this sorta made a bit of sense.

Re: Using a robots.txt to block links

I sent you a PM RE: this topic. :)

Re: Using a robots.txt to block links

And I started this thread RE: your PM :)

Re: Using a robots.txt to block links

Ok, I am a knucklehead. I guess this was a chicken-and-egg thing. ;-)

Re: Using a robots.txt to block links

The robots.txt file I'm currently using is in my root directory (not my forum root) and looks like this:

User-agent: googlebot
Disallow: /techtalkforums/announcement.php
Disallow: /techtalkforums/faq.php
Disallow: /techtalkforums/forumdisplay.php
Disallow: /techtalkforums/login.php
Disallow: /techtalkforums/member.php
Disallow: /techtalkforums/newreply.php
Disallow: /techtalkforums/newthread.php
Disallow: /techtalkforums/online.php
Disallow: /techtalkforums/printthread.php
Disallow: /techtalkforums/search.php
Disallow: /techtalkforums/showthread.php

I am disallowing access to showthread.php and forumdisplay.php because I would rather Google only spider the .html mod_rewrite versions of the forums and threads, and therefore not pick up duplicate content. Was this done correctly? Am I excluding the right things?
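For what it's worth, you can sanity-check which URLs a robots.txt file blocks with Python's standard urllib.robotparser. This sketch feeds it a trimmed copy of two of the rules above (the example.com hostname is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# A trimmed copy of the robots.txt rules, fed to the parser directly.
rules = """\
User-agent: googlebot
Disallow: /techtalkforums/forumdisplay.php
Disallow: /techtalkforums/showthread.php
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The dynamic .php URL is blocked for googlebot...
print(rp.can_fetch("googlebot", "http://example.com/techtalkforums/showthread.php?t=10"))  # False
# ...while the rewritten .html version stays crawlable.
print(rp.can_fetch("googlebot", "http://example.com/techtalkforums/thread10.html"))  # True
```

Disallow rules are prefix matches, so the query-string variants of showthread.php are covered too.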

Re: Using a robots.txt to block links

Without going too far into it, it looks decent to me. I am not sure, however, whether robots.txt blocks the weakening of link pop from all the links. The fewer links on a page, the more potent each link is; a page with tons of links spreads the pop thin. That would be a good question to ask SEO-Guy.

Re: Using a robots.txt to block links

It would be very nice if robots.txt blocked the weakening spread of PR. However, even if it doesn't, it would still be valuable because it eliminates spidering of duplicate content (i.e. showthread.php?t=10 and thread10.html).

Re: Using a robots.txt to block links

It would be very nice if robots.txt blocked the weakening spread of PR. However, even if it doesn't, it would still be valuable because it eliminates spidering of duplicate content (i.e. showthread.php?t=10 and thread10.html).

*nods* for sure.

Re: Using a robots.txt to block links

I don't see how this will help at all.

Re: Using a robots.txt to block links

Google frowns upon multiple pages with the same content. For example, if two different URLs serve the exact same content, Google considers it spamming their search engine. This forum uses Apache's mod_rewrite to rewrite URLs to have a .html extension for search engine purposes. Therefore, the page showthread.php?t=100 is the exact same thing as thread100.html. If Google's spiders see this duplicate content, they will think that daniweb.com is trying to inflate its page count in Google by having multiple URLs with the same content. However, by using robots.txt to block Google from spidering the showthread.php pages, Google only spiders the pages ending in .html, and therefore doesn't penalize us for duplicate content.

Re: Using a robots.txt to block links

May I ask how you changed it to thread6988.html instead of showthread.php?

Re: Using a robots.txt to block links

May I ask how you changed it to thread6988.html instead of showthread.php?

It's done using a technique called URL rewriting.

On this server, the page thread6988.html does not physically exist. Instead, the web server monitors incoming URL requests and looks for the word "thread" in each request. If it finds it, it grabs the number that follows and passes it along to showthread.php. Easy enough.
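In Apache terms, that kind of rewrite is typically a one-liner in an .htaccess file. This is a sketch of the usual vBulletin-style rule, not necessarily the exact one this server uses:

```apache
# Hypothetical .htaccess sketch: map thread6988.html to showthread.php?t=6988.
RewriteEngine On
# Capture the digits after "thread" and hand them to showthread.php as ?t=...
RewriteRule ^thread([0-9]+)\.html$ showthread.php?t=$1 [L]
```

The [L] flag stops further rewrite processing once the rule matches.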

Hope this helps.

Re: Using a robots.txt to block links

Hi csgal,
can you post the new robots.txt here, please?

Thank you.

Re: Using a robots.txt to block links

Would it make sense to use the robots nofollow tag in your particular case?

Re: Using a robots.txt to block links

Yes, I need the robots.txt from DaniWeb for my vBulletin forum: http://www.schachfeld.de/

Where can I find the robots.txt?

Re: Using a robots.txt to block links

Can you send me the robots.txt to Matzefn1@web.de?
Thank you very much.

Matzefn1

Re: Using a robots.txt to block links

Can I have the robots.txt, please?

Re: Using a robots.txt to block links

Post #5 shows the robots.txt file that I used to use. I no longer use a robots.txt file.

Re: Using a robots.txt to block links

Post #5 shows the robots.txt file that I used to use. I no longer use a robots.txt file.

Why do you no longer use a robots.txt file? Didn't you say Google frowns upon multiple pages with the same content?

My robots.txt file: http://www.schachfeld.de/robots.txt

Re: Using a robots.txt to block links

We had a problem where pages covered by a no-crawl rule in the root directory were still being crawled (they were PDFs that had valuable IP in them).

We discovered that the bots were getting in through links on other pages of ours (the PDFs are "samples" of products that we use as marketing tools), so we put nofollow meta tags -- <meta name="robots" content="index,nofollow" /> -- on those pages.

This lets the spider index the page but not follow the links on it.

But if someone links to the non-HTML thread URL from a page that you don't control, do you think that will bypass your HTML rewrite?

Re: Using a robots.txt to block links

Sometimes, even after you add the noindex tag to your pages, it can take some weeks before search engines register what you mean.

Re: Using a robots.txt to block links

Aren't robots files bad for tracking purposes, like a nofollow? I guess I'm new to this kind of discussion... lol.
