Okay, to be more specific: login pages to admin sections of a site.
Any other admin type pages, say for a CMS.
Members' areas that might include a set of tutorials, PDFs, or private info of some sort.
Or are you asking which method to use?
AFAIK, neither will stop robots determined to sweep your entire site - only the well-behaved ones. But then again, you're talking SEO, not protection, right? Or the other way around?
Your cgi-bin obviously, login pages, form handlers, includes folders, images etc etc. I *usually* set a robots.txt file to disallow quite a few complete folders. HOWEVER, this file is publicly viewable, so the paths of your hidden files/folders can be discovered from it. Using the robots meta tag (noindex, nofollow) is another 'useful' addition.
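Something like this, as a rough sketch (the folder names here are just examples - use your own):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /includes/
Disallow: /images/
```

And the meta tag version, placed in the <head> of any page you don't want indexed:

```
<meta name="robots" content="noindex, nofollow">
```

Remember though - both of these are requests, not enforcement. A badly-behaved bot will ignore them, which is why the actual protection has to come from the server side.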
Ensure that all 'hidden' files are protected so that they cannot be accessed without a secure login, e.g. bump the robot/user to a default page if they try to access them without authorisation.
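On Apache, one way to do that is basic auth plus an ErrorDocument in an .htaccess file for the protected folder - a minimal sketch, assuming a members folder and a password file at a path of your choosing (both hypothetical here):

```
# .htaccess in the protected folder (Apache)
AuthType Basic
AuthName "Members Area"
AuthUserFile /home/example/.htpasswd
Require valid-user

# Send unauthorised requests to a default page instead of the bare 401
ErrorDocument 401 /index.html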