In practice this is a bad idea: it's too fragile. Someone may publish a link to your files on their site. The URL may turn up in a publicly accessible log file of a user's proxy server, or it may show up in someone else's web server log as a Referer.
Remember: /robots.txt is not intended for access control. It's a "No Entry" sign, not a locked door.
If you have files on your web site that you don't want unauthorized people to access, configure your server to require authentication, and set up appropriate authorization. Basic Authentication has been around since the early days of the web, and in Apache on UNIX it is trivial to configure with an .htaccess file.
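As a minimal sketch of what that looks like in Apache (assuming Apache 2.4, `AllowOverride AuthConfig` enabled for the directory, and a password file you have already created with the `htpasswd` utility; the `/etc/apache2/.htpasswd` path and user name are illustrative), a per-directory `.htaccess` could contain:

```apache
# Require HTTP Basic Authentication for everything in this directory.
# The password file path below is an assumed example; create it with:
#   htpasswd -c /etc/apache2/.htpasswd someuser
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user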
Most content management systems also support access controls on individual pages and on collections of resources.