I have a very strange problem.

I have a Python script that uses multi-threading. It works fine and does exactly what it should.
I use several modules that interact with other files. All file handles are closed as soon as they are no longer needed.

When I run this script locally on my OS X server, it runs fine, no problems.

However, when I try to SSH into my box from another box and execute the script, it runs fine for a few minutes.

Then it dies with this error:
IOError: [Errno 24] Too many open files

I know my script is juggling several tempfile objects, but it can't be more than 12 at a time.

What do you think is causing this? When I SSH into my box I use the same login as when I am physically at it.
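One way to confirm whether descriptors really are leaking is to count them while the script runs. A minimal sketch, assuming an OS X or Linux box where /dev/fd lists this process's open descriptors:

```python
import os

# /dev/fd exists on OS X (and points at /proc/self/fd on Linux);
# each entry is one open file descriptor of the current process.
fds = os.listdir('/dev/fd')
print(len(fds), "file descriptors currently open")
```

If you print this periodically from the script and the number keeps climbing past your expected ~12 tempfiles, something is not being closed.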

First, you could make sure that all files are properly closed by using with statements everywhere:

with open(...) as fh:
    ... # do something with fh
# now we're certain that fh is closed

Otherwise, the SSH session may start with a lower limit on open file descriptors than your interactive login gets; compare ulimit -n in both sessions. Google gives many results for "Too many open files" :)
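You can also check (and raise) the per-process descriptor limit from inside the script with the standard resource module. A sketch; the target of 4096 is just an arbitrary value I picked for illustration:

```python
import resource

# Soft and hard limits on open file descriptors for this process.
# An SSH session can inherit a lower soft limit than an interactive
# login, which would explain the Errno 24 only appearing over SSH.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft limit:", soft, "hard limit:", hard)

# Raising the soft limit up to the hard limit needs no privileges.
if soft != resource.RLIM_INFINITY and soft < 4096:
    new_soft = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
```

If the soft limit reported over SSH is small (a few hundred), that, combined with threads each holding a few descriptors, could plausibly hit the ceiling after a few minutes.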

Edit: also note that tempfile.TemporaryFile() can be used in a with statement:

with TemporaryFile(...) as fh:
    ... # do something with fh
# fh is closed (and the temporary file removed) here