My question is hopefully particular enough not to overlap with any of the others I've read. I want to use subprocess and multiprocessing to spawn a bunch of jobs serially and return their return codes to me. The problem is that I don't want to call wait(), so that I can spawn the jobs all at once, but I do want to know when each one finishes so I can get its return code. I'm having this weird problem where if I poll() the process it won't run. It just sits in Activity Monitor without running (I'm on a Mac). I thought I could use a watcher thread, but now I'm hanging on out_q.get(), which leads me to believe that maybe I'm filling up the buffer and deadlocking. I'm not sure how to get around this. This is basically what my code looks like. If anyone has a better idea of how to do this, I'd be happy to completely change my approach.
from subprocess import Popen, PIPE
from multiprocessing import Process, Queue
import threading

def watchJob(p1, out_q):
    while p1.poll() is None:
        pass
    print "Job is done"
    out_q.put(p1.returncode)

def runJob(out_q):
    p1 = Popen(['../../bin/jobexe', 'job_to_run'], stdout=PIPE)
    t = threading.Thread(target=watchJob, args=(p1, out_q))
    t.start()

out_q = Queue()
outlst = []
proc = Process(target=runJob, args=(out_q,))
proc.start()
outlst.append(out_q.get())  # This hangs indefinitely
proc.join()
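For what it's worth, here is a stripped-down, self-contained sketch of the pattern I'm after, in case it helps clarify the question. A trivial `python -c` child stands in for `../../bin/jobexe` (just so the snippet runs anywhere), each watcher thread blocks in wait() instead of spinning on poll(), and I've dropped stdout=PIPE here so an unread pipe buffer can't fill up and deadlock anything:

```python
import subprocess
import sys
import threading
try:
    from queue import Queue   # Python 3
except ImportError:
    from Queue import Queue   # Python 2

def watch_job(proc, out_q):
    # wait() blocks only this watcher thread until the child exits,
    # so there's no busy polling and the main thread stays free.
    out_q.put(proc.wait())

def run_jobs(cmds):
    out_q = Queue()
    threads = []
    for cmd in cmds:
        # No stdout=PIPE: output goes straight through, so the child
        # can never stall on a full, unread pipe buffer.
        p = subprocess.Popen(cmd)
        t = threading.Thread(target=watch_job, args=(p, out_q))
        t.start()
        threads.append(t)
    for t in threads:
        t.join()
    # One return code per job, in completion order.
    return [out_q.get() for _ in cmds]

# Placeholder children standing in for '../../bin/jobexe job_to_run'.
cmds = [[sys.executable, '-c', 'import sys; sys.exit(0)'] for _ in range(3)]
codes = run_jobs(cmds)
```

This is just the shape of what I'm trying to achieve, not working code for my real jobs: the real binary path and arguments are the ones in the snippet above.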