I have written a script that checks whether a line has been added to a file.
Whenever a line gets added to the file, a thread is created to process that line. If a certain condition holds for the line that was read, the thread should sleep for 1h and then process something else.
Now my problem is that whenever that condition holds and time.sleep(3600) is executed, all the other threads are "blocked" as well, i.e. they do not process their read data.
Here are the relevant code fragments:
    def fullAnalysis():
        i = 0
        while 1:
            fp = file("logger.txt", 'a+')
            hasFileChanged = u.testFileUpdate()
            if hasFileChanged:
                data = u.getFileData()
                i = i + 1
                print data
                queueLock.acquire()
                alertQueue.put(data)
                queueLock.release()
                # Create new threads
                thread = ProcessThread(i, "thread_" + str(i), alertQueue)
                thread.start()
                threads.append(thread)
            if alertQueue.empty():
                # Wait for all threads to complete
                for t in threads:
                    t.join()
                print "Exiting Main Thread"
            fp.close()
    class ProcessThread(threading.Thread):
        def __init__(self, threadID, name, q):
            threading.Thread.__init__(self)
            self.threadID = threadID
            self.name = name
            self.q = q

        def run(self):
            print "Starting " + self.name
            process_data(self.name, self.q)
            print "Exiting " + self.name

    def process_data(threadName, q):
        #while not exitFlag:
        queueLock.acquire()
        if not alertQueue.empty():
            data = q.get()
            queueLock.release()
            print "%s processing %s" % (threadName, data)
            fp.write("\nqueue size: " + str(q.qsize()))
            time.sleep(3600)
            u.getNewDataFromAlertFile()
        else:
            queueLock.release()
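For what it's worth, time.sleep only suspends the thread that calls it, so the sleep alone should not stall the others; a minimal sketch demonstrating this (Python 3 syntax, names are made up for the demo):

```python
import threading
import time

finish_order = []

def worker(name, delay):
    # time.sleep() suspends only this thread
    time.sleep(delay)
    finish_order.append(name)

t_long = threading.Thread(target=worker, args=("long", 1.0))
t_short = threading.Thread(target=worker, args=("short", 0.1))
t_long.start()
t_short.start()
t_long.join()
t_short.join()
print(finish_order)  # the short sleeper finishes first, despite starting second
```

If the short sleeper finishes first here, the stall in my script presumably comes from elsewhere, e.g. the t.join() loop in fullAnalysis making the main thread wait on a sleeping worker before it reads any new lines.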
Is there a way to avoid this "blocking" of the different threads?
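One pattern I have been considering (a sketch only, Python 3 syntax, all names hypothetical): a fixed pool of long-lived workers pulling from a Queue, so the main loop only enqueues new lines and never joins mid-run; a sleep then delays only the worker that executes it:

```python
import queue
import threading
import time

task_queue = queue.Queue()
processed = []

def worker():
    # Each worker loops forever; q.get() blocks only this thread.
    while True:
        data = task_queue.get()
        if data is None:              # sentinel: time to shut down
            task_queue.task_done()
            break
        time.sleep(0.1)               # stand-in for the 1-hour sleep
        processed.append(data)        # stand-in for the real processing step
        task_queue.task_done()

# Start a fixed pool of workers instead of one new thread per line.
workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()

# The main loop just enqueues new lines; Queue handles its own locking.
for line in ["line1", "line2", "line3"]:
    task_queue.put(line)

task_queue.join()                     # wait for the items, not the threads
for _ in workers:                     # then shut the pool down
    task_queue.put(None)
for w in workers:
    w.join()
print(sorted(processed))
```

Note that queue.Queue is already thread-safe, so the separate queueLock in my code above would be unnecessary with this approach.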
Thanks in advance.