So, I wrote a little script to convert TIFFs into JPEGs. The TIFFs are about 600 MB each; I reduce the size and then convert. Here is the code in its entirety:
    ## First we import the image library, Image object, and garbage
    ## collector, as well as os
    from PIL import Image
    import gc, os

    ## Get the path for the directory to be converted, then change to that path
    print "Please enter the path to the files that you wish to convert."
    path = raw_input()
    os.chdir(path)

    ## For each file in every subdirectory, see if it's a tif file
    for root, dirs, files in os.walk(path):
        for name in files:
            if name[-4:] == ".tif":
                print 'Opening ' + name
                os.chdir(path)
                im = Image.open(root + '/' + name)
                x, y = im.size

                ## Resize the tif file at a 2/3 scale. Make a new directory
                ## to mimic the file hierarchy of the original file, only on
                ## the C drive instead of wherever it was to begin with.
                print 'Resizing'
                im2 = im.resize((int(x*.66), int(y*.66)), Image.ANTIALIAS)
                n = 'c' + root[1:]
                savedfile = n + "/jpegs/"
                try:
                    os.makedirs(savedfile)
                    os.chdir(savedfile)
                except WindowsError:
                    os.chdir(savedfile)
                savedfile = name[:-4] + ".jpg"

                ## Save the file as a jpg, with a high quality
                print 'Saving'
                im2.save(savedfile, quality=85)
                del im
                del im2

                ## Force a garbage collection pass. Otherwise memory will
                ## get cluttered up, very quickly too.
                gc.collect()
                print 'Memory Wiped'
Due to space limitations in the workplace, I need the script to copy the directory structure onto the C: drive of the machine it runs on, hence the funky directory dance. I'm still fairly new to all of this, and I thought the garbage collector would help me out. It does wipe memory at first, but things still accumulate: after converting about 30-40 images (and I need to convert literally tens of thousands), the program crashed with a MemoryError. Page file usage was at about 2 GB. The PC has about 3 GB of RAM and runs XP. Any tips or advice?
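One thing worth trying, independent of the memory question, is to stop calling os.chdir inside the loop and instead compute the destination as an absolute path. The drive-letter swap can be isolated into a small helper like the sketch below (dest_path and out_drive are made-up names for illustration; this keeps the same "'c' + root[1:]" trick as the original script, it just returns paths instead of changing directories):

```python
import os

## Hypothetical helper: computes where a converted file should go,
## mirroring the original "'c' + root[1:]" drive-letter swap, but as
## an absolute path so the loop never has to os.chdir() anywhere.
def dest_path(root, name, out_drive="c"):
    mirrored_root = out_drive + root[1:]    # "d:/scans" -> "c:/scans"
    out_dir = mirrored_root + "/jpegs"
    base = os.path.splitext(name)[0]        # "page_001.tif" -> "page_001"
    return out_dir, out_dir + "/" + base + ".jpg"
```

The loop would then call `out_dir, out_file = dest_path(root, name)`, create `out_dir` if it doesn't exist, and pass `out_file` straight to `im2.save()`. That removes the chdir juggling entirely; it may not cure the MemoryError on its own, but it makes the save path predictable while you chase the leak.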