The script I have been working on cleans up FTP images from a bunch of cameras and does this every day. It deletes directories older than 14 days. The issue I am running across is that even though there are typically 15k images in each directory, it takes WAY too long to accomplish this. I have tried the following commands and it continues to get bogged down....

fname1 would be the name of the directory.

shutil.rmtree(fname1)

or

cmd_string = "rmdir /s /q %s" % fname1
os.system(cmd_string)
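
(One caveat with this version: the %s substitution will break on a path that contains spaces. A list-style call through subprocess avoids the shell-quoting problem; a minimal sketch, assuming Python 2 on Windows:)

import subprocess

# rmdir is a cmd.exe built-in, so it has to run through "cmd /c";
# passing the arguments as a list avoids quoting problems with spaces.
subprocess.call(["cmd", "/c", "rmdir", "/s", "/q", fname1])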

The storage device is a 2 TB RAID array on Windows Server 2008 R2. Memory usage is at 47%, and the cache file has been capped at 4 GB (before I capped it, it would take up all the memory). Is there a more efficient way to accomplish this task?
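
One technique often reported to be faster than shutil.rmtree on NTFS for directories holding tens of thousands of files is mirroring an empty folder over the target with robocopy (which ships with Server 2008 R2) and then removing the empty shell. A minimal sketch of that idea; fast_rmtree is just an illustrative name, not anything from this thread:

import os
import subprocess
import tempfile

def fast_rmtree(target):
    # Mirror an empty source directory over the target: /MIR makes the
    # target identical to the (empty) source, i.e. it deletes everything.
    # /NFL /NDL /NJH /NJS suppress per-file logging, which also helps.
    empty = tempfile.mkdtemp()
    try:
        subprocess.call(["robocopy", empty, target,
                         "/MIR", "/NFL", "/NDL", "/NJH", "/NJS"])
        os.rmdir(target)  # target is now empty; remove the shell
    finally:
        os.rmdir(empty)   # clean up the throwaway source dir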

All 2 Replies

How do you find the directory names? You are issuing the command on the top-level old directory only, not on individual files, aren't you? fname1 is a misleading name for a path name.

import os, shutil, datetime

fm = "%Y-%m-%d"  # assumed format of the directory names, e.g. 2013-05-14
current_time = datetime.datetime.now()
del_time = current_time - datetime.timedelta(days=14)

fname = raw_input('Please Enter Folder Name: ')
os.chdir(fname)

for fname1 in os.listdir("."):
    if os.path.isdir(fname1):
        dir_date = datetime.datetime.strptime(fname1, fm)
        if dir_date <= del_time:
            shutil.rmtree(fname1)  # delete the whole date folder at once

Here is the way I am implementing the code. I left the other checks, logging, and timing commands out.
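
Since the timing commands were left out above, here is one minimal way they might look (just a sketch, using fname1 and shutil from the loop above; the real logging code isn't shown in the thread):

import time

start = time.time()
shutil.rmtree(fname1)
print "%s deleted in %.1f seconds" % (fname1, time.time() - start)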
