
The script I have been working on cleans up FTP images from a bunch of cameras and runs every day. It deletes directories older than 14 days. The issue I am running across is that even though there are typically 15k images in each directory, it takes WAY too long to accomplish this. I have tried the following commands and it continues to get bogged down....

fname1 would be the name of the directory.

shutil.rmtree(fname1)

or

cmd_string = "rmdir /s /q %s" % fname1
os.system(cmd_string)

The storage device is a 2 TB RAID array on Windows Server 2008 R2. The memory usage is at 47% and the cache file has been capped at 4 GB (before I capped it, it would take up all the memory). Is there a more efficient way to accomplish this task?
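As a hedged sketch of the usual advice here (the names below are illustrative, not from the original script): deleting 15k files is dominated by per-file filesystem overhead, so a single `shutil.rmtree` call per dated directory is about as fast as Python gets; the one common snag on Windows is read-only files, which an `onerror` handler can clear and retry.

```python
import os
import shutil
import stat
import tempfile

def handle_remove_readonly(func, path, exc_info):
    # On Windows, a read-only attribute makes rmtree fail;
    # clear the bit and retry the failed operation.
    os.chmod(path, stat.S_IWRITE)
    func(path)

def delete_tree(path):
    # One recursive delete per dated directory; per-file I/O is the
    # real cost, so avoid any extra per-file work in Python.
    shutil.rmtree(path, onerror=handle_remove_readonly)

# Smoke test on a throwaway tree.
root = tempfile.mkdtemp()
sub = os.path.join(root, "cam01")
os.mkdir(sub)
open(os.path.join(sub, "img0001.jpg"), "w").close()
delete_tree(root)
```

If this is still slow, the bottleneck is almost certainly the RAID array or NTFS metadata updates rather than the Python code itself.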


How do you find the directory names? Are you issuing the command only for the highest-level old directory, not for individual files? Also, fname1 is a misleading name for a path.
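A minimal sketch of the point above, with hypothetical names and an assumed directory-name date format: select the old top-level dated directories once, build full paths with `os.path.join` instead of relying on the current working directory, and issue a single delete per directory.

```python
import datetime
import os
import shutil
import tempfile

FMT = "%Y-%m-%d"  # assumed directory-name format; adjust to the real one

def old_dirs(root, days=14, now=None):
    """Yield full paths of top-level subdirectories older than `days`."""
    now = now or datetime.datetime.now()
    cutoff = now - datetime.timedelta(days=days)
    for name in os.listdir(root):
        path = os.path.join(root, name)  # full path, not just the name
        if not os.path.isdir(path):
            continue
        try:
            stamp = datetime.datetime.strptime(name, FMT)
        except ValueError:
            continue  # skip directories that are not date-named
        if stamp <= cutoff:
            yield path  # one rmtree per directory, not per file

# Demo on a throwaway tree.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, "2012-01-01"))  # old: should be selected
os.mkdir(os.path.join(root, "not-a-date"))  # ignored
found = list(old_dirs(root, days=14, now=datetime.datetime(2012, 6, 1)))
for path in found:
    shutil.rmtree(path)
```

Working with full paths also means the script never has to `os.chdir` into the camera folder at all.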


import os, shutil, datetime

fname = raw_input('Please Enter Folder Name: ')
os.chdir(fname)

fm = '%Y-%m-%d'  # date format of the directory names (placeholder; set elsewhere in the full script)
current_time = datetime.datetime.now()

for fname1 in os.listdir('.'):
    if os.path.isdir(fname1):
        Dir_date = datetime.datetime.strptime(fname1, fm)
        Del_time = current_time - datetime.timedelta(days=14)
        if Dir_date <= Del_time:
            shutil.rmtree(fname1)  # delete the whole dated folder

Here is how I am implementing it. I left the other checks, logging, and timing commands out.
