I have a lot of backup files with names like Corn-Curl-Server_Backup_2002-02-23.tar.gz,
which breaks down as [hostname]_[comment]_[date of the backup].tar.gz.
It is easy to scan with a human eye and spot the backups older than 28 days so they can be deleted, but the files are spread across separate folders under separate users on the NAS. I can look at any one folder and delete the files whose encoded date is older than my N value.
I thought I could use the last-accessed file date, but those dates don't always match the creation dates because the files have been moved around, both in batches and individually.
I want to parse the date out of each filename, and if the date is older than N days, move the file to colder storage on another volume.
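Assuming the date always appears as YYYY-MM-DD right before the .tar.gz extension, the whole sweep (walk the folders, parse the date, compare, move) might look something like this sketch. The root and destination paths are placeholders for your real NAS layout; everything used here is stdlib:

```python
import datetime
import os
import re
import shutil

# Hypothetical paths -- adjust for the real NAS layout.
BACKUP_ROOT = "/mnt/nas/backups"
COLD_STORAGE = "/mnt/cold/backups"
STALE_DAYS = 28

# Matches the trailing YYYY-MM-DD in names like
# Corn-Curl-Server_Backup_2002-02-23.tar.gz
DATE_RE = re.compile(r"_(\d{4})-(\d{2})-(\d{2})\.tar\.gz$")

def sweep(root=BACKUP_ROOT, dest=COLD_STORAGE, stale_days=STALE_DAYS):
    cutoff = datetime.timedelta(days=stale_days)
    today = datetime.date.today()
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            m = DATE_RE.search(name)
            if m is None:
                continue  # not a dated backup file, leave it alone
            y, mo, d = (int(g) for g in m.groups())
            if today - datetime.date(y, mo, d) >= cutoff:
                shutil.move(os.path.join(dirpath, name),
                            os.path.join(dest, name))
```

Anchoring the regex on the extension means stray files in the same folders are simply skipped rather than crashing the script.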

I have a couple of ways to get the timedelta to work. What I am most concerned with is taking a filename like the one above, getting the date out, and turning it into the y, m, d arguments for datetime.date.

	import datetime

	stale_date = 28
	y = 2012
	m = 1
	d = 23

	# e.g. datetime.date.today() -> datetime.date(2012, 2, 14)
	dd = datetime.date.today() - datetime.date(y, m, d)
	print(dd)

	# e.g. dd -> datetime.timedelta(31)
	if dd < datetime.timedelta(days=stale_date):
	    print("Young")
	else:
	    print("Stale")
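For the parsing step itself, one way (assuming the date stamp is always the last underscore-separated chunk of the name) is to split on "_", strip the extension, and split the stamp on "-" to get the three datetime.date arguments. A minimal sketch:

```python
import datetime

def backup_date(filename):
    """Parse the trailing YYYY-MM-DD out of a backup filename."""
    # 'Corn-Curl-Server_Backup_2002-02-23.tar.gz' -> '2002-02-23'
    stamp = filename.rsplit("_", 1)[1].split(".", 1)[0]
    y, m, d = (int(part) for part in stamp.split("-"))
    return datetime.date(y, m, d)

name = "Corn-Curl-Server_Backup_2002-02-23.tar.gz"
dd = datetime.date.today() - backup_date(name)
print("Stale" if dd >= datetime.timedelta(days=28) else "Young")
```

Using rsplit from the right means hostnames containing underscores won't break the parse, since only the final chunk is treated as the date.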