Greetings everyone,

I was looking for some help in creating a seemingly simple backup script.

I have two servers, laid out in the same format:
Server A: the directories contain all the iterations of a certain type of file
Server B: the directories contain only the most recent file.
The directories are named the same on each server.

What I want the script to do is look in the directories of Server A, and if the latest file is newer than the file in Server B, then delete the file on Server B and cp the new one over.

I keep getting bogged down in it though with bulky code, and was wondering if anyone had a simpler solution.

Hi madtorahtut!

Have you tried rsync? Rsync does exactly what you're describing, with a very simple command line. Here's an example... If you want to have /data/ on Server A sync'd with /data/ on Server B (deleted files are deleted, changed files are changed, etc...) you would do something like this:

## I'm running this on Server A, the source, and supplying
## Server B as the remote destination

rsync -av --delete-after /data/ user@serverB:/data/

Rsync has a lot of possibilities. You can do incremental backups, or you can use your backup script to keep multiple, dated copies of the data (if disk space allows).
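To illustrate the dated-copies idea, here's a rough local sketch (the paths and demo file are made up; an rsync to a remote destination would follow the same pattern):

```shell
#!/bin/sh

## Sketch: keep a dated copy of a source directory.
## "src" and "destroot" are demo paths -- swap in your real ones.
src="${TMPDIR:-/tmp}/data"
destroot="${TMPDIR:-/tmp}/backups"

mkdir -p "$src"
echo "sample" > "$src/file.txt"      # demo content so the copy isn't empty

## One dated directory per run, e.g. .../data-2024-01-15
dest="$destroot/data-`date +%F`"
mkdir -p "$dest"
cp -a "$src/." "$dest/"              # or: rsync -a "$src/" "$dest/"

echo "Backed up $src to $dest"
```

Run something like that from cron once a day and you get one dated snapshot per day; you'd just need an occasional cleanup pass to keep disk usage in check.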

I hope this helps!

Thanks Gromit!

Rsync looks like it would make my life a lot easier. Just out of curiosity, do you know a way for it to copy only the newest file from the directories on Server A to Server B?

That depends on what you mean by "newest"... Rsync actually checks the local and remote file and ONLY sends the file over if it's been updated.

That way you aren't sending the entire directory every time, only the files that have changed since the last rsync.

Ah, sorry, I should have specified. I guess an example is in order.

In Server A, the directory listing would be like this:


Then, on ServerB, the directory should only have one file (the newest created) in each subdirectory:


That is the part I am having trouble with.

Oh! So you only want a copy of the latest file in each directory to be backed up. In that case, rsync probably isn't what you want. It'll be a little more complicated than that.

If the files are named with that date format, you'll have trouble sorting them properly based on that, so you'll probably want to use the file creation time instead. You can use 'ls -t' to sort the directory listing and grab the most recent one. I'd recommend trying it a few times to make sure it gives you the result you're looking for.
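For example (the demo directory and file names here are invented), picking off the newest file with 'ls -t' looks like this:

```shell
#!/bin/sh

## Pick the most recently modified file in a directory with 'ls -t'
## (which sorts newest first).  The demo directory and files are made up.
dir="${TMPDIR:-/tmp}/lstest"
mkdir -p "$dir"

touch -t 202001010000 "$dir/old.img"   # backdated file
touch "$dir/new.img"                   # most recent

newest=`ls -t "$dir" | head -n 1`
echo "Newest file: $newest"
```

One caveat: 'ls -t' sorts by modification time, not creation time, and parsing ls output breaks on unusual filenames, so keep the naming simple.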

Then there's what to do on the remote system... Do you want to simply delete what's there first, and then replace it with the most recent file, or do you want to transfer the new files first, and then determine what needs to be deleted?

For the transfer, you'll probably want to use something like rsync or 'scp', but ftp could work as well. I tend to use scp with key-based authentication so that it can be automated without requiring you to enter a password every time.

You could also mount the remote filesystem with NFS or sshfs so that it can be treated like a local filesystem. That might be easier in this case, since you may have to script the removal of the OLD backup files.
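For the cleanup side, here's a sketch of removing everything except the newest file in a directory (demo files invented; you'd point it at the mounted backup directory instead):

```shell
#!/bin/sh

## Sketch: delete all but the newest file in a directory.
## The demo directory and files below are invented.
dir="${TMPDIR:-/tmp}/cleantest"
mkdir -p "$dir"
touch -t 202001010000 "$dir/a.img"
touch -t 202101010000 "$dir/b.img"
touch "$dir/c.img"                     # newest -- should survive

## 'ls -t' lists newest first; 'tail -n +2' skips it,
## leaving only the older files to delete.
for old in `ls -t "$dir" | tail -n +2`; do
	rm -- "$dir/$old"
done

ls "$dir"
```

This relies on filenames without whitespace; if your names are fancier than that you'd want a more careful approach.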

So, I got the code to do pretty much what I wanted. I was wondering if you would be kind enough to take a look at it and tell me if there could be any improvements I could make?

Just a note: in this code I have it echoing what will be done instead of actually doing it (it was still in the testing process, heh).


## Top-level paths -- placeholders here, since the real values
## weren't shown; point these at your actual directories.
imageDir="/path/to/images"
backupDir="/path/to/backup"

dirName=""
newestImg=""
backupImg=""

for f in "$imageDir"/*; do
	dirName=`basename "$f"`
	if [ -d "$backupDir/$dirName" ]; then

		echo "Current directory: $dirName"
		newestImg=`ls -t1 "$f" | head -n1`
		echo "Newest Image: $newestImg"

		backupImg=`ls -t1 "$backupDir/$dirName" | head -n1`
		echo "Newest Backup Image: $backupImg"

		## -nt: true if the left-hand file is newer than the right
		if [ "$f/$newestImg" -nt "$backupDir/$dirName/$backupImg" ]; then
			echo "$newestImg is newer, $backupImg will be replaced"
		fi
	fi
done
exit 0

Hi madtorahtut!

Sorry to leave you hanging... Did you try the script? How'd it work out?