I currently have a small script that takes an incremental backup of a folder, and it looks similar to this:

#!/bin/bash

cp -ruf /home/folder4/subfolder/ /home/folder5/
cp -ruf /home/folder3/subfolder/ /home/folder4/
cp -ruf /home/folder2/subfolder/ /home/folder3/
cp -ruf /home/folder1/subfolder/ /home/folder2/

Now this runs fine, only updating the files that have been modified and not asking me whether I want to overwrite each individual file. There's only one problem I can't figure out a solution to.

Basically, I want this to remove any files which are not present in a newer version. For example:

In /home/folder1/subfolder/ I start off with 3 files:

file1
file2
file3

I run the script and now /home/folder2/subfolder/ contains the same data as /home/folder1/subfolder/. If I then remove file2 and add file4 and file5 to /home/folder1/subfolder/ the directory listing would be:

file1
file3
file4
file5

If I run the script at this point, the new files are copied over, and if file1 and file3 were modified they are copied too, but file2 is not removed from /home/folder2/subfolder/, which is what I want to happen.

I know that I could have the script remove /home/folder2/subfolder/ and then copy /home/folder1/subfolder/ over fresh, but that would copy every file rather than just the updated/new ones, which I don't want, especially when there are a lot of files to copy over.

Besides removing the whole folder and its contents, is there a way to remove the files that aren't present in the latest copy? Maybe something that compares the two folders and, if one has extra files, removes them.

rsync!

Do something like rsync -av --delete-after /home/folder1/subfolder/ /home/folder2/subfolder . I *think* that will do what you're looking for!
