I'm trying to set up an automatic hotcopy backup of our Subversion repository; the Subversion server runs SuSE 9.3 Linux and mounts the backup target, which I believe runs the same.

The automation script is a simple shell script; it's supposed to mount the backup, run the hotcopy, and unmount. However, svnadmin hotcopy errors out if the target directory already contains anything. I'm trying to set up a weekly rotation, but that still leaves a problem after the first week.
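For reference, the script boils down to something like this (the paths here are placeholders, not our real ones, and the mount assumes an fstab entry):

```
#!/bin/sh
# Rough sketch of the weekly rotation; REPO and MOUNTPOINT are placeholders.

REPO=/srv/svn/repo                        # where the repository lives
MOUNTPOINT=/mnt/svnbackup                 # fstab entry assumed for this mount
TARGET="$MOUNTPOINT/hotcopy-$(date +%a)"  # one copy per weekday, e.g. hotcopy-Mon

mount "$MOUNTPOINT" || exit 1

# svnadmin hotcopy refuses a non-empty target, so clear last week's copy first
rm -rf "$TARGET"

svnadmin hotcopy "$REPO" "$TARGET"

umount "$MOUNTPOINT"
```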

I've tried putting in an rm command with the -r flag, but there are two files (mode -r--r--r--, owned by 'nobody') that rm chokes on, even with the -f flag added. Does anyone know of a way I can force these files to delete?
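The cleanup step is basically the rm below; I'm wondering whether I need a chmod first, or whether it's a permissions issue on the mount itself (path is a placeholder):

```
# Make everything under last week's copy writable, then remove it
chmod -R u+w /mnt/svnbackup/hotcopy-Mon
rm -rf /mnt/svnbackup/hotcopy-Mon

# If the backup share is NFS, could the 'nobody' ownership mean root is
# being squashed on the client, so even -f can't remove the files?
```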

All 2 Replies

Why not use incremental dumps after the first hotcopy and save some time?
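Something along these lines, tracking the last dumped revision in a small state file (paths and the state file name are just examples):

```
#!/bin/sh
# Sketch only; REPO, DUMPDIR and the last_rev state file are placeholders.

REPO=/srv/svn/repo
DUMPDIR=/mnt/svnbackup/dumps
STATE="$DUMPDIR/last_rev"

HEAD=$(svnlook youngest "$REPO")             # newest revision in the repository
LAST=$(cat "$STATE" 2>/dev/null || echo -1)  # last revision already dumped
FROM=$((LAST + 1))

if [ "$FROM" -le "$HEAD" ]; then
    # dump only the revisions added since the previous run
    svnadmin dump "$REPO" --incremental -r "$FROM:$HEAD" \
        | gzip > "$DUMPDIR/repo-$FROM-$HEAD.dump.gz"
    echo "$HEAD" > "$STATE"
fi
```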

I'd use tar instead of copying, and create filenames by date.
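Roughly like this (paths are examples): hotcopy into a fresh scratch directory so the non-empty-target problem never comes up, then archive it under a date-stamped name.

```
#!/bin/sh
# Sketch of the tar approach; paths are placeholders.

REPO=/srv/svn/repo
SCRATCH=/tmp/svn-hotcopy.$$     # fresh directory, so hotcopy never complains
DEST=/mnt/svnbackup

svnadmin hotcopy "$REPO" "$SCRATCH"

# date-stamped archive, e.g. svn-backup-YYYYMMDD.tar.gz
tar czf "$DEST/svn-backup-$(date +%Y%m%d).tar.gz" -C /tmp "$(basename "$SCRATCH")"

rm -rf "$SCRATCH"
```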
