I recently discovered that my music directory and the microSD card in my handheld media player (iBasso DX50) are out of sync, and I have no idea how that could have happened. There are songs in my music directory that are not on the SD card, as well as a few songs on the SD card that are not in my music library. I've always used rsync, either directly or through the program luckyBackup, to keep everything in sync, so again I don't know how this could have happened. Note that …
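A quick way to see exactly which files differ between the two locations (before worrying about why) is to compare sorted file lists from each side. A minimal sketch; the `/tmp` paths are hypothetical stand-ins for the real library and the card's mount point:

```shell
# Demo stand-ins for the music library and the card mount (hypothetical paths)
mkdir -p /tmp/lib /tmp/card
touch /tmp/lib/a.mp3 /tmp/lib/b.mp3 /tmp/card/b.mp3 /tmp/card/c.mp3

# Sorted relative file lists for each side
list_files() { (cd "$1" && find . -type f | sort); }
list_files /tmp/lib  > /tmp/lib.txt
list_files /tmp/card > /tmp/card.txt

comm -23 /tmp/lib.txt /tmp/card.txt   # only in the library: ./a.mp3
comm -13 /tmp/lib.txt /tmp/card.txt   # only on the card:    ./c.mp3
```

`comm` on the two sorted lists gives the set differences directly, which is a fast sanity check regardless of which tool caused the drift.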


I've been using rsync and/or luckyBackup for a while now to sync my desktop's music library to the microSD card in my digital music player. Today I noticed that the player wasn't showing one of my albums, so I started investigating and found that it's not the player's fault; it's more likely rsync, but I don't understand how or why. After syncing, I checked the SD card manually, and sure enough the album is still not being synced. On my computer all of my music is stored at `/home/me/media/music/*` inside of this last …


After running rsync with the `-a --delete source dest` arguments to sync my music library to my SD card, I am getting the following errors:

[CODE]default_perms_for_dir: sys_acl_get_file(Tears For Fears/Donnie Darko, ACL_TYPE_DEFAULT): No such file or directory, falling back on umask
rsync: mkstemp "/media/garrett/MUSIC-SD/Tears For Fears/Donnie Darko/.Mad World.mp3.yr2Xyc" failed: No such file or directory (2)
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1183) [sender=3.1.0][/CODE]

I don't really understand either of these two errors. It's syncing from source to destination, so how can it say 'No such file or directory'? If it's not there wouldn't it …


So I recently had a catastrophe using the Mega cloud service with my Linux Mint desktop, and my whole home directory got wiped out. Everything was deleted about an hour into syncing my 151 GB home directory to the cloud, so needless to say most of my data didn't get synced. When it got deleted from the cloud, the cloud client then also deleted it all from my desktop, so I lost everything. Upon further investigation I was able to find a small amount of data that did get synced to the cloud in the …


I'm soliciting advice on performance improvements for creating a weak checksum for file segments (used in the rsync algorithm). Here's what I have so far:

[CODE]def blockchecksums(instream, blocksize=4096):
    from hashlib import md5
    weakhashes = []
    stronghashes = []
    # read the binary stream in fixed-size chunks; b"" is the EOF sentinel
    for chunk in iter(lambda: instream.read(blocksize), b""):
        a = b = 0
        l = len(chunk)
        # rolling-style weak checksum: a is the byte sum, b weights each byte
        # by its distance from the end of the chunk
        for n, i in enumerate(chunk):
            a += i
            b += (l - n) * i
        weakhashes.append((b << 16) | a)
        stronghashes.append(md5(chunk).hexdigest())
    return weakhashes, stronghashes[/CODE]

I haven't had any luck speeding things up using itertools or using C functions (like any()).


I currently have an Ubuntu 12.04 LTS box serving as a network repository for daily workstation backups for our HQ office. Throughout the day the Win7 workstations use Cobian Backup 11 to back up their user folders to their respective samba shares on this server. At night this server rsyncs all of the samba folders' contents to an offsite server in another branch. This setup is working well, with one exception: the backups aren't encrypted. I don't like the idea of having the whole office's unencrypted documents gathered together in one place (technically, two places). Cobian has an option for encrypting …
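One way to get encryption at rest without touching the Cobian side is to encrypt a nightly archive of the shares before it goes offsite. A sketch using tar piped into openssl for symmetric encryption; all the paths (demo share, key file, output) are hypothetical stand-ins under `/tmp`:

```shell
# Hypothetical demo paths standing in for the share root, key file, and archive
SHARE=/tmp/demo_share; KEY=/tmp/backup.key; OUT=/tmp/shares.tar.gz.enc
mkdir -p "$SHARE/docs" && echo "payroll" > "$SHARE/docs/q1.txt"
head -c 32 /dev/urandom | base64 > "$KEY"   # stand-in passphrase

# Encrypt: tar the share and pipe straight into openssl, so nothing
# unencrypted ever touches the destination disk
tar -cz -C "$SHARE" . | openssl enc -aes-256-cbc -pbkdf2 -pass "file:$KEY" -out "$OUT"

# Restore: decrypt and untar
mkdir -p /tmp/restore
openssl enc -d -aes-256-cbc -pbkdf2 -pass "file:$KEY" -in "$OUT" | tar -xz -C /tmp/restore
```

The resulting `.enc` file can be rsynced offsite as-is; the tradeoff versus per-file encryption is that rsync can no longer do efficient delta transfers on a single large archive.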


Hi, I have set up mirroring between my two Linux servers using rsync. On the first server I'm using this command:

[CODE]rsync -avzt -e ssh /home/ root@secondserver:/home/[/CODE]

On the second server, this:

[CODE]rsync -avzt -e ssh /home/ root@firstserver:/home/[/CODE]

The problem is that when I update a file on the second server and then run the command on the first server, rsync overwrites the newer file on the second server with the old copy from the first server. What do I have to do to stop it from overwriting the updated file?
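If the goal is "never overwrite a file that is newer on the receiving side", rsync's `-u` / `--update` flag does that, e.g. `rsync -avztu -e ssh /home/ root@secondserver:/home/`. A local sketch demonstrating the behaviour with hypothetical `/tmp` directories in place of the two servers:

```shell
# Local demonstration of --update (-u): the newer file on the receiving side survives
mkdir -p /tmp/u_src /tmp/u_dst
echo "old copy"   > /tmp/u_src/notes.txt
echo "newer edit" > /tmp/u_dst/notes.txt
touch -d '2020-01-01' /tmp/u_src/notes.txt   # make the sending copy older

rsync -au /tmp/u_src/ /tmp/u_dst/
cat /tmp/u_dst/notes.txt    # still "newer edit"
```

Note that `-u` compares modification times only; two-way mirroring with rsync alone is fragile, and a purpose-built bidirectional tool (such as unison) is often the safer design.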


A friend of mine made this script to back up files from all my Ubuntu boxes, using lftp to an FTP server I set up on my local Windows 7 machine using Apache and Ability FTP Server. However, it is only backing up some config files (see attached screen); none of the actual directories are being backed up. I can't figure out whether the problem lies in the script or elsewhere.

[CODE]#!/bin/bash
FTPUSER=sanders
FTPPASS=law123
SERVERIP="192.168.1.37"
DATE=`date +%D`

for i in `ls /home/`
do
    echo "mirror -rR /home/$i $i" | lftp -u $FTPUSER,$FTPPASS $SERVERIP
    echo "User $i Backed Up"
done[/CODE]
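One thing worth checking: in lftp's `mirror` command, `-R` means reverse (upload), but `-r` means `--no-recursion`, which would upload only the top-level files of each home directory and skip every subdirectory, matching the symptom described. A sketch of the loop without `-r`, restructured as a function that prints the mirror commands (pipe each printed line to `lftp -u "$FTPUSER,$FTPPASS" "$SERVERIP"`, as in the original script, to actually run it):

```shell
# Print the lftp mirror command for each user directory under a base dir.
backup_cmds() {
    local base=$1
    for d in "$base"/*/ ; do
        [ -d "$d" ] || continue
        local u
        u=$(basename "$d")
        # -R = reverse mirror (upload); deliberately NOT -r, which in lftp
        # means --no-recursion and skips all subdirectories
        echo "mirror -R $base/$u $u"
    done
}
backup_cmds /home
```

Globbing `"$base"/*/` also avoids parsing `ls` output, which breaks on names containing spaces.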


So I'm creating a backup GUI in Python which basically asks the user for their username/password and source directory so it can be rsynced over to a remote server. The only trouble I'm coming across is passing the password (entered in the GUI) to the server after I execute the command: [CODE]rsync -options source_path rsync_user@rsync_server:remote_path[/CODE] Since I want the user to authenticate every time they use the GUI, I don't want to set up an automated ssh key session. I looked a bit into Pexpect and Paramiko, but expect doesn't seem very secure and I wasn't sure how to configure Paramiko …
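For what it's worth, rsync only accepts a supplied password natively over its own daemon protocol: `--password-file` (or the `RSYNC_PASSWORD` environment variable) works with `rsync://` daemon-mode destinations, not with ssh transport. For ssh transport without keys, a wrapper like `sshpass` is a common workaround. Both are sketched below with placeholder hosts and module names; neither fragment is runnable as-is:

```shell
# Daemon transport: --password-file is an rsync option, but only for rsync:// URLs
rsync -av --password-file=/tmp/pw src/ rsync://rsync_user@rsync_server/module/remote_path

# ssh transport: sshpass (separate package) feeds the GUI-supplied password
sshpass -p "$PASSWORD" rsync -av -e ssh src/ rsync_user@rsync_server:remote_path
```

From Python, either form can be launched with `subprocess`, keeping the password out of the command line via the password file or the environment rather than `-p`.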


Hello, I want to sync my home directory with another server's home directory. Is there any way to do this without storing a key file or using a password? I know rsync is an option, but I want to know if there is a better one.
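Short of a passwordless key (which means storing a key file), the usual compromise is a passphrase-protected ssh key held in ssh-agent: you unlock the key once per login session, and every rsync-over-ssh run after that authenticates through the agent with no prompt. A sketch with placeholder host and key path; not runnable as-is, since it assumes an existing key and reachable server:

```shell
# Start an agent for this session and add the key (prompts for the passphrase once)
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519

# Subsequent syncs authenticate through the agent with no further prompts
rsync -avz -e ssh ~/ user@otherserver:/home/user/
```

This still keeps a key file on the client, but the key itself is encrypted at rest, which is generally considered safer than either a plaintext key or typed passwords.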

