Well, lately there have been some threads about problems with Dropbox clients and the like, and this makes me wonder...

I myself question the need to have an application and some external service just to sync my stuff. It seems like overkill; why do things the hard way when the same (or better) functionality can be achieved with a simple shell script?

I have been syncing my data up&down with this kind of setup. This is easy to implement, portable, robust and beautiful.

1. You need to install a few niceties first. This script uses rsync with a crontab trigger, so get those from rzr's Harmattan repository.
(I don't know about the Fremantle repositories, but I believe cron and rsync are there too...)
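
If the packages are in a repository you have already enabled, installing them is the usual apt-get routine. A rough sketch, assuming the packages are simply named rsync and cron (check what rzr's repository actually calls them):
Code:
# run as root on the device; package names may differ in your repository
apt-get update
apt-get install rsync cron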

2. You need to create an RSA key pair and install the public key into the cloudserver user's authorized_keys, just the standard stuff.
I recommend creating a dummy user just for sync purposes...
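
In case a refresher helps, the key generation and installation go roughly like this (assuming the OpenSSH tools are on the device; "syncuser" is just a placeholder for that dummy account):
Code:
# on the device, as root; an empty passphrase lets cron run unattended
mkdir -p /root/.ssh
ssh-keygen -t rsa -f /root/.ssh/id_rsa -N ""

# append the public key to the sync user's authorized_keys on the server
cat /root/.ssh/id_rsa.pub | ssh syncuser@dns.name.of.your.cloudserver \
  'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'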

3. Here is the cloudsync.sh script; put it in "/root", for example:
Code:
#!/bin/sh

G_LOGTAG="cloudsync"
G_LOCKFILE="/var/lock/cloudsync.pid"
G_USERNAME="your_cloudserver_username_goes_here"
G_DNSNAME="dns.name.of.your.cloudserver"
G_LOCALNAME="server.local.ip.address"
G_OWNIP="ip.given.to.me"
G_UPLOAD_DIRS="/home/user/MyDocs/DCIM /home/user/MyDocs/.backups /home/user/MyDocs/AptBackup /home/user/MyDocs/meeTrainer_workouts /home/user/MyDocs/.CallRecorder /home/user/MyDocs/Pictures"

# a simple lockfile prevents two instances running at the same time
if [ -e "$G_LOCKFILE" ]; then
  logger -t "$G_LOGTAG" "cannot get lock $G_LOCKFILE, exiting"
  exit 1
fi

echo "$$" > "$G_LOCKFILE"

## check the wlan state to see if we are in the home network
if /sbin/ifconfig wlan0 2>/dev/null | grep -q "$G_OWNIP"; then
  G_CLOUDHOST=$G_LOCALNAME
  logger -t "$G_LOGTAG" "Using WLAN connection to server $G_CLOUDHOST"
else
  G_CLOUDHOST=$G_DNSNAME
  logger -t "$G_LOGTAG" "Using GPRS connection to server $G_CLOUDHOST"
fi

# push the listed directories into the sync user's home directory on the server
G_SYNC=$(/usr/bin/rsync -av $G_UPLOAD_DIRS "$G_USERNAME@$G_CLOUDHOST:")
G_RET=$?

logger -t "$G_LOGTAG" "$G_SYNC"

if [ "$G_RET" -eq "0" ]; then
  logger -t "$G_LOGTAG" "Upload transfer succeeded"
else
  logger -t "$G_LOGTAG" "Upload transfer failed (error code $G_RET)"
fi

# pull whatever the server has in its "upload" directory down to Downloads
G_SYNC=$(/usr/bin/rsync -rptv "$G_USERNAME@$G_CLOUDHOST:upload/" /home/user/MyDocs/Downloads/)
G_RET=$?

logger -t "$G_LOGTAG" "$G_SYNC"

if [ "$G_RET" -eq "0" ]; then                 
  logger -t "$G_LOGTAG" "Download tranfer succeeded"
else
  logger -t "$G_LOGTAG" "Download transfer failed (error code $G_RET)"
fi

rm -f "$G_LOCKFILE"

exit 0
4. Edit the variables in the script to your liking:
Code:
G_USERNAME="your_cloudserver_username_goes_here"
G_DNSNAME="dns.name.of.your.cloudserver"
G_LOCALNAME="server.local.ip.address"
G_OWNIP="ip.given.to.me"
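
Once the variables are set it does not hurt to make the script executable and run it once by hand, to check that the ssh and rsync parts actually work:
Code:
chmod +x /root/cloudsync.sh
/root/cloudsync.sh
# everything is logged through logger under the "cloudsync" tag,
# so look for those lines in your syslog output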
5. Put something like this in your crontab:
Code:
~ # 
~ # crontab -l

*/30  * * * * /root/cloudsync.sh > /dev/null 2>&1

~ #
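
If you are wondering how to get that line in place, feeding a file to crontab should work with busybox cron as well as the usual vixie cron (crontab -e works too if you have an editor set up); note that this replaces the user's whole crontab:
Code:
echo '*/30  * * * * /root/cloudsync.sh > /dev/null 2>&1' > /tmp/root.cron
crontab /tmp/root.cron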


DESCRIPTION:

So, for the people who bothered to read to the end of this posting:

This script can be modified to sync any desired directories. This example syncs my:
  • DCIM
  • .backups
  • AptBackup
  • meeTrainer_workouts
  • .CallRecorder
  • Pictures
folders, but that's just my preference; anything goes.

The folder "upload" on the cloudserver is synced to "Downloads" in the user's MyDocs, and the same applies: anything goes.
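
The pull direction does expect that "upload" directory to exist on the server, so create it once if it is not there already (same placeholder names as in the script):
Code:
ssh your_cloudserver_username_goes_here@dns.name.of.your.cloudserver 'mkdir -p upload'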

The script detects whether you are on your home network, and uses a direct connection to the cloudserver if so. Of course that works this way for me because I have my servers on the same subnet as my WLAN; you might want to modify that part to match your own network configuration...
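
If your DHCP does not always hand the device the same address, matching just the home subnet prefix instead of a full IP works too. A sketch (192.168.1. is of course only an assumption about your subnet):
Code:
# put the home subnet prefix into G_OWNIP instead of a full address, e.g.
G_OWNIP="192\.168\.1\."
/sbin/ifconfig wlan0 | grep -q "$G_OWNIP" && echo "looks like home WLAN"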

The crontab entry launches cloudsync every 30 minutes, and additionally it can be launched directly at any given moment. The script creates a lockfile in /var/lock to prevent double invocation.

Security related thingies:
  • You will need the private RSA key in root's .ssh/ on the device, which means that if you lose your device the key can be considered compromised. (but of course you have set the root password to something other than "rootme", haven't you?)
  • For this reason I recommend creating a separate user for backup purposes on the cloudserver. Additionally you might want to run that user in a chroot jail, or forward the ssh connection to a virtual machine or separate HW; see the small authorized_keys sketch below for one more easy restriction.
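
One more cheap layer: OpenSSH lets you attach restrictions to the key itself in the server-side authorized_keys, so the key cannot be used for port or agent forwarding and gets no pty; rsync does not need any of those. A sketch with standard OpenSSH options (the key blob and comment are of course your own):
Code:
# in the sync user's ~/.ssh/authorized_keys on the server, all on one line:
no-pty,no-agent-forwarding,no-port-forwarding,no-X11-forwarding ssh-rsa AAAA...your_key_here... root@device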
 
