Koozali.org: home of the SME Server

Using rclone for making back-ups

holck
Using rclone for making back-ups
« on: March 30, 2018, 12:51:28 AM »
As described here: https://wiki.contribs.org/Rclone, you can use rclone to copy your local files to the cloud or vice versa. This can also be useful for making remote back-ups.

Using dar, I make back-ups to a local USB drive: incremental back-ups daily, and a full back-up once a week. In this way I can always go 7 days back in time. I've done a little scripting to be able to use rclone in a similar way.

In order to use the script, you must first download and install rclone. Then you must run
Code: [Select]
rclone config
to set up the remote.
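
Once the config is done, it is worth checking that the remote actually works before wiring it into a script. A quick sanity check, assuming you named the remote "amazonS3" as in the script below:
Code: [Select]
rclone listremotes     # should show the remote you just created, e.g. amazonS3:
rclone lsd amazonS3:   # list the buckets / top-level directories to confirm access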

Here is my first attempt, using Amazon S3:
Code: [Select]
#!/bin/bash

SERVER=$(hostname -s)
REMOTE="amazonS3:${SERVER}"
FILTER="filter.txt"
VERBOSE="-v"                # Add more v's for more verbosity
VERSIONS=7                  # Number of backup versions to keep

# Purge oldest version
VERSION=$VERSIONS
rclone purge ${REMOTE}/old${VERSION} ${VERBOSE}

# Make room for a new backup version
while [ $VERSION -gt 1 ]; do
  (( OLDV=$VERSION-1 ))
  rclone move ${REMOTE}/old${OLDV} ${REMOTE}/old${VERSION} ${VERBOSE}
  (( VERSION = $OLDV ))
done

# Perform backup
rclone sync / ${REMOTE}/recent --backup-dir ${REMOTE}/old1 --filter-from ${FILTER} ${VERBOSE} --skip-links

The script is meant to be run daily. Each time an rclone synchronization runs, new versions of the local files are copied to the remote, and files that no longer exist locally are also deleted on the remote. This is why I use the --backup-dir option: before changing or deleting a remote file, rclone first moves it into the backup-dir "old1".
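
A minimal sketch of what --backup-dir does (remote name and paths are illustrative, not part of my script):
Code: [Select]
# Suppose remote:recent already holds yesterday's copy of /data. Then:
rclone sync /data remote:recent --backup-dir remote:old1
# - new and changed local files are uploaded to remote:recent
# - the previous remote versions of changed files, and files that were
#   deleted locally, are moved into remote:old1 instead of being lost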

Before the synchronization runs, the script makes room for a new "old1" directory: deleting old7, moving old6 to old7, old5 to old6, ..., and old1 to old2.

In order to use the script, you must also make a filter file. It may look like this:
Code: [Select]
# Standard backup set

+ /home/e-smith/**
+ /etc/e-smith/templates-custom/**
+ /etc/e-smith/templates-user-custom/**
+ /etc/ssh/**
+ /root/**
+ /etc/sudoers
+ /etc/passwd
+ /etc/shadow
+ /etc/group
+ /etc/gshadow
+ /etc/samba/secrets.tdb
+ /etc/samba/smbpasswd
- *
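
If you want to see what the filter actually selects before trusting it with a backup, you can rehearse the run first; a quick check, assuming the file is saved as filter.txt and the remote layout from my script:
Code: [Select]
# List the local files the filter would select, without transferring anything
rclone ls / --filter-from filter.txt --skip-links

# Or rehearse the full sync without changing anything on the remote
rclone sync / amazonS3:myserver/recent --filter-from filter.txt --skip-links --dry-run -v
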
« Last Edit: March 30, 2018, 11:38:55 AM by holck »

brianr
Re: Using rclone for making back-ups
« Reply #1 on: March 31, 2018, 08:50:27 AM »
This looks very promising, I am trying it on my server - thanks holck.

I've added a link to the forum post to the wiki.
Brian j Read
(retired, for a second time, still got 2 installations though)
The instrument I am playing is my favourite Melodeon.

Jean-Philippe Pialasse (aka Unnilennium)
Re: Using rclone for making back-ups
« Reply #2 on: April 01, 2018, 04:10:22 PM »
I am looking at this too, thanks for pointing to rclone.

I was searching for a way to integrate the base SME Backup (console/manager) with the cloud.

This is an important element to get it working. Next we need to pull together the way to use encryption; it seems pretty straightforward with rclone.

It would be great to be able to set up the cloud remote using the manager rather than the CLI.

Finally, I feel that an rsync-type backup, file by file rather than archive-based, could be more efficient for remote backup.

Considering the amount of data and the time to transfer, a disaster restore would be easier from a local backup, but the remote backup could be a last resort, or something to put over the locally restored, slightly outdated data. This means only a few MB/GB would need to be transferred back.

This could also be practical for retrieving a few files a user deleted.
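
A sketch of that strategy using holck's layout above (remote name and paths illustrative): restore the bulk from the local backup first, then overlay only what the cloud copy has newer:
Code: [Select]
# 1. Restore the bulk of the data from the (slightly outdated) local backup,
#    e.g. with dar from the USB drive.
# 2. Overlay from the cloud only what changed since that backup was made;
#    --update skips files that are already newer on the local side.
rclone copy amazonS3:myserver/recent / --filter-from filter.txt --update -v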

brianr
Re: Using rclone for making back-ups
« Reply #3 on: April 01, 2018, 06:13:12 PM »
Quote from: Jean-Philippe Pialasse on April 01, 2018, 04:10:22 PM
Considering the amount of data and the time to transfer, a disaster restore would be easier from a local backup, but the remote backup could be a last resort, or something to put over the locally restored, slightly outdated data. This means only a few MB/GB would need to be transferred back.

This could also be practical for retrieving a few files a user deleted.

The big downside is the time taken to create the initial full archive. I tried one of my servers with about 450 GB of data and it took over 2 days to create it. In fact, I stopped it before the end (which I regret now); I'd like to have seen how long the incremental backup then took.

Compression would speed up the process. rsync provides compression just between the client and server to speed up the actual transfer, but I see no sign of that in rclone, and sending pre-compressed files would destroy the incremental facility (?).


Jean-Philippe Pialasse
Re: Using rclone for making back-ups
« Reply #4 on: April 01, 2018, 06:30:13 PM »
Quote from: brianr on April 01, 2018, 06:13:12 PM
The big downside is the time taken to create the initial full archive. I tried one of my servers with about 450 GB of data and it took over 2 days to create it. In fact, I stopped it before the end (which I regret now); I'd like to have seen how long the incremental backup then took.

Compression would speed up the process. rsync provides compression just between the client and server to speed up the actual transfer, but I see no sign of that in rclone, and sending pre-compressed files would destroy the incremental facility (?).

That is indeed the drawback; I am thinking especially of OVH Hubic with its 10 Mbit/s limitation.

Hence, an incremental initial backup will not be faster, but at least you can stop and start it whenever you want, which is useful if, like me, you have a monthly bandwidth cap with unmetered traffic during the night.


Looking at the documentation, I understood that by default it asks for gzip compression, unless you add --no-gzip-encoding, which you should do if you send already-compressed archives: https://rclone.org/docs/#no-gzip-encoding
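
So when pushing archives that dar has already compressed, the call would presumably look like this (paths illustrative):
Code: [Select]
rclone copy /media/backup/full-set.1.dar remote:backup --no-gzip-encoding -v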


brianr
Re: Using rclone for making back-ups
« Reply #5 on: April 01, 2018, 06:45:36 PM »
Quote from: Jean-Philippe Pialasse on April 01, 2018, 06:30:13 PM
Looking at the documentation, I understood that by default it asks for gzip compression, unless you add --no-gzip-encoding, which you should do if you send already-compressed archives: https://rclone.org/docs/#no-gzip-encoding

I missed that - so it is already compressing the files. That is not good news!

Although I wonder if Dropbox actually supports that in-transmission compression?

My server was on a nominal 60/30 Mbit/s line; actual is about 35/15. It was still taking too long to be useful.

Jean-Philippe Pialasse
Re: Using rclone for making back-ups
« Reply #6 on: April 01, 2018, 07:15:16 PM »
Quote from: brianr on April 01, 2018, 06:45:36 PM
I missed that - so it is already compressing the files. That is not good news!

Although I wonder if Dropbox actually supports that in-transmission compression?

You get the point; it depends on whether the service offers it. However, my guess is that bandwidth is more expensive than CPU usage, so they will tend to offer it.

Quote from: brianr on April 01, 2018, 06:45:36 PM
My server was on a nominal 60/30 Mbit/s line; actual is about 35/15. It was still taking too long to be useful.

Yep, that is why I suggest a weekly or daily-ish local backup together with the automatic remote backup every night. This way you have most of your data restored quickly and then just sync back from the cloud what has changed: from what I see, usually a few hundred MB up to 2 GB per day.

brianr
Re: Using rclone for making back-ups
« Reply #7 on: April 01, 2018, 09:31:08 PM »
Quote from: Jean-Philippe Pialasse on April 01, 2018, 07:15:16 PM
Yep, that is why I suggest a weekly or daily-ish local backup together with the automatic remote backup every night. This way you have most of your data restored quickly and then just sync back from the cloud what has changed: from what I see, usually a few hundred MB up to 2 GB per day.

However, you still need one full backup on the cloud site to use as a reference point. I guess you could refresh that once a month?

Jean-Philippe Pialasse
Re: Using rclone for making back-ups
« Reply #8 on: April 01, 2018, 11:22:22 PM »
Quote from: brianr on April 01, 2018, 09:31:08 PM
However, you still need one full backup on the cloud site to use as a reference point. I guess you could refresh that once a month?
Well, the pain is creating the first one. After that, you only sync what has changed onto the full backup.

There are two ways of combining full and incremental backups:
* Traditional, as done with dar and with BackupPC 3: you have a full backup, then save what is new in incremental directories. When you restore, you need to apply the full backup and then all of the incrementals.

* Reversed, as done by BackupPC 4: your latest backup is always the full backup, and the incrementals are secondary copies of what was about to be thrown away by the update.

If you update your remote backup daily, your full backup is always up to date, and you still have the old elements in a --backup-dir; but unless the disaster is a colleague having thrown away their entire previous day's work, you will not need it.

rclone supports this behaviour, just as rsync does. This is great!

So the big issue is still sending all the data the first time. After that you can just sync the changes every day; as I said, that is usually 200 MB to 2 GB, which is doable during a night.
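
Fetching the few files a user deleted then becomes a copy out of the right oldN directory; an illustrative example with holck's layout (remote, user and file names are made up):
Code: [Select]
# The file was deleted 3 days ago, so its last version was rotated into old3
rclone copy amazonS3:myserver/old3/home/e-smith/files/users/alice/report.odt /tmp/recovered -v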

Bud
Re: Using rclone for making back-ups
« Reply #9 on: November 22, 2018, 07:57:28 AM »
Guys, please can you help?

I am trying to use rclone to sync my Google Drive to an SME 9.2 server.

I have looked at the configuration for Dropbox:

https://wiki.contribs.org/Rclone

Usage (Dropbox)
Here is the shell script that I developed and tested to see if it would work:

#!/bin/sh
signal-event pre-backup
cd /
rclone mkdir smeserver-dropbox:backup
rclone mkdir smeserver-dropbox:backup/smeserver
rclone mkdir smeserver-dropbox:backup/smeserver/etc/e-smith/templates-user-custom
rclone mkdir smeserver-dropbox:backup/smeserver/etc/e-smith/templates-custom
rclone mkdir smeserver-dropbox:backup/smeserver/home
rclone mkdir smeserver-dropbox:backup/smeserver/ssh
rclone mkdir smeserver-dropbox:backup/smeserver/root
rclone copy -v  / smeserver-dropbox:backup/smeserver --files-from backup-files-list-rclone
rclone copy -v  "etc/e-smith/templates-custom/" "smeserver-dropbox:backup/smeserver/etc/e-smith/templates-custom/"
rclone copy -v  "etc/e-smith/templates-user-custom/" "smeserver-dropbox:backup/smeserver/etc/e-smith/templates-user-custom/"
rclone copy -v  etc/ssh smeserver-dropbox:backup/smeserver/ssh
rclone copy -v  root smeserver-dropbox:backup/smeserver/root
rclone copy -v  home/e-smith/ smeserver-dropbox:backup/smeserver/home/ --exclude "tmp/**"

How would I create a shell script for my Google Drive?

ReetP
Re: Using rclone for making back-ups
« Reply #10 on: November 22, 2018, 11:47:44 PM »
First read the rclone docs, e.g.

https://rclone.org/drive/

Get a basic connection to Drive first.

Worry about a sync script after.
1. Read the Manual
2. Read the Wiki
3. Don't ask for support on Unsupported versions of software
4. I have a job, wife, and kids and do this in my spare time. If you want something fixed, please help.

Bugs are easier than you think: http://wiki.contribs.org/Bugzilla_Help

If you love SME and don't want to lose it, join in: http://wiki.contribs.org/Koozali_Foundation

Bud
Re: Using rclone for making back-ups
« Reply #11 on: November 23, 2018, 12:03:10 AM »
ReetP, thank you for your help, much appreciated.

I have installed rclone and have run the rclone config setup.

When I issue the command rclone lsd remote: I can list the directories in the top level of my Google Drive.

When I issue the command rclone ls remote: I can list all the files in my Google Drive.

So rclone has been configured correctly on my SME 9.2 server.

What I am trying to understand is the following:
1. How do I get rclone sync to start downloading my Google Drive directories and files onto the SME 9.2 server?
2. Where exactly do all the directories and files get stored on the SME 9.2 server?
3. How do I get rclone to autosync?

Any help greatly appreciated :-)

ReetP
Re: Using rclone for making back-ups
« Reply #12 on: November 23, 2018, 12:26:26 AM »
Brian, who wrote the wiki page, may pop in and help you. I can't do anything myself right now as it is way late.

If I get 5 minutes tomorrow I'll take a look.

Note that his script, I think, backs up from

SME -> Google

As far as bash scripting goes, you really need to teach yourself some simple scripting for practice. It will help... otherwise you are basically asking someone to write it all for you.

There are some great guides online. Pick a few and start coding...

Just make sure you set restrictive permissions on any file. Start with 0700...

http://tldp.org/LDP/Bash-Beginners-Guide/html/

https://linuxconfig.org/bash-scripting-tutorial
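
On the autosync question: once a backup script works from the command line, the usual way to run it automatically is a nightly cron job. A minimal sketch, assuming the script is saved as /usr/local/bin/rclone-backup.sh (name illustrative):
Code: [Select]
# /etc/cron.d/rclone-backup -- run the backup every night at 02:30 as root
30 2 * * * root /usr/local/bin/rclone-backup.sh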


brianr
Re: Using rclone for making back-ups
« Reply #13 on: November 23, 2018, 08:08:15 AM »
Quote from: Bud on November 23, 2018, 12:03:10 AM
When I issue the command rclone lsd remote: I can list the directories in the top level of my Google Drive.

So that means you ought to be able to replace the "smeserver-dropbox" in my script with "remote", and it might well work!!

As John says, you will really need to get on top of bash shell scripting to make this work for you, though.
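
For illustration, a sketch of that substitution, assuming the Drive remote is named "remote" as in Bud's config. Note this still backs up server -> Google Drive; to pull files the other way, swap the source and destination arguments:
Code: [Select]
#!/bin/sh
# Back up selected SME directories to Google Drive; this mirrors the
# wiki's Dropbox script with only the remote name changed.
signal-event pre-backup
cd /
rclone mkdir remote:backup/smeserver
rclone copy -v etc/e-smith/templates-custom/ remote:backup/smeserver/etc/e-smith/templates-custom/
rclone copy -v etc/e-smith/templates-user-custom/ remote:backup/smeserver/etc/e-smith/templates-user-custom/
rclone copy -v etc/ssh remote:backup/smeserver/etc/ssh
rclone copy -v root remote:backup/smeserver/root
rclone copy -v home/e-smith/ remote:backup/smeserver/home/ --exclude "tmp/**"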

holck
Re: Using rclone for making back-ups
« Reply #14 on: December 10, 2018, 10:28:07 AM »
I've got my backup script working nicely with Amazon S3. The first backup, of course, took some time (6 hours for me), but the daily backup is a lot faster (30 minutes).

Code: [Select]
#!/bin/bash

SERVER=$(hostname -s)
REMOTE="amazonS3:${SERVER}"
FILTER="/home/e-smith/files/users/holck/dev/rclone/filter.txt"
VERBOSE=""                  # Add more v's for more verbosity: -v = info, -vv = debug
VERSIONS=7                  # Number of backup versions to keep
LOG="/tmp/rclone_log.txt"   # Temporary log file

# Start each run with an empty log file, so old output is not appended twice
: > ${LOG}

# Purge oldest version
VERSION=$VERSIONS
date
echo "Purging oldest version ($VERSIONS)..."
nice rclone purge ${REMOTE}/old${VERSION} ${VERBOSE} &>> ${LOG}
echo "...done"

# Make room for a new version
echo "Making room for a new version..."
while [ $VERSION -gt 1 ]; do
  (( OLDV=$VERSION-1 ))
  date
  echo "Moving version ${OLDV} to ${VERSION}..."
  nice rclone move ${REMOTE}/old${OLDV} ${REMOTE}/old${VERSION} ${VERBOSE} &>> ${LOG}
  (( VERSION = $OLDV ))
  echo "... done"
done

# Perform backup
date
echo "Performing backup to ${REMOTE}..."
nice rclone sync / ${REMOTE}/recent --backup-dir ${REMOTE}/old1 --filter-from ${FILTER} ${VERBOSE} --skip-links --update --use-server-modtime &>> ${LOG}
echo "...done"
date
cat ${LOG} >> log.txt
In this way, I always have a full backup. And if a file was changed or deleted some days (< 7) ago, I can still find the old version.

I create the logfile in /tmp, as I don't back up that directory. If you create a logfile in a directory you want to back up, rclone will complain, as the logfile is constantly changing...

Moving (renaming) a remote directory is rather slow, as AWS S3 doesn't support renaming: rclone has to copy each and every file in the directory to its new name and then delete the original.

Here is a sample output of the script:
Code: [Select]
Mon Dec 10 08:40:01 CET 2018
Purging oldest version (7)...
...done
Making room for a new version...
Mon Dec 10 08:40:02 CET 2018
Moving version 6 to 7...
... done
Mon Dec 10 08:40:18 CET 2018
Moving version 5 to 6...
... done
Mon Dec 10 08:40:18 CET 2018
Moving version 4 to 5...
... done
Mon Dec 10 08:40:49 CET 2018
Moving version 3 to 4...
... done
Mon Dec 10 08:40:51 CET 2018
Moving version 2 to 3...
... done
Mon Dec 10 09:07:43 CET 2018
Moving version 1 to 2...
... done
Mon Dec 10 09:08:11 CET 2018
Performing backup to amazonS3:karoline...
...done
Mon Dec 10 09:13:46 CET 2018
As you can see, the actual backup only took 5-6 minutes. Most of the time was spent moving the remote directories around.