batmanu
Posts: 1
Joined: Tue Apr 04, 2017 12:44 am

Automatically upload users (and admin) backup files to Google Drive  Topic is solved

Postby batmanu » Tue Apr 04, 2017 2:14 am

Later edit: I edited the script, adding

Code: Select all

exit 0

at the end, as otherwise temporary files would have been left on your partition.

I managed to write a script to automatically upload my backups to my Google Drive. Obviously, with Google Drive app installed, that means that automatically synced files will end up on my home machine also, which I find very convenient ;)

So, here it is (any feedback would be great, as I am kind of a newbie when it comes to bash scripting):

  1. First of all, you have to get the gdrive client, specifically the binary for your OS (gdrive-linux-x64 for me, as I'm on CentOS 7):

    Code: Select all

    wget -O gdrive "https://docs.google.com/uc?id=0B3X9GlR6EmbnQ0FtZmJJUXEyRTA&export=download"

    (quote the URL, otherwise the shell treats the & as "run in background"; -O saves it under the name gdrive)

    make it executable:

    Code: Select all

    chmod +x gdrive

    and place it somewhere from where you can call it easily:

    Code: Select all

    mv gdrive /usr/sbin/

    Then, set it up:

    Code: Select all

    gdrive about

    Follow the link and get the verification code by authenticating with the Google account for the drive you want access to.
    --
  2. Now, let's rock!

    Code: Select all

    #!/bin/bash
    IFS=$'\n'

    # If you're short on Drive space, you may want to delete previous backups
    upTime=$(date --date="23 hours ago" '+%Y-%m-%dT%T')
    gdrive list -q "name contains '.enc' and modifiedTime < '$upTime'" >> drive.txt
    filename='drive.txt'
    # Read the list file line by line
    while read -r line
    do
       # The first whitespace-separated field is the file Id
       theId=$(awk '{print $1}' <<< "$line")
       # Skip the header line ("Id Name ...")
       if [ "$theId" != "Id" ]; then
            # Send the delete command
            gdrive delete "$theId"
       fi
    done < "$filename"
    # Remove temp list file
    rm -f drive.txt

    # For every user that has a home directory:
    for USERINFO in $(grep ":/home" /etc/passwd)
    do
     USERNAME=$(echo "$USERINFO" | cut -d: -f1)

     # The backup filename Vesta produces for this user today
     BAKFILE="/home/backup/$USERNAME.$(date '+%Y-%m-%d').tar"

     # You will upload an encrypted backup to Drive, so give that file a name
     ENCFILE="/tmp/$USERNAME.$(date '+%Y-%m-%d').enc"

     # Skip users that don't have backups, like user 'backup' or additional ftp users (don't forget to change the username accordingly)
     if [ "$USERNAME" == "backup" ] || [ "$USERNAME" == "ADD_FTP_USER" ] ; then
      continue;
     fi
     if [ -e "$BAKFILE" ] ; then
      # If the file exists, encrypt it (don't forget to replace the password below with a strong and long password)
      openssl aes-256-cbc -a -salt -in "$BAKFILE" -out "$ENCFILE" -pass 'pass:LONGSTRONGPASSWORDHERE'

      # Replace ID with your Drive folder's ID (the part after /folders/ in the
      # folder's URL). Upload the encrypted file, then delete the local copy.
      gdrive upload -p ID --delete "$ENCFILE"
     fi
    done

    exit 0

    That is, in fact, the script. Put it in a file (/usr/local/sbin/backupgdrive.sh) and make it executable by root.
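
    The -p ID part of the upload command needs your Drive folder's ID. If you created the folder in the Drive web UI, the ID is simply the last path segment of the folder's URL; a quick shell way to pull it out (the URL below is a made-up example — use your own folder's URL):

    ```shell
    # Extract the Drive folder ID from a folder URL.
    # The URL here is a fictitious example.
    FOLDER_URL="https://drive.google.com/drive/folders/0B3X9ExampleFolderId123"
    FOLDER_ID="${FOLDER_URL##*/}"   # strip everything up to the last '/'
    echo "$FOLDER_ID"
    ```

    Then put that value in place of ID in the gdrive upload line.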

    Now, some more automation would be nice, right?
    --
  3. You might experience some short downtime while the encryption takes place (I'm not quite sure on that), so I would set up the cron job late at night. I created a job in /etc/cron.d:

    Code: Select all

    vi /etc/cron.d/backup-gdrive

    Paste the following

    Code: Select all

    # You want output mail sent by the script, not by crontab
    MAILTO=""
    15 05 *  *  * root /usr/local/sbin/backupgdrive.sh > /tmp/backup-gdrive.log ; mail -s "Backup files uploaded to Google Drive" -r root YOU@YOURDOMAIN.COM < /tmp/backup-gdrive.log

    (note: cron files use # for comments, not /* */, and the output file needs an absolute path because cron doesn't run from your shell's working directory)


    Make sure that the script will be executed at about an hour AFTER Vesta's automatic backup. You will receive an email after the upload is finished, with a summary of the script's output.
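
    To find out when Vesta's backup actually runs on your server, check the admin user's crontab; I believe the entry looks something like the fragment below (the exact time varies per install, so check yours):

    ```
    # Check with:  crontab -l -u admin | grep v-backup-users
    # A typical entry (time differs from install to install):
    10 05 * * * sudo /usr/local/vesta/bin/v-backup-users
    ```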

And that's about it!

If you ever need to restore the backups (hopefully not), you will have to upload the encrypted files to your /home/backup directory and decrypt them one by one (or make a script :P):

Code: Select all

openssl aes-256-cbc -d -a -in "/home/backup/USERNAME.DATE.enc" -out "/home/backup/USERNAME.DATE.tar" -pass 'pass:YOURLONGSTRONGPASSWORD'

then chown accordingly:

Code: Select all

chown admin:USERNAME /home/backup/USERNAME.DATE.tar

Check that everything is OK, then remove the encrypted files.
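
Before relying on the restore command, it's worth sanity-checking your password and the openssl flags with a quick round trip on a dummy file (all paths and the password below are throwaway examples):

```shell
# Encrypt and decrypt a dummy file with the same flags as the backup script,
# then verify the round trip is lossless. Paths and password are examples only.
PASS='pass:YOURLONGSTRONGPASSWORD'
echo "dummy backup data" > /tmp/demo.tar
openssl aes-256-cbc -a -salt -in /tmp/demo.tar -out /tmp/demo.enc -pass "$PASS"
openssl aes-256-cbc -d -a -in /tmp/demo.enc -out /tmp/demo-restored.tar -pass "$PASS"
cmp -s /tmp/demo.tar /tmp/demo-restored.tar && echo "round trip OK"
rm -f /tmp/demo.tar /tmp/demo.enc /tmp/demo-restored.tar
```

On newer OpenSSL versions you may see a warning about the deprecated key derivation; the round trip still works, but if you see it you may want to add -pbkdf2 to both the encrypt and decrypt commands (it must be on both, or decryption will fail).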


Credits go to:
Timothy Quinn (https://timothy-quinn.com/backing-up-fi ... a-the-cli/)
Benjamin Cane (http://bencane.com/2013/10/21/5-bash-fo ... efficient/)
and again Tim (https://timothy-quinn.com/backing-up-li ... gle-drive/)
Last edited by batmanu on Wed Apr 12, 2017 5:46 am, edited 4 times in total.

vikhyat
Posts: 70
Joined: Wed Sep 14, 2016 5:39 pm

Re: Automatically upload users (and admin) backup files to Google Drive

Postby vikhyat » Sat Apr 08, 2017 9:02 pm

Thanks a lot for this script. I will test it tomorrow and give feedback.

MistaWongX
Posts: 2
Joined: Fri May 05, 2017 10:46 am

Re: Automatically upload users (and admin) backup files to Google Drive

Postby MistaWongX » Fri May 05, 2017 10:53 am

Nice. I was seeking answers to the same question; got redirected from another one of the threads.

rhyker2u
Posts: 40
Joined: Thu Jan 19, 2017 11:46 am
Contact:

Os: Ubuntu 16x
Web: nginx + php-fpm

Re: Automatically upload users (and admin) backup files to Google Drive

Postby rhyker2u » Wed Jan 03, 2018 10:42 am

That's one way to do things; here's another (which I found out about yesterday due to VestaCP migrations to a new server farm): http://crossftp.com/commander.htm

CrossFTP Commander is a command line tool based on CrossFTP engine to handle data transfer, sync, and backup operations. It has small memory footprint (core library size is about 4MB), and can be integrated in the shell script or system scheduler. The supported protocols include FTP, SFTP, FTPS, WebDav, Amazon S3, and Google Storage protocols.


However, I will definitely check out your solution too.

