Automatically upload users (and admin) backup files to Google Drive
Posted: Tue Apr 04, 2017 2:14 am
LE: I edited the script so that it no longer leaves temporary files behind on your partition.
I managed to write a script to automatically upload my backups to my Google Drive. Obviously, with the Google Drive app installed, that also means the synced files end up on my home machine, which I find very convenient ;)
So, here it is (any feedback would be great, as I am kind of a newbie when it comes to bash scripting):
- First of all, you have to get the gdrive client, specifically the binary for your OS (gdrive-linux-x64 for me, as I'm on CentOS 7):
Code: Select all
wget "https://docs.google.com/uc?id=0B3X9GlR6EmbnQ0FtZmJJUXEyRTA&export=download" -O gdrive
make it executable:
Code: Select all
chmod +x gdrive
and place it somewhere from where you can call it easily:
Code: Select all
mv gdrive /usr/sbin/
Then, set it up:
Code: Select all
gdrive about
Follow the link and get the verification code by authenticating with the Google account for the drive you want access to.
- Now, let's rock! Put the following script in a file (/usr/local/sbin/backupgdrive.sh) and make it executable by root.
Code: Select all
#!/bin/bash
IFS=$'\n'  # split on newlines only (needed for the /etc/passwd loop below)

# If you're short on Drive space, you may want to delete previous backups
upTime=$(date --date="23 hours ago" '+%Y-%m-%dT%T')
gdrive list -q "name contains '.enc' and modifiedTime < '$upTime'" >> drive.txt
filename='drive.txt'

# Read the list file
while read -r line
do
    theId=$(cut -c-2 <<< "$line")
    # basically skip the header line...
    if [ "$theId" != "Id" ]; then
        # send the delete command
        gdrive delete $(cut -c-28 <<< "$line")
    fi
done < "$filename"

# Remove temp list file
rm -f drive.txt

# For every user you have:
for USERINFO in $(grep ":/home" /etc/passwd)
do
    USERNAME=$(echo "$USERINFO" | cut -d: -f1)
    # Declare the variable that will give you the backup filename
    BAKFILE="/home/backup/$USERNAME.$(date '+%Y-%m-%d').tar"
    # You will upload an encrypted backup to Drive, so give that file a name
    ENCFILE="/tmp/$USERNAME.$(date '+%Y-%m-%d').enc"

    # Skip users that don't have backups, like user 'backup' or additional
    # ftp users (don't forget to change the username accordingly)
    if [ "$USERNAME" == "backup" ] || [ "$USERNAME" == "ADD_FTP_USER" ]; then
        continue
    fi

    if [ -e "$BAKFILE" ]; then
        # If the file exists, encrypt it (don't forget to replace the
        # password below with a strong and long password)
        openssl aes-256-cbc -a -salt -in "$BAKFILE" -out "$ENCFILE" -pass 'pass:LONGSTRONGPASSWORDHERE'
        # Paste your Drive folder ID after -p (the part after /folders/ in
        # https://drive.google.com/.../folders/...). Upload the encrypted
        # file, then delete it from the machine.
        gdrive upload -p ID --delete "$ENCFILE"
    fi
done

exit 0
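The 23-hour cutoff in the deletion query relies on GNU date's relative-time parsing. If you want to see exactly what timestamp the gdrive query ends up comparing against, you can run the same expression on its own:

```shell
# Print a timestamp 23 hours in the past, in the same
# %Y-%m-%dT%T format the gdrive query above compares against.
date --date="23 hours ago" '+%Y-%m-%dT%T'
```

You can adjust the "23 hours ago" string to keep backups around longer before they get deleted from Drive.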
Now, some more automation would be nice, right?
- You might experience some short downtime while the encryption takes place (I'm not quite sure about that), so I would schedule the cron job for late at night. I created a job in /etc/cron.d:
Code: Select all
vi /etc/cron.d/backup-gdrive
Paste the following:
Code: Select all
# You want the output mail sent by the script, not by crontab
MAILTO=""
15 05 * * * root /usr/local/sbin/backupgdrive.sh > output ; mail -s "Backup files uploaded to Google Drive" -r root [email protected] < output
Make sure that the script is executed about an hour AFTER Vesta's automatic backup. You will receive an email after the upload is finished, with a summary of the script's output.
If you ever need to restore the backups (hopefully not), you will have to put the encrypted files back in your /home/backup directory and decrypt them one by one (or make a script :P):
Code: Select all
openssl aes-256-cbc -d -a -in "/home/backup/USERNAME.DATE.enc" -out "/home/backup/USERNAME.DATE.tar" -pass 'pass:YOURLONGSTRONGPASSWORD'
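Before touching real backups, you can sanity-check the password and the openssl flags with a quick round trip on a throwaway file (the paths and password here are just for illustration):

```shell
# Encrypt and decrypt a dummy file with the same flags as the script,
# then confirm the result is byte-for-byte identical to the original.
printf 'dummy backup payload\n' > /tmp/demo.tar
openssl aes-256-cbc -a -salt -in /tmp/demo.tar -out /tmp/demo.enc -pass 'pass:demo'
openssl aes-256-cbc -d -a -in /tmp/demo.enc -out /tmp/demo-restored.tar -pass 'pass:demo'
cmp /tmp/demo.tar /tmp/demo-restored.tar && echo "round-trip OK"
```

Note that newer OpenSSL versions may print a deprecation warning about the key derivation; the round trip still works as long as you encrypt and decrypt with the same flags.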
then chown accordingly:
Code: Select all
chown admin:USERNAME /home/backup/USERNAME.DATE.tar
Check that everything is OK, then remove the encrypted files.
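The "make a script" option could look something like this: a minimal sketch, assuming the USERNAME.DATE.enc naming produced by the upload script and the same password, that decrypts everything in /home/backup and restores ownership in one pass:

```shell
#!/bin/bash
# Sketch: decrypt every .enc backup in /home/backup and chown the result.
# Assumes files are named USERNAME.DATE.enc, as produced by the upload
# script above; replace the password with your own.
for ENCFILE in /home/backup/*.enc; do
    [ -e "$ENCFILE" ] || continue                  # glob matched nothing
    TARFILE="${ENCFILE%.enc}.tar"                  # swap .enc for .tar
    USERNAME=$(basename "$ENCFILE" | cut -d. -f1)  # part before the first dot
    openssl aes-256-cbc -d -a -in "$ENCFILE" -out "$TARFILE" \
        -pass 'pass:YOURLONGSTRONGPASSWORD'
    chown "admin:$USERNAME" "$TARFILE"
done
```

Run it as root so the chown succeeds, and verify the resulting tar files before deleting the encrypted copies.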
Credits go to:
Timothy Quinn (https://timothy-quinn.com/backing-up-fi ... a-the-cli/)
Benjamin Cane (http://bencane.com/2013/10/21/5-bash-fo ... efficient/)
and again Tim (https://timothy-quinn.com/backing-up-li ... gle-drive/)