
Way to limit backup resources?

Posted: Thu Aug 17, 2017 5:14 pm
by youradds
Hi,

Our server ran its first backup today, and "gzip" used up 100% of the processing power. Is there a way to limit the amount of resources a backup can use? We have people using the site at all times of the day, so I really don't want it to slow to a crawl for our customers while a backup is running.

The server has 24 GB of RAM, an SSD and an 8-core CPU, so there's no reason it should struggle to compress 100 GB.


Thanks

Andy

Re: Way to limit backup resources?

Posted: Sat Nov 25, 2017 2:59 pm
by websystems
Same question here. Did you solve it somehow?

Re: Way to limit backup resources?

Posted: Sat Nov 25, 2017 3:01 pm
by youradds
The main thing I did was disable gzip compression - in the section where you set up the backups (enabling, disabling, remote backup etc.), just set the compression to 0. This will make it run faster. I also made some tweaks in the backup script so it runs "nice -10" on the big queries, to make it a bit nicer on the server :)
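
Something along these lines, just as a rough sketch - the paths and exact commands below are examples, not lifted from the actual backup script:

Code: Select all

        # sketch only: run the heavy archive/compress steps at low CPU and IO priority
        nice -n 19 ionice -c3 tar -cf "$tmpdir/domain_data.tar" "$HOMEDIR/$user/web/$domain"
        nice -n 19 gzip -1 "$tmpdir/domain_data.tar"   # -1 = fastest, lightest compression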

Cheers

Andy

Re: Way to limit backup resources?

Posted: Sat Nov 25, 2017 3:03 pm
by websystems
youradds wrote:The main thing I did was disable gzip compression - in the section where you set up the backups (enabling, disabling, remote backup etc.), just set the compression to 0. This will make it run faster. I also made some tweaks in the backup script so it runs "nice -10" on the big queries, to make it a bit nicer on the server :)
Thanks for the tip Andy :)

BR,
Krzysztof

Re: Way to limit backup resources?

Posted: Sat Nov 25, 2017 3:13 pm
by youradds
No problem. Here is my backup script now (https://pastebin.com/r3eXtRq6). You may notice I have commented out a couple of bits:

Code: Select all

        # Compress archive
        #nice -20 gzip -$BACKUP_GZIP $tmpdir/web/$domain/domain_data.tar
and

Code: Select all

        # Compress archive
        #if [ -e "$tmpdir/mail/$domain/accounts.tar" ]; then
            #nice -20 gzip -$BACKUP_GZIP $tmpdir/mail/$domain/accounts.tar
        #fi
The reason I did this is that the gzip step was killing the server (even on minimum compression), as the output tar file was 95 GB, so it took forever to process. You can uncomment these if you want to (or just do a WinMerge to compare my version with the live script).
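
If you'd rather keep compression, another option (just a sketch - I haven't tried this on the live script, and it assumes ionice is installed on the box) is to leave those lines in but run gzip at idle CPU/IO priority:

Code: Select all

        # Compress archive at idle priority instead of skipping it
        if [ -e "$tmpdir/mail/$domain/accounts.tar" ]; then
            ionice -c3 nice -n 19 gzip -$BACKUP_GZIP "$tmpdir/mail/$domain/accounts.tar"
        fi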

Hope that helps

Andy