Back Up Your Webserver with a Script and Cronjob

This Bash shell script performs a full backup of the webserver's document root, creating a tar archive for each site. Be sure to edit the configuration options at the beginning of the script to match your environment before executing it. The end result is one archive per website named '(current-datestamp)-website-dir-backup-tar.gz'. To back up your website databases as well, please see our Backup MySQL script.
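The actual logic lives in the script downloaded below. As a rough, hypothetical sketch of what such a backup loop can look like (the variable names mirror the configuration options shown under Usage, and the datestamp format is an assumption; the real backup-www.sh may differ), it boils down to tarring each site directory and pruning old archives:

    #!/bin/bash
    # Hypothetical sketch of a per-site webroot backup -- not the actual backup-www.sh
    DAYS_TO_KEEP=4                            # 0 to keep forever
    WWW_PATH='/var/www'
    BACKUP_PATH='/home/backup/server05/www'

    DATESTAMP=$(date +%Y-%m-%d)               # assumed datestamp format
    mkdir -p "$BACKUP_PATH"

    # One tar archive per site directory under WWW_PATH
    for SITE in "$WWW_PATH"/*/; do
        NAME=$(basename "$SITE")
        tar -czf "$BACKUP_PATH/${DATESTAMP}-${NAME}-backup-tar.gz" -C "$WWW_PATH" "$NAME"
    done

    # Prune archives older than DAYS_TO_KEEP (0 disables pruning)
    if [ "$DAYS_TO_KEEP" -gt 0 ]; then
        find "$BACKUP_PATH" -name '*-backup-tar.gz' -type f -mtime +"$DAYS_TO_KEEP" -delete
    fi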

Usage:

  1. Pull up a terminal or SSH into the target server.

  2. Log on as root.

sudo -i

  3. Download the backup script.

wget https://raw.githubusercontent.com/clusterednetworks/backup-www/master/backup-www.sh

  4. Edit the configuration options at the beginning of the script to match your environment prior to executing.

    #----------------------------------------
    # OPTIONS
    #----------------------------------------
    DAYS_TO_KEEP=4    # 0 to keep forever
    WWW_PATH='/var/www'
    BACKUP_PATH='/home/backup/server05/www'
    #---------------------------------------
    
  5. Make the script executable.

chmod +x backup-www.sh

  6. Run the script.

./backup-www.sh

  7. Set up a cronjob to run the script daily or weekly if you choose; the example entry below runs it every day at 01:01 (see the note after this list for installing the entry and adjusting the path).

    1 1 * * * /etc/backup-www.sh >/dev/null 2>&1
    
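The schedule '1 1 * * *' means minute 1 of hour 1, i.e. 01:01 every day. The path /etc/backup-www.sh is only the example location used above, so adjust it to wherever you actually saved the script. As a quick sketch (assuming you are still logged on as root), the entry can be added like this:

    # Open root's crontab in an editor and paste in the entry from step 7
    crontab -e

    # Or append it non-interactively (the path is the example from step 7 -- adjust as needed)
    ( crontab -l 2>/dev/null; echo '1 1 * * * /etc/backup-www.sh >/dev/null 2>&1' ) | crontab -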

Posted in Linux Network Admin Tips, Network Security Tips on Dec 10, 2020