backup and restore server

h87

New Member
#1
As this server has no main control panel for backups:

What is the best way to back up an OpenLiteSpeed server?

I want to back up settings / files / databases for an easy restore. Has anyone done this?
 

Cold-Egg

Administrator
#2
OpenLiteSpeed's default installation path is '/usr/local/lsws', so backing up that folder should cover the server configuration, unless you have moved the config to another location. You will also want to back up the sites' files and databases in addition.
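As a rough sketch (the source and destination paths here are assumptions, adjust to your setup), a dated tarball of that folder looks like:

```shell
#!/bin/bash
# Sketch: archive the OpenLiteSpeed install dir into a dated tarball.
# SRC/DEST are assumptions -- point DEST at your real backup storage.
SRC="${SRC:-/usr/local/lsws}"
DEST="${DEST:-/tmp/lsws-backup}"
DATE=$(date +%Y%m%d)
ARCHIVE="$DEST/lsws-config_$DATE.tar.gz"

mkdir -p "$DEST"
if [ -d "$SRC" ]; then
    # -C / stores relative paths so the archive restores cleanly from /
    tar -czf "$ARCHIVE" -C / "${SRC#/}"
    echo "Created $ARCHIVE"
else
    echo "$SRC not found; nothing to archive"
fi
```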
 

h87

New Member
#3
Is there any command I can run to compress
- web files
- databases
- '/usr/local/lsws'

and then export to FTP?

What are other people doing for this kind of scenario?
 

Cold-Egg

Administrator
#4
I "guess" other users may prepare another server with OLS/LSWS ready, export/import the database from the old server to the new one, and then use scp or FTP to transfer the web files across.
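Sketched as commands (host, user and database name are placeholders, so they are shown rather than executed):

```shell
#!/bin/bash
# Placeholders only -- substitute your own host, user and database.
NEW_HOST="user@newserver.example.com"
DB="mydb"

# Dump on the old server and import on the new one in a single pipe
dump_cmd="mysqldump $DB | ssh $NEW_HOST mysql $DB"

# Transfer the web root (scp -r, or rsync for resumable transfers)
copy_cmd="scp -r /var/www $NEW_HOST:/var/www"

echo "$dump_cmd"
echo "$copy_cmd"
```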
 

h87

New Member
#6
Are there any other free solutions?

I was thinking of running a few commands to zip the files, dump the database, and upload everything daily to an FTP server.
 
#7
I use two bash scripts and run them from a daily cron job:

For the DB backup creation:

#!/bin/bash

# MySQL credentials
DB_USER="usr"
DB_PASSWORD="pass"

# Databases to backup
DB_NAME_1="db1"
DB_NAME_2="db2"

# Date format YYYYMMDD
DATE=$(date +%Y%m%d)

# Backup directory
BACKUP_DIR="/var/backupdb"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Export first database
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME_1" > "$BACKUP_DIR/${DB_NAME_1}_$DATE.sql"
if [ $? -eq 0 ]; then
    echo "Backup of $DB_NAME_1 successful!"
else
    echo "Backup of $DB_NAME_1 failed!"
fi

# Export second database
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME_2" > "$BACKUP_DIR/${DB_NAME_2}_$DATE.sql"
if [ $? -eq 0 ]; then
    echo "Backup of $DB_NAME_2 successful!"
else
    echo "Backup of $DB_NAME_2 failed!"
fi


For the files + databases, creating a single final archive daily:


#!/bin/bash

# Define the current date in YYYYMMDD format
current_date=$(date +%Y%m%d)

# Define the backup filename
backup_filename="complete-auto-backup_$current_date.zip"

# Define the output directory
output_directory="/home"

# Define the backup file path
backup_filepath="$output_directory/$backup_filename"

# Define the folders and files to be included in the backup
folders_and_files=(
"/usr/home"
"/var/www"
"/var/smart-files"
"/var/backupdb/datas_$current_date.sql"
"/var/backupdb/smart_files_$current_date.sql"
)

# Create the zip archive containing the specified folders and files
echo "Creating zip archive..."
zip -r "$backup_filepath" "${folders_and_files[@]}"

# Check if the zip archive was created successfully
if [[ $? -eq 0 ]]; then
    echo "Zip archive created successfully: $backup_filepath"
else
    echo "Failed to create zip archive."
fi


I haven't set up a transfer, but you can use SCP or rsync.
 

h87

New Member
#8
Thanks for that. I did the same, but stored the password in a file for sshpass and read it into a variable:


Code:
sudo apt-get install sshpass
echo "your_scp_password" > ~/.scp_pass
chmod 600 ~/.scp_pass
First script:
Code:
#!/bin/bash

# MySQL credentials
DB_USER="usr"
DB_PASSWORD="pass"

# Databases to backup
DB_NAME_1="db1"

# Date format YYYYMMDD
DATE=$(date +%Y%m%d)

# Backup directory
BACKUP_DIR="/home/backup"

# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Export first database
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME_1" > "$BACKUP_DIR/${DB_NAME_1}_$DATE.sql"
if [ $? -eq 0 ]; then
    echo "Backup of $DB_NAME_1 successful!"
else
    echo "Backup of $DB_NAME_1 failed!"
fi

# Securely copy the backup files to the remote server
SSHPASS=$(cat ~/.scp_pass)
sshpass -p "$SSHPASS" scp "$BACKUP_DIR/${DB_NAME_1}_$DATE.sql" account@RSYNCHOST.COM:/path/to/backup/
Second script:
Code:
#!/bin/bash

# Define the current date in YYYYMMDD format
current_date=$(date +%Y%m%d)

# Define the backup filename
backup_filename="complete-auto-backup_$current_date.zip"

# Define the output directory
output_directory="/home/backup"

# Define the backup file path
backup_filepath="$output_directory/$backup_filename"

# Define the folders and files to be included in the backup
folders_and_files=(
    "/var/www"
    /var/backupdb/*_"${current_date}".sql  # glob left unquoted so the shell expands it
)

# Create the zip archive containing the specified folders and files
echo "Creating zip archive..."
zip -r "$backup_filepath" "${folders_and_files[@]}"

# Check if the zip archive was created successfully
if [[ $? -eq 0 ]]; then
    echo "Zip archive created successfully: $backup_filepath"
else
    echo "Failed to create zip archive."
fi

# Securely copy the backup file to the remote server
SSHPASS=$(cat ~/.scp_pass)
sshpass -p "$SSHPASS" scp "$backup_filepath" account@RSYNCHOST.COM:/path/to/backup/
Created the two files and placed them into ~/secure-scripts:
Code:
mkdir ~/secure-scripts
chmod 700 ~/secure-scripts/*
I had to add the remote host to known_hosts:
Code:
ssh-keyscan -H RSYNCHOST.COM >> ~/.ssh/known_hosts  # hostname only, no user@
 
#9
Nice touch with the credentials; the way I did it in my scripts is not recommended, security-wise.
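For what it's worth, key-based auth avoids storing a password at all. A sketch (the remote host is a placeholder, so the two commands that touch it are echoed rather than run):

```shell
#!/bin/bash
# Generate a dedicated key for the backup job (no passphrase, so cron can
# use it unattended). The remote host below is a placeholder.
keyfile="$(mktemp -d)/backup_key"
ssh-keygen -t ed25519 -N "" -f "$keyfile" -q

# Run once, interactively, to install the public key on the backup host:
echo "ssh-copy-id -i $keyfile.pub account@RSYNCHOST.COM"

# After that, scp/rsync authenticate with the key -- no sshpass needed:
echo "scp -i $keyfile backup.zip account@RSYNCHOST.COM:/path/to/backup/"
```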

I noticed a couple of things:

1. In your first script for backing up the database, you've set the backup folder as:

Code:
BACKUP_DIR="/home/backup"
so the dump lands there and the script immediately transfers the raw .sql file via SCP.

2. In the second script, you're including /var/backupdb, which is never written to in your case (the dumps go to /home/backup):

Code:
# Define the folders and files to be included in the backup
folders_and_files=(
    "/var/www"
    "/var/backupdb/*_${current_date}.sql"
)
It then transfers to your backup destination a second time. To streamline, I suggest connecting once via SCP and transferring a single complete archive.

You can achieve this by simply changing the database backup directory to the one that is included in the final backup (/var/backupdb) and also removing the SCP process from the first script.

PS. When setting the cronjob, make sure to run the database backup script earlier than the second script :)
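For example (times, script names and log paths are placeholders), the crontab could stagger them like:

```shell
# crontab -e
# 02:00 - dump the databases into the directory the archive script reads
0 2 * * * /root/secure-scripts/db-backup.sh >> /var/log/db-backup.log 2>&1
# 02:30 - zip files + dumps and push the single archive
30 2 * * * /root/secure-scripts/full-backup.sh >> /var/log/full-backup.log 2>&1
```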
 