Paolo Ronco
PORTFOLIO
Running a website on a local server (on-premises) offers many advantages in terms of control and customization, but it carries a significant risk for business continuity. A server in your home or office cannot guarantee 100% uptime: all it takes is a power outage, a hardware problem, or, as in my case, being away on vacation, and the site could be offline for hours or days.
That's why I decided to implement a Replication + Failover to Cloud solution, which keeps a synchronized copy of the site on a cloud VM (replicated via scp):

Normal operation:
✅ The on-premises server handles requests normally

Failover:
✅ Cloudflare routes requests to the cloud VM
✅ The site remains online, services continue to function
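The failover decision ultimately hinges on whether the on-premises origin is reachable. As a rough illustration (not part of the original setup), a small probe can classify the origin's state from its HTTP status code; the SITE_URL default and the up/down thresholds here are my assumptions:

```shell
#!/bin/bash
# Hypothetical probe: report whether the origin answers with a healthy HTTP code.
SITE_URL="${SITE_URL:-https://example.com}"   # assumption: replace with your domain

origin_status() {
  # "up" for HTTP 2xx/3xx, "down" for anything else (including 000 = no connection).
  local code="$1"
  case "$code" in
    2??|3??) echo "up" ;;
    *)       echo "down" ;;
  esac
}

# curl prints 000 as the http_code when the connection fails entirely.
code=$(curl -s -o /dev/null -w "%{http_code}" --max-time 5 "$SITE_URL" 2>/dev/null)
echo "Origin $SITE_URL is $(origin_status "$code") (HTTP $code)"
```

A cron job running this probe could alert you (or trigger a manual restore) before visitors notice the outage.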
On-premises:
For each dockerized service (like NGINX or others), I create a backup of its volumes using a temporary Alpine container. Finally, I move everything to the cloud via scp.
Example - Sample Web Server:
#!/bin/bash
DATE=$(date +"%Y%m%d-%H%M")
BACKUP_DIR="/root/backups"
CLOUD_USER="insert_username"
CLOUD_IP="insert_cloud_ip"
SSH_KEY="enter_ssh_key"
REMOTE_PATH="/home/insert_username/webserver_backup_${DATE}.tar.gz"

mkdir -p ${BACKUP_DIR}
cd ${BACKUP_DIR}

# Backup Docker volumes
docker run --rm -v insert_volume_id1:/volume -v $(pwd):/backup alpine sh -c "cd /volume && tar czf /backup/nginx_backup.tar.gz ."
docker run --rm -v insert_volume_id2:/volume -v $(pwd):/backup alpine sh -c "cd /volume && tar czf /backup/html_backup.tar.gz ."

# Create single archive and transfer to cloud
tar czf webserver_backup_${DATE}.tar.gz nginx_backup.tar.gz html_backup.tar.gz
scp -i ${SSH_KEY} webserver_backup_${DATE}.tar.gz ${CLOUD_USER}@${CLOUD_IP}:${REMOTE_PATH}
rm -f nginx_backup.tar.gz html_backup.tar.gz
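One thing the backup script above does not do is rotation, so timestamped archives will accumulate on both ends. A minimal retention sketch (the prune helper and the seven-archive default are my assumptions, not part of the original setup):

```shell
#!/bin/bash
# Hypothetical retention helper: keep only the newest $KEEP archives in $BACKUP_DIR.
BACKUP_DIR="${BACKUP_DIR:-/root/backups}"
KEEP="${KEEP:-7}"

prune() {
  local dir="$1" keep="$2"
  # List archives newest-first, skip the first $keep, delete the rest.
  ls -t "$dir"/webserver_backup_*.tar.gz 2>/dev/null | tail -n +"$((keep + 1))" | xargs -r rm -f
}

prune "$BACKUP_DIR" "$KEEP"
```

The same idea applies on the cloud side, where the received archives pile up in the home directory.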
I take a SQL dump from the MySQL/MariaDB container and transfer it to the cloud:
#!/bin/bash
DATE=$(date +"%Y%m%d-%H%M")
LOCAL_DUMP="/root/wordpress_db_backup_${DATE}.sql"
CLOUD_USER="insert_username"
CLOUD_IP="insert_cloud_ip"
SSH_KEY="enter_ssh_key"
DB_USER="insert_db_user"
DB_PASS="enter_db_password"
DB_NAME="insert_db_name"

# Dump the database from the container
docker exec insert_db_container_name mysqldump --no-tablespaces -u ${DB_USER} -p"${DB_PASS}" ${DB_NAME} > ${LOCAL_DUMP}

# Transfer the dump to the cloud VM
scp -i ${SSH_KEY} ${LOCAL_DUMP} ${CLOUD_USER}@${CLOUD_IP}:/home/insert_username/
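The raw SQL dump can grow large, and compressing it before the scp step cuts transfer time considerably. A small sketch (the compress_dump helper and the demo path are hypothetical; the real scp line would then transfer the .gz file instead):

```shell
#!/bin/bash
# Hypothetical add-on: gzip the SQL dump before transferring it.
LOCAL_DUMP="${LOCAL_DUMP:-/tmp/wordpress_db_backup_demo.sql}"

compress_dump() {
  local dump="$1"
  gzip -9 -f "$dump"      # replaces $dump with $dump.gz
  echo "${dump}.gz"
}

# Demo with a placeholder file standing in for the real dump:
echo "-- demo dump --" > "$LOCAL_DUMP"
ARCHIVE=$(compress_dump "$LOCAL_DUMP")
echo "Compressed dump: $ARCHIVE"
# The scp line would then become:
#   scp -i ${SSH_KEY} "$ARCHIVE" ${CLOUD_USER}@${CLOUD_IP}:/home/insert_username/
```

On the cloud side you would add a matching gunzip before the import.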
Crontab example:
0 3 * * * /root/scripts/replica_webserver.sh
0 4 * * * /root/scripts/replica_wordpress.sh
Cloud:
On the cloud, I unpack the received backups and restore them to Docker volumes:
#!/bin/bash
BACKUP_FILE=$(ls -t /home/insert_username/webserver_backup_*.tar.gz | head -n 1)
RESTORE_DIR="/root/restore_temp"

mkdir -p ${RESTORE_DIR}
tar xzf ${BACKUP_FILE} -C ${RESTORE_DIR}

# Restore the Docker volumes
docker run --rm -v insert_volume_id1:/data -v ${RESTORE_DIR}:/backup alpine sh -c "rm -rf /data/* && tar xzf /backup/nginx_backup.tar.gz -C /data"
docker run --rm -v insert_volume_id2:/data -v ${RESTORE_DIR}:/backup alpine sh -c "rm -rf /data/* && tar xzf /backup/html_backup.tar.gz -C /data"

rm -rf ${RESTORE_DIR}
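One caveat with the ls -t | head -n 1 approach: if no archive has arrived yet (say the first scp failed), BACKUP_FILE is empty and tar errors out, with the earlier rm -rf having already wiped the volumes. A hedged sketch of a guard (latest_backup is a hypothetical helper, not part of the original script):

```shell
#!/bin/bash
# Hypothetical guard: only run the restore when a backup archive actually exists.
latest_backup() {
  # Prints the newest matching archive in the given directory, or nothing.
  ls -t "$1"/webserver_backup_*.tar.gz 2>/dev/null | head -n 1
}

BACKUP_FILE=$(latest_backup "/home/insert_username")
if [ -z "$BACKUP_FILE" ]; then
  echo "No backup archive found - skipping restore" >&2
else
  echo "Restoring from $BACKUP_FILE"
  # ... tar xzf and docker run steps from the script above ...
fi
```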
For the WordPress database, I drop the existing tables and import the latest dump:

#!/bin/bash
DUMP_PATH=$(ls -t /home/insert_username/wordpress_db_backup_*.sql | head -n 1)
CONTAINER_DB="enter_container_db_name"
DB_USER="insert_db_user"
DB_PASS="enter_db_password"
DB_NAME="insert_db_name"

# Copy the dump into the container
docker cp "$DUMP_PATH" "$CONTAINER_DB:/wordpress_db_backup.sql"

# Drop the existing WordPress tables
docker exec -i "$CONTAINER_DB" mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" -e "SET FOREIGN_KEY_CHECKS=0; DROP TABLE IF EXISTS wp_commentmeta, wp_comments, wp_links, wp_options, wp_postmeta, wp_posts, wp_term_relationships, wp_terms, wp_term_taxonomy, wp_usermeta, wp_users; SET FOREIGN_KEY_CHECKS=1;"

# Import the latest dump
docker exec -i "$CONTAINER_DB" sh -c "mysql -u $DB_USER -p\"$DB_PASS\" $DB_NAME < /wordpress_db_backup.sql"
Crontab example:

0 5 * * * /root/scripts/restore_webserver.sh
0 6 * * * /root/scripts/restore_wordpress.sh
Cloudflare:
On both machines (the Proxmox host and the cloud VM), I enable the Cloudflare tunnel:
cloudflared service install insert_token_tunnel
Since both machines run a connector for the same tunnel (tunnel replicas), Cloudflare automatically routes traffic to whichever endpoint is available: the on-premises container under normal conditions, the cloud VM when it goes down.
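To sanity-check that the tunnel is actually up on each machine, you can query the systemd unit that cloudflared service install creates. A small sketch (the cloudflared_status helper is my own; cloudflared is the default unit name the installer uses):

```shell
#!/bin/bash
# Hypothetical check: report whether the cloudflared systemd service is active.
cloudflared_status() {
  if command -v systemctl >/dev/null 2>&1; then
    if systemctl is-active cloudflared >/dev/null 2>&1; then
      echo "running"
    else
      echo "not running"
    fi
  else
    echo "unknown (no systemctl)"
  fi
}

echo "cloudflared service: $(cloudflared_status)"
```

Running this on both hosts after a reboot confirms the replicas reconnected on their own.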