Replication and Failover of a Docker Environment: From Proxmox Container to the Cloud

Running a website on a local server (on-premises) offers many advantages in terms of control and customization, but it carries a significant risk to business continuity.
A server in your home or office cannot guarantee 100% uptime: all it takes is a power outage, a hardware problem, or, as in my case, being away on vacation, and the site could be offline for hours or days.

That's why I decided to implement a Replication + Failover to Cloud solution, which allows you to:

  • Periodically replicate the WordPress content and database from the local server to the cloud
  • Perform an automatic (or semi-automatic) failover if the local server goes offline
  • Keep the ability to manage and update services remotely

How does my solution work?

  1. Local Replication → Cloud
    • Backup scripts for the WordPress web server and database run every night.
    • The files are transferred via scp to the cloud server, using SSH key authentication.
  2. Restore to the Cloud
    • In the morning, cronjobs on the cloud server run the restore scripts: the containers are updated and the database is restored.
  3. Failover
    • If the local server is online, Cloudflare balances traffic to it.
    • If the local server is offline, traffic switches to the cloud server without manual intervention.
  4. Cloud Management
    • Portainer to manage Docker (see the deployment sketch after this list)
    • Fail2Ban and basic security hardening
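
For the cloud-management step, this is the standard way to deploy Portainer CE (the image, port, and volume name follow the Portainer docs; adjust as needed):

docker run -d --name portainer --restart=always \
  -p 9443:9443 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest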

When the local container is UP:

✅ Handles requests normally

When the local container is DOWN:

✅ Cloudflare routes requests to the cloud VM
✅ The site remains online and services continue to function
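
A rough way to test the failover path (the container name and domain are placeholders): stop the local web container and check that the public site still answers through the cloud VM.

#!/bin/bash
# Simulate a local outage
docker stop insert_webserver_container
# Give Cloudflare a moment to stop routing to the local connector
sleep 30
# The public site should still answer, now served by the cloud VM
curl -sI https://insert_your_domain | head -n 1
# Bring the local container back up
docker start insert_webserver_container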

Tutorial:

On-premises:

a) Backup of Webserver Docker Volumes

For each dockerized service (such as NGINX), I create a backup of its volumes using a temporary Alpine container, then move everything to the cloud via SCP.

Example script for a sample web server:

#!/bin/bash
# Stop at the first failed command so a partial backup is never shipped
set -euo pipefail

DATE=$(date +"%Y%m%d-%H%M")
BACKUP_DIR="/root/backups"
CLOUD_USER="insert_username"
CLOUD_IP="insert_cloud_ip"
SSH_KEY="insert_ssh_key"
REMOTE_PATH="/home/insert_username/webserver_backup_${DATE}.tar.gz"

mkdir -p "${BACKUP_DIR}"
cd "${BACKUP_DIR}"

# Back up each Docker volume through a temporary Alpine container:
# the volume is mounted read-only on /volume and archived into the backup dir
docker run --rm -v insert_volume_id1:/volume:ro -v "$(pwd)":/backup alpine sh -c "cd /volume && tar czf /backup/nginx_backup.tar.gz ."
docker run --rm -v insert_volume_id2:/volume:ro -v "$(pwd)":/backup alpine sh -c "cd /volume && tar czf /backup/html_backup.tar.gz ."

# Bundle both archives into a single file and transfer it to the cloud
tar czf "webserver_backup_${DATE}.tar.gz" nginx_backup.tar.gz html_backup.tar.gz
scp -i "${SSH_KEY}" "webserver_backup_${DATE}.tar.gz" "${CLOUD_USER}@${CLOUD_IP}:${REMOTE_PATH}"

# Clean up the intermediate archives
rm -f nginx_backup.tar.gz html_backup.tar.gz
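
Since every run ships a new dated archive, backups accumulate on the cloud host. A minimal pruning sketch, assuming a seven-day retention and the /home/insert_username destination used above (run on the cloud server, e.g. from cron):

#!/bin/bash
# Delete backup files older than 7 days (the retention period is an assumption; adjust to taste)
find /home/insert_username -name "webserver_backup_*.tar.gz" -mtime +7 -delete
find /home/insert_username -name "wordpress_db_backup_*.sql" -mtime +7 -delete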

b) WordPress Database Dump

I take a SQL dump from the MySQL/MariaDB container and transfer it to the cloud:

#!/bin/bash
# Stop at the first failed command so a broken dump is never shipped
set -euo pipefail

DATE=$(date +"%Y%m%d-%H%M")
LOCAL_DUMP="/root/wordpress_db_backup_${DATE}.sql"
CLOUD_USER="insert_username"
CLOUD_IP="insert_cloud_ip"
SSH_KEY="insert_ssh_key"
DB_USER="insert_db_user"
DB_PASS="insert_db_password"
DB_NAME="insert_db_name"

# Dump the WordPress database from inside the DB container
# (--no-tablespaces avoids needing the PROCESS privilege on recent MySQL versions)
docker exec insert_db_container_name mysqldump --no-tablespaces -u "${DB_USER}" -p"${DB_PASS}" "${DB_NAME}" > "${LOCAL_DUMP}"

# Transfer the dump to the cloud server over SSH
scp -i "${SSH_KEY}" "${LOCAL_DUMP}" "${CLOUD_USER}@${CLOUD_IP}:/home/insert_username/"
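
Optionally, the dump can be sanity-checked before the transfer: by default mysqldump ends the file with a "-- Dump completed" footer, so its absence usually means a truncated dump. A small guard to add just before the scp line:

# Abort if the dump looks truncated (mysqldump writes a "Dump completed" footer by default)
tail -n 1 "${LOCAL_DUMP}" | grep -q "Dump completed" || { echo "Dump looks incomplete, skipping transfer" >&2; exit 1; }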

c) Automation with Cronjob

Crontab example:

0 3 * * * /root/scripts/replica_webserver.sh
0 4 * * * /root/scripts/replica_wordpress.sh
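
A slightly more defensive variant logs each run and uses flock(1) so a long-running backup can never overlap with the next one:

0 3 * * * flock -n /tmp/replica_webserver.lock /root/scripts/replica_webserver.sh >> /var/log/replica_webserver.log 2>&1
0 4 * * * flock -n /tmp/replica_wordpress.lock /root/scripts/replica_wordpress.sh >> /var/log/replica_wordpress.log 2>&1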

Cloud:

a) Restoring Web Server Volumes

On the cloud, I unpack the received backups and restore them to Docker volumes:

#!/bin/bash
set -euo pipefail

# Pick the most recent webserver backup received from the local server
BACKUP_FILE=$(ls -t /home/insert_username/webserver_backup_*.tar.gz | head -n 1)
RESTORE_DIR="/root/restore_temp"

mkdir -p "${RESTORE_DIR}"
tar xzf "${BACKUP_FILE}" -C "${RESTORE_DIR}"

# Wipe each volume, then unpack the matching archive into it via a temporary Alpine container
docker run --rm -v insert_volume_id1:/data -v "${RESTORE_DIR}":/backup alpine sh -c "rm -rf /data/* && tar xzf /backup/nginx_backup.tar.gz -C /data"
docker run --rm -v insert_volume_id2:/data -v "${RESTORE_DIR}":/backup alpine sh -c "rm -rf /data/* && tar xzf /backup/html_backup.tar.gz -C /data"

rm -rf "${RESTORE_DIR}"
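
Once the volumes are repopulated, the cloud web server should be restarted so it serves the restored files; a one-liner worth appending to the script (the container name is a placeholder):

# Restart the cloud web server so it picks up the restored volume contents
docker restart insert_webserver_container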

b) WordPress Database Restore

#!/bin/bash
set -euo pipefail

# Pick the most recent database dump received from the local server
DUMP_PATH=$(ls -t /home/insert_username/wordpress_db_backup_*.sql | head -n 1)
CONTAINER_DB="insert_db_container_name"
DB_USER="insert_db_user"
DB_PASS="insert_db_password"
DB_NAME="insert_db_name"

# Copy the dump into the DB container
docker cp "$DUMP_PATH" "$CONTAINER_DB:/wordpress_db_backup.sql"

# Drop the existing WordPress tables so the import starts from a clean slate
docker exec -i "$CONTAINER_DB" mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" -e "
SET FOREIGN_KEY_CHECKS=0;
DROP TABLE IF EXISTS wp_commentmeta, wp_comments, wp_links, wp_options, wp_postmeta, wp_posts, wp_term_relationships, wp_terms, wp_term_taxonomy, wp_usermeta, wp_users;
SET FOREIGN_KEY_CHECKS=1;"

# Import the fresh dump
docker exec -i "$CONTAINER_DB" sh -c "mysql -u $DB_USER -p\"$DB_PASS\" $DB_NAME < /wordpress_db_backup.sql"
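
To confirm the import succeeded, a quick check that the wp_* tables are back, reusing the variables from the script above:

docker exec -i "$CONTAINER_DB" mysql -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" -e "SHOW TABLES;"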

c) Automation with Cronjob on the Cloud

Crontab example (the restores run after the nightly backups have had time to arrive from the local server):

0 5 * * * /root/scripts/restore_webserver.sh
0 6 * * * /root/scripts/restore_wordpress.sh

Cloudflare:

On both machines (the Proxmox container and the cloud VM), I install and enable the Cloudflare Tunnel connector:

cloudflared service install insert_tunnel_token

Since both machines run a connector for the same tunnel, Cloudflare automatically routes traffic based on the availability of the on-premises container or the cloud VM.
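
To check that each connector is up and connected, the standard systemd tooling is enough, since cloudflared installs itself as a systemd service:

# Connector status and live logs on either machine
systemctl status cloudflared
journalctl -u cloudflared -f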
