Commit f318544

Author: John Rogers

I'm your biggest fan
I'll follow you until you love me
Dupli-Duplicati
Baby, there's no other superstar
You know that I'll be
Your Dupli-Duplicati
Promise I'll be kind
But I won't stop until that boy is mine
Baby, you'll be famous
Chase you down until you love me
Papa-paparazzi
1 parent 1ba5ef3 commit f318544

File tree

8 files changed: +173 additions, -294 deletions


.env.example

Lines changed: 23 additions & 47 deletions
@@ -26,54 +26,30 @@ PGADMIN_EMAIL=your_pgadmin_email@example.com
 # Password for the default pgAdmin user login
 PGADMIN_PASSWORD=your_pgadmin_password
 
-# --- Backup Agent Configuration (Rclone & Retention) ---
-# --- Backup Agent: Active Rclone Destination ---
-# Name of the rclone remote (configured below or manually in rclone.conf) to use for uploads.
-# This remote MUST be defined either via the RCLONE_REMOTE_<N> variables below OR manually in rclone_config/rclone.conf
-# Leave blank to disable cloud upload for this backup run.
-RCLONE_REMOTE_NAME=
+# --- Backup Agent & Duplicati Configuration ---
 
-# Path within the *active* rclone remote where backups should be stored (e.g., "resolve_backups/production")
-# Leave blank if RCLONE_REMOTE_NAME is blank.
-RCLONE_REMOTE_PATH=
+# Path on the host machine where local .sql.gz backups are stored by backup.sh
+# This directory is mounted into the backup-agent container as /backups (writeable)
+# and into the duplicati container as /backups (read-only).
+LOCAL_BACKUP_PATH=./backups
 
-# Number of days to keep local backups in the ./backups volume (default is 7 if not set)
+# Number of days to keep local .sql.gz backups in the LOCAL_BACKUP_PATH volume.
+# This cleanup is performed by backup.sh (run by the scheduler).
+# Duplicati manages its own remote retention based on its job settings.
 BACKUP_RETENTION_DAYS=7
 
-# --- Optional: Dynamic Rclone Remote Configuration ---
-# The entrypoint script can dynamically configure rclone remotes based on these variables.
-# Define remotes sequentially starting with N=1 (RCLONE_REMOTE_1_...).
-# The entrypoint will stop looking when it doesn't find RCLONE_REMOTE_<N>_NAME.
-
-# --- Example 1: Google Drive using Service Account ---
-# RCLONE_REMOTE_1_NAME=my_google_drive_backup
-# RCLONE_REMOTE_1_TYPE=drive
-# # Paste the *entire content* of your Google Cloud Service Account JSON key file below.
-# # Ensure it's enclosed in single quotes if it contains special characters, or handle quoting appropriately.
-# RCLONE_REMOTE_1_PARAM_SERVICE_ACCOUNT_CREDENTIALS='{ "type": "service_account", ... }'
-# # Optional: Specify a Team Drive ID if needed
-# # RCLONE_REMOTE_1_PARAM_TEAM_DRIVE=YOUR_TEAM_DRIVE_ID
-
-# --- Example 2: AWS S3 ---
-# RCLONE_REMOTE_2_NAME=my_s3_backup
-# RCLONE_REMOTE_2_TYPE=s3
-# RCLONE_REMOTE_2_PARAM_PROVIDER=AWS # Or other S3-compatible provider
-# RCLONE_REMOTE_2_PARAM_ACCESS_KEY_ID=YOUR_AWS_ACCESS_KEY_ID
-# RCLONE_REMOTE_2_PARAM_SECRET_ACCESS_KEY=YOUR_AWS_SECRET_ACCESS_KEY
-# RCLONE_REMOTE_2_PARAM_REGION=us-east-1 # Or your desired AWS region
-# # Optional: Specify storage class, ACL, etc.
-# # RCLONE_REMOTE_2_PARAM_STORAGE_CLASS=STANDARD_IA
-# # RCLONE_REMOTE_2_PARAM_ACL=private
-
-# --- Example 3: Backblaze B2 ---
-# RCLONE_REMOTE_3_NAME=my_b2_backup
-# RCLONE_REMOTE_3_TYPE=b2
-# RCLONE_REMOTE_3_PARAM_ACCOUNT=YOUR_B2_ACCOUNT_ID_OR_APPLICATION_KEY_ID
-# RCLONE_REMOTE_3_PARAM_KEY=YOUR_B2_APPLICATION_KEY
-# # Optional: Specify endpoint if needed (e.g., for specific regions)
-# # RCLONE_REMOTE_3_PARAM_ENDPOINT=s3.us-west-000.backblazeb2.com
-
-# --- Example 4: Local Filesystem (useful for testing) ---
-# RCLONE_REMOTE_4_NAME=local_test_backup
-# RCLONE_REMOTE_4_TYPE=local
-# # No parameters usually needed for 'local' type unless specifying nounc, case_sensitive etc.
+# --- Duplicati Service Configuration ---
+# Timezone for the Duplicati container (e.g., Europe/London, America/New_York, Etc/UTC)
+# See https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
+TZ=Etc/UTC
+
+# Optional: Set a password for the Duplicati Web UI. Uncomment and set the value.
+# DUPLICATI_WEBSERVICE_PASSWORD=your_secure_password
+
+# --- Duplicati Job Setup (Done via Web UI at http://localhost:8200) ---
+# - General: Give your backup job a name. Encryption is recommended.
+# - Destination: Choose your storage provider (e.g., Google Drive, S3, B2) and configure credentials/bucket details.
+# - Source Data: Select the '/backups' folder inside the container.
+# - Schedule: Set how often Duplicati should check for new files and upload them (e.g., "Run daily at 3:00 AM").
+# Note: backup.sh creates new .sql.gz files hourly (via Ofelia). Duplicati will upload any new files found since its last run.
+# - Options: Configure remote retention (e.g., "Keep backups for 30 Days"). This controls how long backups are kept *at the destination*.
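backup.sh itself is not part of this diff, but the local-retention behaviour the new comments describe (delete .sql.gz dumps in LOCAL_BACKUP_PATH older than BACKUP_RETENTION_DAYS) could be sketched roughly as follows. The `find` invocation and variable defaults here are assumptions, not the commit's actual script:

```shell
#!/bin/sh
# Hypothetical sketch of the local-retention step described in .env.example.
# backup.sh is not included in this diff; the commands below are an assumption.
LOCAL_BACKUP_PATH="${LOCAL_BACKUP_PATH:-./backups}"
BACKUP_RETENTION_DAYS="${BACKUP_RETENTION_DAYS:-7}"

# Remove local .sql.gz dumps older than the retention window.
# Remote copies are untouched: Duplicati applies its own retention
# ("Keep backups for 30 Days", etc.) at the destination.
if [ -d "$LOCAL_BACKUP_PATH" ]; then
  find "$LOCAL_BACKUP_PATH" -name '*.sql.gz' -type f \
    -mtime "+$BACKUP_RETENTION_DAYS" -delete
fi
```

Note that `-mtime +N` only matches files strictly older than N full days, so a dump created exactly N days ago survives one more run.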

.gitignore

Lines changed: 2 additions & 1 deletion
@@ -36,4 +36,5 @@ Thumbs.db
 /backups
 
 #plan
-PLAN_rclone_config.md
+PLAN_rclone_config.md
+/rclone_config
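The mount and port wiring the `.env.example` comments assume (LOCAL_BACKUP_PATH read-only at `/backups` inside the duplicati container, Web UI on 8200) might look like the following compose fragment. This is a sketch only: the commit's actual docker-compose.yml is not shown here, and the image name and volume name are assumptions.

```yaml
# Hypothetical compose fragment; not taken from this commit.
services:
  duplicati:
    image: duplicati/duplicati:latest          # assumed image
    environment:
      - TZ=${TZ}
      # Pass through the optional Web UI password from .env if set:
      # - DUPLICATI_WEBSERVICE_PASSWORD=${DUPLICATI_WEBSERVICE_PASSWORD}
    ports:
      - "8200:8200"                            # Web UI at http://localhost:8200
    volumes:
      - ${LOCAL_BACKUP_PATH}:/backups:ro       # read-only, as the comments describe
      - duplicati_config:/data                 # assumed config volume

volumes:
  duplicati_config:
```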
