
Commit c6aa88a

Prepare v2.5.0 release

- Bump version to 2.5.0 in setup.py and Makefile
- README: fix S3 section - credentials are nested under credentials object, not top-level
- README: fix GS section - remove boto2/.boto references, reflect actual gcs_creds_path config
- README: replace real-looking AWS key placeholders with YOUR_ACCESS_KEY_ID style
- README: update source/destination overview lists to include all new drivers

1 parent b11716b commit c6aa88a

File tree

- Makefile
- README.md
- setup.py

3 files changed: +41 −41 lines changed


Makefile

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-VERSION=2.4.1
+VERSION=2.5.0
 
 test:
 	docker build . -t rossigee/backups:test && \

README.md

Lines changed: 39 additions & 39 deletions

@@ -362,8 +362,8 @@ This method allows us to specify a set of RDS/EC2 volumes. When sourced, this wi
     "type": "snapshot",
     "instancename": "maindb1",
     "credentials": {
-        "aws_access_key_id": "AKIAJPG7RJVVKWT3UX3A",
-        "aws_secret_access_key": "Owfs3nErv1yQl5cyYfSYeCmfgBWLle9H+oE86KZi"
+        "aws_access_key_id": "YOUR_ACCESS_KEY_ID",
+        "aws_secret_access_key": "YOUR_SECRET_ACCESS_KEY"
     }
 }
 ```
@@ -465,68 +465,68 @@ You can specify an S3 bucket to back up to.
 
 ```json
 {
-    "bucket": "backups-123456789",
+    "id": "s3-backup",
+    "type": "s3",
+    "bucket": "my-backup-bucket",
+    "region": "eu-west-1",
+    "credentials": {
+        "aws_access_key_id": "YOUR_ACCESS_KEY_ID",
+        "aws_secret_access_key": "YOUR_SECRET_ACCESS_KEY"
+    },
+    "retention_copies": 7,
+    "retention_days": 30
 }
 ```
 
-The 'aws' CLI client gets it's authentication credentials and other configuration from the 'backups' user's '~/.aws/config' file. This needs to be configured as per instructions in the AWS CLI documentation.
+If `credentials` are omitted, the AWS CLI will fall back to the standard credential chain (`~/.aws/credentials`, IAM instance role, environment variables, etc.).
+
+For S3-compatible services (e.g. MinIO, Wasabi), supply `endpoint_url`:
 
-Additionally, the S3 destination provides some simple backup rotation options. After a successful backup, the backup files are listed and the 'retention_copies' and 'retention_days' options, if present, are applied to identify and remove any backups that are no longer required.
+```json
+{
+    "endpoint_url": "https://s3.wasabisys.com"
+}
+```
 
 Parameters available in 's3':
 
 | Config key | Purpose |
 |------------|---------|
-| bucket | S3 bucket to dump files to. |
-| region | AWS availability zone. |
-| aws_access_key_id | AWS access key. |
-| aws_secret_access_key | AWS secret access key. |
+| bucket | S3 bucket name. |
+| region | AWS region. |
+| credentials.aws_access_key_id | AWS access key ID (optional). |
+| credentials.aws_secret_access_key | AWS secret access key (optional). |
 | retention_copies | How many copies of older backups to keep. |
-| retention_days | How many days of backups to keep. |
-| endpoint_url | (optional) Endpoint URL for S3 service |
+| retention_days | How many days of backups to keep. |
+| endpoint_url | S3-compatible endpoint URL (optional). |
 
 
 Destination - GS
 ----------------
 
-You can specify a GS bucket to back up to.
+You can specify a Google Cloud Storage bucket to back up to. Uploads use `gsutil` and retention management uses the `google-cloud-storage` Python SDK.
 
 ```json
 {
-    "bucket": "backups-123456789",
+    "id": "gcs-backup",
+    "type": "gs",
+    "bucket": "my-backup-bucket",
+    "gcs_creds_path": "/etc/backups/gcs-service-account.json",
+    "retention_copies": 7,
+    "retention_days": 30
 }
 ```
 
-The 'gs' destination module uses the boto library in conjunction with 'gsutil.' The 'gsutil' CLI client gets it's authentication credentials and other configuration from the 'backups' user's '~/.boto' file.
-
-The GS module requires a GCP service account to be created with appropriate permissions to write and delete from GS buckets. The key file needs to be in P12 format. IMPORTANT: Properly secure this file and related information.
-
-More information on configuring gsutil and boto as well as preparing a service account can be found at https://cloud.google.com/storage/docs/boto-plugin.
-
-The boto file should contain entries similar to:
-```
-[Credentials]
-gs_service_client_id = some-service-account@your-project.iam.gserviceaccount.com
-gs_service_key_file = /some/path/to/your/service-account-credential-file.p12
-gs_service_key_file_password = asecretpassword
-
-[GSUtil]
-default_api_version = 2
-```
-AWS and GCP credential data can happily share the same section.
-
-Additionally, the GS destination provides some simple backup rotation options. After a successful backup, the backup files are listed and the 'retention_copies' and 'retention_days' options, if present, are applied to identify and remove any backups that are no longer required.
+Create a GCP service account with Storage Object Admin on the target bucket, download the JSON key file, and place it on the backup host. Ensure `gsutil` is authenticated with the same service account (e.g. via `GOOGLE_APPLICATION_CREDENTIALS` or `gcloud auth activate-service-account`).
 
 Parameters available in 'gs':
 
 | Config key | Purpose |
 |------------|---------|
-| bucket | GS bucket to dump files to. |
-| gs_service_client_id | GCP service account. |
-| gs_service_key_file | Location of service account file (in P12 format). |
-| gs_service_key_file_password | Password for service account file. |
+| bucket | GCS bucket name. |
+| gcs_creds_path | Path to the GCP service account JSON key file. |
 | retention_copies | How many copies of older backups to keep. |
-| retention_days | How many days of backups to keep. |
+| retention_days | How many days of backups to keep. |
 
 
 Destination - Samba
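The S3 hunk above moves the AWS keys from top-level fields into a nested `credentials` object. As a minimal stdlib sketch of that corrected shape (the `validate_s3_dest` helper is hypothetical, not part of this repo), a config can be checked like so:

```python
import json

# Hypothetical helper (not from the repo): verifies that an "s3"
# destination dict uses the nested "credentials" layout this commit
# documents, rather than top-level aws_* keys.
def validate_s3_dest(dest):
    if dest.get("type") != "s3":
        raise ValueError("not an s3 destination")
    if "bucket" not in dest:
        raise ValueError("missing 'bucket'")
    if "aws_access_key_id" in dest or "aws_secret_access_key" in dest:
        raise ValueError("AWS keys must be nested under 'credentials'")
    creds = dest.get("credentials")  # optional: AWS credential chain applies
    if creds is not None:
        missing = {"aws_access_key_id", "aws_secret_access_key"} - creds.keys()
        if missing:
            raise ValueError("incomplete credentials: %s" % sorted(missing))
    return True

cfg = json.loads("""
{
  "id": "s3-backup",
  "type": "s3",
  "bucket": "my-backup-bucket",
  "region": "eu-west-1",
  "credentials": {
    "aws_access_key_id": "YOUR_ACCESS_KEY_ID",
    "aws_secret_access_key": "YOUR_SECRET_ACCESS_KEY"
  }
}
""")
print(validate_s3_dest(cfg))
```

The old top-level layout (`"aws_access_key_id"` directly on the destination) would be rejected by this check.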
@@ -981,8 +981,8 @@ This simple example backs up some folders and a database, and deposits them to a
             "bucket": "mybucketnamehere",
             "region": "eu-west-1",
             "credentials": {
-                "aws_access_key_id": "AKIAJPG7RJVVKWT3UX3A",
-                "aws_secret_access_key": "Owfs3nErv1yQl5cyYfSYeCmfgBWLle9H+oE86KZi"
+                "aws_access_key_id": "YOUR_ACCESS_KEY_ID",
+                "aws_secret_access_key": "YOUR_SECRET_ACCESS_KEY"
             }
         }
     ],
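The rewritten GS section points both `gsutil` and the SDK at the JSON key file named by `gcs_creds_path`. One way to wire that up is via the standard `GOOGLE_APPLICATION_CREDENTIALS` environment variable; the `export_gcs_creds` helper below is a hypothetical sketch of that glue, not code from this repo:

```python
import json
import os
import tempfile

# Hypothetical glue (an assumption, not part of the repo): expose the key
# file named by gcs_creds_path to gsutil and the google-cloud-storage SDK
# through the standard GOOGLE_APPLICATION_CREDENTIALS variable.
def export_gcs_creds(dest):
    path = dest["gcs_creds_path"]
    if not os.path.exists(path):
        raise FileNotFoundError(path)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path

# Demo with a throwaway stand-in for the service account JSON key file.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"type": "service_account"}, f)
    key_path = f.name

dest = {
    "id": "gcs-backup",
    "type": "gs",
    "bucket": "my-backup-bucket",
    "gcs_creds_path": key_path,
}
print(export_gcs_creds(dest) == key_path)
```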

setup.py

Lines changed: 1 addition & 1 deletion

@@ -1,7 +1,7 @@
 from setuptools import setup
 
 setup(name = 'backups',
-    version = '2.4.1',
+    version = '2.5.0',
     description = 'Data Backup Scripts',
     author = 'Ross Golder',
     author_email = 'ross@golder.org',
