A container-first Google Cloud Storage (GCS) uploader designed to run entirely inside a Docker environment.
This project provides:
- A Python CLI tool to recursively upload files to GCS
- Full support for endpoint override (local emulator or real GCP)
- Docker + Docker Compose setup
- Makefile automation
- Zero dependency on host Python or local SDKs
The storage backend can be:
- A local Fake GCS Server (for development)
- Real Google Cloud Storage (production)
```
akoflow-driver-gcs-uploader/
├── docker/
│   └── local/
│       ├── .env
│       ├── docker-compose.yml
│       └── Dockerfile
├── upload_and_move.py
├── Dockerfile
├── docker-compose.yaml
├── Makefile
├── requirements.txt
└── README.md
```
Prerequisites:
- Docker
- Docker Compose

No local Python installation is required.
The upload_and_move.py script:
- Recursively scans a directory (relative to CWD)
- Excludes configurable directory names
- Uploads files to a GCS bucket
- Supports bucket creation
- Supports dry-run mode
- Supports custom storage endpoint override
All execution happens inside the app container.
Configuration is provided through an .env file loaded by Docker Compose.
Example:
```
GOOGLE_CLOUD_PROJECT=local-project
```
For real GCP usage:
```
GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
GOOGLE_CLOUD_PROJECT=your-project-id
```
The system supports endpoint override:
```
STORAGE_ENDPOINT=http://gcs-emulator:4443
```
If STORAGE_ENDPOINT is empty or unset, the script connects to real Google Cloud Storage.
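A common way to honor such an override with the google-cloud-storage library is to pass the endpoint through client_options when it is set and fall back to the default client otherwise. A minimal sketch, assuming the helper name is made up for illustration (only the client_options/api_endpoint usage mirrors the real library API):

```python
import os

def storage_client_kwargs(endpoint=None):
    """Build kwargs for google.cloud.storage.Client from an optional
    endpoint override (e.g. the STORAGE_ENDPOINT env variable)."""
    endpoint = endpoint or os.environ.get("STORAGE_ENDPOINT", "")
    if endpoint:
        # Point the client at a local emulator such as fake-gcs-server.
        return {"client_options": {"api_endpoint": endpoint}}
    # Empty/unset endpoint: connect to real Google Cloud Storage.
    return {}
```

The uploader would then construct its client as `storage.Client(**storage_client_kwargs(...))`, targeting either the emulator or real GCS.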
Build, start, and stop the stack:
```
make build
make up
make down
```
The uploader accepts the following flags:
```
--bucket             Destination bucket (required)
--prefix             Object prefix
--source-dir         Directory to scan (default: .)
--project            Project override
--credentials        Service account JSON path
--storage-endpoint   Storage endpoint override
--exclude-dir        Directory exclusion (repeatable)
--create-bucket      Create bucket if missing
--dry-run            Preview upload only
```
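The flags above map naturally onto argparse. A sketch of how such a parser might look, reconstructed from the flag list rather than from the actual script:

```python
import argparse

def build_parser():
    """Hypothetical CLI parser matching the documented flags."""
    p = argparse.ArgumentParser(description="Recursively upload files to GCS")
    p.add_argument("--bucket", required=True, help="Destination bucket")
    p.add_argument("--prefix", default="", help="Object prefix")
    p.add_argument("--source-dir", default=".", help="Directory to scan")
    p.add_argument("--project", help="Project override")
    p.add_argument("--credentials", help="Service account JSON path")
    p.add_argument("--storage-endpoint", help="Storage endpoint override")
    # action="append" makes the flag repeatable; each use adds to the list.
    p.add_argument("--exclude-dir", action="append", default=[],
                   help="Directory exclusion (repeatable)")
    p.add_argument("--create-bucket", action="store_true",
                   help="Create bucket if missing")
    p.add_argument("--dry-run", action="store_true", help="Preview upload only")
    return p
```

Note that `--exclude-dir` uses `action="append"`, which is what makes the flag repeatable on the command line.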
```
make run GCS_BUCKET=my-bucket            # Run uploader
make run-dry GCS_BUCKET=my-bucket        # Dry run
make create-bucket GCS_BUCKET=my-bucket  # Create bucket
make list-buckets                        # List buckets
make list-objects GCS_BUCKET=my-bucket   # List objects
make clean-bucket GCS_BUCKET=my-bucket   # Clean bucket
```
To use real Google Cloud Storage:
- Remove or unset STORAGE_ENDPOINT
- Provide valid credentials via GOOGLE_APPLICATION_CREDENTIALS
- Set the correct GOOGLE_CLOUD_PROJECT
Example:
```
make run GCS_BUCKET=my-bucket STORAGE_ENDPOINT=
```
MIT