A lightweight, containerized CLI utility for recursively uploading files to Amazon S3 (or LocalStack S3) with flexible configuration via CLI arguments and environment variables.
This project is designed for reproducible execution inside Docker, making it suitable for local development, CI pipelines, and technical demonstrations.
akoflow-driver-s3-uploader recursively scans a specified directory and uploads all eligible files to a destination S3 bucket.
- Recursive file discovery
- Configurable source directory
- Custom S3 prefix support
- Environment variable fallback for credentials
- LocalStack compatibility
- Dockerized execution
- Dry-run mode for safe previews
- Deterministic, container-based runtime
```
akoflow-driver-s3-uploader/
│
├── docker/
│   └── local/
│       ├── .env
│       ├── docker-compose.yml
│       └── Dockerfile
│
├── venv/
│
├── .env
├── docker-compose.yml
├── Dockerfile
├── Makefile
├── README.md
├── requirements.txt
└── upload_and_move.py
```
- The script resolves the source directory relative to the current working directory.
- It recursively scans all files.
- Excluded directories are ignored.
- S3 object keys are constructed using an optional prefix.
- Files are uploaded using boto3.
- A summary report is printed.
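The key-construction step above can be sketched as follows. This is an illustrative example, and `build_s3_key` is a hypothetical name, not the actual internal API of `upload_and_move.py`:

```python
import os


def build_s3_key(prefix: str, source_dir: str, file_path: str) -> str:
    """Join the optional prefix with the file's path relative to source_dir."""
    rel = os.path.relpath(file_path, source_dir)
    rel = rel.replace(os.sep, "/")  # S3 object keys always use forward slashes
    return prefix + rel if prefix else rel
```

With `prefix="uploads/"`, a file at `/data/a/b.txt` under `source_dir=/data` maps to the object key `uploads/a/b.txt`.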
- `--bucket`: Destination S3 bucket name.
- `--prefix`: S3 key prefix (folder path inside the bucket). Example: `uploads/`
- `--source-dir`: Directory to scan recursively, resolved relative to the current working directory. Default: `.`
- `--endpoint-url`: Custom S3 endpoint (e.g., LocalStack). Example: `http://localstack:4566`
- `--region`: AWS region.
- `--access-key`: AWS access key ID.
- `--secret-key`: AWS secret access key.
- `--session-token`: AWS session token (temporary credentials).
- `--exclude-dir`: Directory name to exclude (repeatable). Example: `--exclude-dir .git --exclude-dir node_modules`
- `--dry-run`: Preview mode; prints what would be uploaded without performing uploads.
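The options above can be declared with `argparse` roughly as follows. This is a sketch of the documented interface, not the exact parser used in `upload_and_move.py`:

```python
import argparse

# Illustrative parser mirroring the documented CLI options.
parser = argparse.ArgumentParser(description="Recursively upload files to S3.")
parser.add_argument("--bucket", required=True, help="Destination S3 bucket name")
parser.add_argument("--prefix", default="", help="S3 key prefix, e.g. uploads/")
parser.add_argument("--source-dir", default=".", help="Directory to scan recursively")
parser.add_argument("--endpoint-url", help="Custom S3 endpoint, e.g. LocalStack")
parser.add_argument("--region", help="AWS region")
parser.add_argument("--access-key", help="AWS access key ID")
parser.add_argument("--secret-key", help="AWS secret access key")
parser.add_argument("--session-token", help="AWS session token")
parser.add_argument("--exclude-dir", action="append", default=[],
                    help="Directory name to exclude (repeatable)")
parser.add_argument("--dry-run", action="store_true",
                    help="Preview uploads without transferring files")
```

`action="append"` is what makes `--exclude-dir` repeatable, and `action="store_true"` makes `--dry-run` a boolean flag.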
If CLI credentials are not provided, the script reads the following environment variables:

- `AWS_ENDPOINT_URL`
- `AWS_DEFAULT_REGION`
- `AWS_ACCESS_KEY_ID`
- `AWS_SECRET_ACCESS_KEY`
- `AWS_SESSION_TOKEN`

If none are set, boto3 falls back to its default credential provider chain:

- Environment variables
- `~/.aws/credentials`
- IAM role (if running inside AWS)
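The CLI-then-environment fallback can be sketched with a small helper (the name `resolve_setting` is illustrative):

```python
import os


def resolve_setting(cli_value, env_var):
    """Prefer the CLI value; otherwise fall back to the environment variable.

    Returning None for credentials lets boto3 continue down its own default
    credential provider chain (shared credentials file, IAM role, ...).
    """
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_var)
```

For example, `resolve_setting(args.region, "AWS_DEFAULT_REGION")` uses `--region` when given, else the environment.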
The following directories are excluded by default:
- `.git`
- `.hg`
- `.svn`
- `__pycache__`
- `.pytest_cache`
- `.mypy_cache`
- `.ruff_cache`
- `.venv`
- `venv`
- `node_modules`
- `.DS_Store`
Additional exclusions can be added using --exclude-dir.
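Directory exclusion during a recursive scan is typically done by pruning `os.walk` in place. The sketch below is illustrative; `DEFAULT_EXCLUDES` mirrors the list above, and `iter_files` is a hypothetical name rather than the script's actual API:

```python
import os

# Mirrors the default exclusion list documented above.
DEFAULT_EXCLUDES = {
    ".git", ".hg", ".svn", "__pycache__", ".pytest_cache", ".mypy_cache",
    ".ruff_cache", ".venv", "venv", "node_modules", ".DS_Store",
}


def iter_files(source_dir, extra_excludes=()):
    """Yield all file paths under source_dir, skipping excluded directories."""
    excluded = DEFAULT_EXCLUDES | set(extra_excludes)
    for root, dirs, files in os.walk(source_dir):
        # Mutating dirs in place stops os.walk from descending into them.
        dirs[:] = [d for d in dirs if d not in excluded]
        for name in files:
            yield os.path.join(root, name)
```

Pruning `dirs` in place is the standard way to skip whole subtrees with `os.walk`, so excluded directories are never even entered.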
Build and initialize:

```
make build
make up
make init S3_BUCKET=my-bucket
```

Run the uploader:

```
make run S3_BUCKET=my-bucket S3_PREFIX=uploads/ SOURCE_DIR=./data
```

Dry run:

```
make run-dry S3_BUCKET=my-bucket SOURCE_DIR=./data
```

Run the script directly inside the container:

```
docker exec -it aws-uploader python upload_and_move.py \
  --bucket my-bucket \
  --prefix uploads/ \
  --source-dir ./data
```

Default endpoint:

```
http://localstack:4566
```

Create bucket:

```
make create-bucket S3_BUCKET=my-bucket
```

List objects:

```
make list-objects S3_BUCKET=my-bucket
```

Clean bucket:

```
make clean-bucket S3_BUCKET=my-bucket
```

When `--dry-run` is enabled:
- Files are scanned
- S3 keys are generated
- Upload operations are printed
- No files are uploaded
- No network transfer occurs
This is useful for:
- Verifying key structure
- Validating exclusions
- Preventing accidental uploads
- Testing CI/CD pipelines safely
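The dry-run behaviour amounts to a single branch around the upload call. In this illustrative sketch, `run_uploads` and `upload_fn` are hypothetical names, not the script's actual internals:

```python
def run_uploads(files, dry_run, upload_fn, bucket):
    """Print planned uploads; call upload_fn only when dry_run is False."""
    uploaded = 0
    for path, key in files:
        if dry_run:
            print(f"[dry-run] would upload {path} -> s3://{bucket}/{key}")
        else:
            upload_fn(path, bucket, key)
            uploaded += 1
    return uploaded
```

In dry-run mode the keys are still computed and printed, so the resulting bucket layout can be reviewed before any transfer happens.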
The script:
- Uses boto3 retry configuration (standard mode)
- Reports individual upload failures
- Returns exit code `0` on full success
- Returns exit code `1` if any upload fails
- Returns exit code `2` if the source directory is invalid
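The exit-code policy can be sketched as a small helper (the name `exit_code` is illustrative). The retry behaviour itself comes from passing `botocore.config.Config(retries={"mode": "standard"})` when the boto3 client is created:

```python
def exit_code(source_dir_valid, failed_uploads):
    """Map a run's outcome to the exit codes documented above."""
    if not source_dir_valid:
        return 2  # source directory missing or not a directory
    if failed_uploads:
        return 1  # at least one upload failed
    return 0  # full success
```

Checking the source directory first means a bad `--source-dir` reports `2` before any upload is attempted.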
| Target | Description |
|---|---|
| build | Build Docker images |
| up | Start environment |
| down | Stop environment |
| restart | Restart environment |
| logs | Follow logs |
| shell | Open app container shell |
| localstack | Open LocalStack shell |
| create-bucket | Create S3 bucket |
| list-buckets | List buckets |
| list-objects | List bucket objects |
| run | Execute uploader |
| run-dry | Execute in dry-run mode |
| clean-bucket | Remove all bucket objects |
| clean | Stop and remove volumes |
- Docker
- Docker Compose
- Python 3.x (inside container runtime)
Python dependencies:
```
boto3
awscli
awscli-local
```
- Local file synchronization to S3
- CI artifact uploads
- LocalStack integration testing
- Educational and technical article demonstrations
- Infrastructure automation experiments
MIT License.