This is a sample coding challenge project for BL. This document describes the challenge, the process, and the solution used to complete the challenge presented by the BL team.
This solution will cover the following scenarios:
- Terraform to create an AWS bucket
- Upload content to this bucket
- Create a tool to enumerate the data in the bucket
- Use Docker to run the tool
- Health Monitoring - Future
NOTE: This section still needs work - the Terraform data should be moved into variables.
vars.tf
- Line 4: Bucket region
- Line 10: Name of your TF state bucket
- Line 16: Name of your files bucket

main.tf
- Line 2: AWS region
- Line 5: TF state bucket
- Line 19: Files bucket
- Make sure that your AWS environment is set up first:

```sh
$ aws configure
```

- Make your necessary changes (above) to the Terraform variables.
- Execute deploy.sh:

```sh
$ ./deploy.sh
```
The steps to execute these scripts are as follows:
NOTE: This is reproducible on Ubuntu 20.04
- Inside of scripts folder execute:
./run-me-first.sh
- Setup your AWS credentials
aws configure
- Run your Terraform
../terraform/scripts/deploy.sh
- Build your test files
python3 upload.py
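The repository's upload.py is not shown here, but a minimal sketch of that step might look like the following, assuming boto3 is installed and using a placeholder bucket name (`my-files-bucket` is an assumption; the real name comes from your Terraform variables):

```python
import os


def make_test_files(directory, count=3):
    """Create small numbered test files and return their paths."""
    paths = []
    for i in range(count):
        path = os.path.join(directory, f"testfile-{i}.txt")
        with open(path, "w") as fh:
            fh.write(f"test file {i}\n")
        paths.append(path)
    return paths


def main():
    # Deferred import so the helper above is usable without AWS access.
    import boto3

    bucket = "my-files-bucket"  # hypothetical; take from vars.tf in practice
    s3 = boto3.client("s3")
    for path in make_test_files("."):
        s3.upload_file(path, bucket, os.path.basename(path))


if __name__ == "__main__":
    main()
```

The file-generation logic is kept separate from the upload call so it can be tested without AWS credentials.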
- Move into the docker folder and build the Docker image
docker build --pull --rm -f Dockerfile -t jtest:demo .
- Run the Docker Image
docker-compose up
- Pull your files. Your volume 'should' be located in: docker_mydata. You can verify this with: docker volume ls. If this is indeed the name of your Docker volume, as declared in your compose file, then you can find its location on disk as follows:
docker volume inspect docker_mydata
Navigate to the "Mountpoint" shown in the results. In this example it is: /var/lib/docker/volumes/docker_mydata/_data. You will be presented with 3 files:
- filecount.txt - A total count of files in the S3 Bucket
- filelist.txt - A list of files inside of the S3 bucket
- list.py - The script that was executed
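The actual list.py isn't reproduced in this document, but a sketch of the enumeration tool could look like this, again assuming boto3 and a hypothetical bucket name:

```python
def summarize(keys):
    """Return the report bodies for filecount.txt and filelist.txt."""
    keys = sorted(keys)
    return str(len(keys)), "\n".join(keys)


def main():
    # Deferred import so summarize() stays testable without AWS access.
    import boto3

    bucket = "my-files-bucket"  # hypothetical; the real name comes from vars.tf
    s3 = boto3.client("s3")
    keys = []
    # list_objects_v2 returns at most 1000 keys per call, so paginate.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    count, listing = summarize(keys)
    with open("filecount.txt", "w") as fh:
        fh.write(count + "\n")
    with open("filelist.txt", "w") as fh:
        fh.write(listing + "\n")


if __name__ == "__main__":
    main()
```

Using the paginator rather than a single list_objects_v2 call keeps the count correct once the bucket exceeds 1000 objects.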
- Delete your test files and clear your bucket data. From inside your scripts directory, execute the command below. This will delete the test files on your local machine and purge your bucket data.
python3 purge.py
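A sketch of what purge.py might do - remove the locally generated test files, then empty the bucket - assuming boto3 and the same hypothetical bucket name and test-file pattern as above:

```python
import glob
import os


def local_test_files(pattern="testfile-*.txt"):
    """Return the locally generated test files matching the pattern."""
    return sorted(glob.glob(pattern))


def main():
    # Deferred import so local_test_files() is usable without AWS access.
    import boto3

    bucket = "my-files-bucket"  # hypothetical; take from vars.tf in practice
    for path in local_test_files():
        os.remove(path)
    # Delete every object so the bucket can later be destroyed by Terraform.
    s3 = boto3.resource("s3")
    s3.Bucket(bucket).objects.all().delete()


if __name__ == "__main__":
    main()
```

Emptying the bucket first matters: terraform destroy will fail on a non-empty S3 bucket unless force_destroy is set.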
- Remove your Terraform buckets. Navigate to the /terraform/scripts/ folder. From here you will want to destroy the S3 buckets that were created at the beginning of this exercise.
./destroy.sh
As you can see in steps 6 and 7, I was preparing for more steps moving forward:
- Moving this solution into a dockerized, always-on daemonized solution via docker-compose
- Using 'filecount.txt' to check the number of files (which should be >= our initial load) as the basis for a health monitor, via healthz or another Docker health-monitoring solution
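The future health check described above could be sketched as follows; the threshold, the file path, and the idea of wiring it into a compose healthcheck are all assumptions, not part of the current solution:

```python
def is_healthy(count_text, expected_minimum):
    """Healthy when the recorded file count is at least the initial load."""
    try:
        return int(count_text.strip()) >= expected_minimum
    except ValueError:
        return False  # a garbled or empty report counts as unhealthy


def main():
    import sys

    # Hypothetical mount point inside the container for the docker_mydata volume.
    with open("/data/filecount.txt") as fh:
        # Exit 0/1 so docker-compose could run this as a healthcheck command.
        sys.exit(0 if is_healthy(fh.read(), 3) else 1)


if __name__ == "__main__":
    main()
```

The exit-code convention matches what Docker's HEALTHCHECK expects: 0 for healthy, 1 for unhealthy.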