This repository was archived by the owner on Aug 9, 2023. It is now read-only.

Commit dbf14a9: Add docs Custom Deployment

1 parent 2fd1b1e

File tree: 4 files changed (+68, -5 lines)


.travis.yml

Lines changed: 2 additions & 2 deletions
@@ -31,12 +31,12 @@ deploy:
     script: bash _scripts/deploy.sh --public --verbose production
     skip_cleanup: true
     on:
-      repo: aws-samples/aws-genomics-workflows
+      repo: $TRAVIS_REPO_SLUG
       branch: release
       tags: true
   - provider: script
     script: bash _scripts/deploy.sh --public --verbose test
     skip_cleanup: true
     on:
-      repo: aws-samples/aws-genomics-workflows
+      repo: $TRAVIS_REPO_SLUG
       branch: master
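
The two deploy providers above differ only in the stage passed to the deploy script. Run from a local clone with AWS credentials configured, the underlying commands are roughly the following (a sketch; per the documentation added below, the `production` stage is intended for the CI environment only):

```sh
# release tags deploy the 'production' stage; the master branch deploys the 'test' stage
bash _scripts/deploy.sh --public --verbose production
bash _scripts/deploy.sh --public --verbose test
```
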
Lines changed: 63 additions & 2 deletions
@@ -1,7 +1,68 @@

# Building Custom Resources

-This section describes how to build and upload customized templates and artifacts.

This section describes how to build and upload templates and artifacts to use in a customized deployment. Once uploaded, the locations of the templates and artifacts are used when deploying the Nextflow on AWS Batch solution (see [Customized Deployment](custom-deploy.md)).

## Building a Custom Distribution

-## Deploying a custom Distribution

This step involves building a distribution of templates and artifacts from the solution's source code.

First, create a local clone of the [Genomics Workflows on AWS](https://github.com/aws-samples/aws-genomics-workflows) source code (see the clone sketch after this list). The code base contains several directories:

* `_scripts/`: Shell scripts for building and uploading the customized distribution of templates and artifacts
* `docs/`: Source code for the documentation, written in [Markdown](https://markdownguide.org) for the [MkDocs](https://mkdocs.org) publishing platform. This documentation may be modified, expanded, and contributed in the same way as the source code.
* `src/`: Source code for the components of the solution:
    * `containers/`: CodeBuild buildspec files for building AWS-specific container images and pushing them to ECR
        * `_common/`
            * `build.sh`: A generic build script that first builds a base image for a container, then builds an AWS-specific image
            * `entrypoint.aws.sh`: A generic entrypoint script that wraps a call to a binary tool in the container with handlers for data staging from/to S3
        * `nextflow/`
            * `Dockerfile`
            * `nextflow.aws.sh`: Docker entrypoint script to execute the Nextflow workflow on AWS Batch
    * `ebs-autoscale/`
        * `get-amazon-ebs-autoscale.sh`: Script to retrieve and install [Amazon EBS Autoscale](https://github.com/awslabs/amazon-ebs-autoscale)
    * `ecs-additions/`: Scripts to be installed on ECS host instances to support the distribution
        * `awscli-shim.sh`: Installed as `/opt/aws-cli/bin/aws` and mounted onto the container; allows container images without a full glibc to use the AWS CLI v2 through supplied shared libraries (especially libz) and `LD_LIBRARY_PATH`
        * `ecs-additions-common.sh`: Utility script to install `fetch_and_run.sh`, the Nextflow and Cromwell shims, and swap space
        * `ecs-additions-cromwell-linux2-worker.sh`:
        * `ecs-additions-cromwell.sh`:
        * `ecs-additions-nextflow.sh`:
        * `ecs-additions-step-functions.sh`:
        * `fetch_and_run.sh`: Uses the AWS CLI to download and run scripts and zip files from S3
        * `provision.sh`: Appended to the userdata in the launch template created by [gwfcore-launch-template](custom-deploy.md); starts the SSM Agent, ECS Agent, and Docker, then runs `get-amazon-ebs-autoscale.sh`, `ecs-additions-common.sh`, and the orchestrator-specific `ecs-additions-` scripts
    * `lambda/`: Lambda functions to create, modify, or delete ECR registries or CodeBuild jobs
    * `templates/`: CloudFormation templates for the solution stack, as described in [Customized Deployment](custom-deploy.md)
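
A minimal way to obtain the source tree, assuming Git is installed:

```sh
# clone the solution source and work from the repository root
git clone https://github.com/aws-samples/aws-genomics-workflows.git
cd aws-genomics-workflows
```
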

## Deploying a Custom Distribution

The script `_scripts/deploy.sh` will create a custom distribution of artifacts and templates from files in the source tree, then upload this distribution to an S3 bucket. It will optionally also build and deploy a static documentation site from the Markdown documentation files. Its usage is:

```sh
deploy.sh [--site-bucket BUCKET] [--asset-bucket BUCKET]
          [--asset-profile PROFILE] [--deploy-region REGION]
          [--public] [--verbose]
          STAGE

  --site-bucket BUCKET      Deploy documentation site to BUCKET
  --asset-bucket BUCKET     Deploy assets to BUCKET
  --asset-profile PROFILE   Use PROFILE for AWS CLI commands
  --deploy-region REGION    Deploy in region REGION
  --public                  Deploy to public bucket with '--acl public-read' (Default false)
  --verbose                 Display more output
  STAGE                     'test' or 'production'
```
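
As an illustration, a deployment of the assets to a personal bucket might look like the following (the bucket name, profile, and region are placeholders, not defaults of the script):

```sh
# sketch: upload templates and artifacts for a test-stage deployment
bash _scripts/deploy.sh \
    --asset-bucket my-deployment-bucket \
    --asset-profile default \
    --deploy-region us-east-1 \
    --verbose \
    test
```
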

When running this script from the command line, use the value `test` for the stage. This will deploy the templates and artifacts into a directory `test` in your deployment bucket:

```
$ aws s3 ls s3://my-deployment-bucket/test/
                           PRE artifacts/
                           PRE templates/
```

Use these values when deploying a customized installation, as described in [Customized Deployment](custom-deploy.md), sections 'Artifacts and Nested Stacks' and 'Nextflow'. In the example above, the values to use would be:

* Artifact S3 Bucket Name: `my-deployment-bucket`
* Artifact S3 Prefix: `test/artifacts`
* Template Root URL: `https://my-deployment-bucket.s3.amazonaws.com/test/templates`
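
These labels correspond to parameters of the top-level CloudFormation template. As a rough illustration only, with a hypothetical template file name and parameter keys (check the actual templates under `src/templates/` for the real names), a CLI launch could look like:

```sh
# hypothetical sketch: launch the top-level stack using the custom distribution
aws cloudformation create-stack \
    --stack-name my-gwfcore \
    --template-url https://my-deployment-bucket.s3.amazonaws.com/test/templates/gwfcore-root.template.yaml \
    --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
    --parameters \
        ParameterKey=ArtifactBucketName,ParameterValue=my-deployment-bucket \
        ParameterKey=ArtifactBucketPrefix,ParameterValue=test/artifacts \
        ParameterKey=TemplateRootUrl,ParameterValue=https://my-deployment-bucket.s3.amazonaws.com/test/templates
```
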

The use of `production` for the stage is reserved for deployments from a Travis CI/CD environment; that usage deploys into a subdirectory named after the current release tag.

docs/core-env/custom-deploy.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# Customized Templates and Artifacts
+# Customized Deployment
 
 Deployments of the 'Nextflow on AWS Batch' solution are based on nested CloudFormation templates, and on artifacts comprising scripts, software packages, and configuration files. The templates and artifacts are stored in S3 buckets, and their S3 URLs are used when launching the top-level template and as parameters to that template's deployment.
mkdocs.yml

Lines changed: 2 additions & 0 deletions
@@ -59,3 +59,5 @@ extra:
   s3:
     bucket: docs.opendata.aws
     prefix: genomics-workflows
+
+use_directory_urls: false
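
The documentation site controlled by this configuration can be previewed or rebuilt locally with standard MkDocs commands (a sketch; `deploy.sh` exposes a `--site-bucket` option for deploying the built site):

```sh
# preview the documentation site locally, then build the static site into ./site
pip install mkdocs
mkdocs serve    # live preview at http://127.0.0.1:8000
mkdocs build
```
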
