# mavedb-api

API for MaveDB. MaveDB is a biological database for Multiplex Assays of Variant Effect (MAVE) datasets.
The API powers the MaveDB website at [mavedb.org](https://www.mavedb.org) and can also be called separately (see
instructions [below](#using-mavedb-api)).


For more information about MaveDB or to cite MaveDB, please refer to the

The distribution can be uploaded to PyPI using [twine](https://twine.readthedocs.io/).

For use as a server, this distribution includes an optional set of dependencies, which are only included if the package
is installed with `pip install mavedb[server]`.

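Optional dependency groups like `server` are declared in the package metadata. A hypothetical sketch of how such an extra might be declared (the group's actual contents in mavedb's packaging may differ; see the real `pyproject.toml`):
```
[project.optional-dependencies]
# Hypothetical contents for illustration only.
server = ["fastapi", "uvicorn"]
```
Installing with `pip install mavedb` then skips these, while `pip install "mavedb[server]"` pulls them in.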
### Running a local version of the API server

First build the application's Docker image:
```
docker build --tag mavedb-api/mavedb-api .
```
Then start the application and its database:
```
docker-compose -f docker-compose-local.yml up -d
```
Omit `-d` (daemon) if you want to run the application in your terminal session, for instance to see startup errors without having
to inspect the Docker container's log.

To stop the application when it is running as a daemon, run
```
docker-compose -f docker-compose-local.yml down
```

`docker-compose-local.yml` configures four containers: one for the API server, one for the PostgreSQL database, one for the
worker node, and one for the Redis cache, which acts as the job queue for the worker node. The worker node stores data in a Docker
volume named `mavedb-redis`, and the database stores data in a Docker volume named `mavedb-data`. Both volumes persist
after running `docker-compose down`.
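The service layout can be pictured along these lines. This is a hypothetical sketch, not the actual contents of `docker-compose-local.yml`; service names and mount paths are illustrative:
```
services:
  api:       # FastAPI server; talks to db, queues jobs in redis
  db:        # PostgreSQL
    volumes: ["mavedb-data:/var/lib/postgresql/data"]
  worker:    # consumes jobs from the redis queue
  redis:     # job queue for the worker
    volumes: ["mavedb-redis:/data"]
volumes:
  mavedb-data:
  mavedb-redis:
```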

**Notes**
1. The `mavedb-api` container requires the following environment variables, which are configured in
   `docker-compose-local.yml`:

   - DB_HOST
   - DB_PORT
   - DB_DATABASE_NAME
   - DB_USERNAME
   - DB_PASSWORD
   - NCBI_API_KEY
   - REDIS_IP
   - REDIS_PORT

   The database username and password should be edited for production deployments. `NCBI_API_KEY` will be removed in
   the future. **TODO** Move these to an .env file.
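Once that TODO lands, the same variables could live in an `.env` file. A hypothetical sketch; every value below is a placeholder, not a real credential or default:
```
# Hypothetical .env file; all values are placeholders.
DB_HOST=db
DB_PORT=5432
DB_DATABASE_NAME=mavedb
DB_USERNAME=postgres
DB_PASSWORD=change-me
NCBI_API_KEY=your-key-here
REDIS_IP=redis
REDIS_PORT=6379
```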
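As an illustration of how the `DB_*` variables fit together, here is a hypothetical helper that assembles them into a PostgreSQL connection URL. The actual mavedb-api code may construct its database connection differently:
```
import os

def database_url(env=None):
    """Build a PostgreSQL URL from the DB_* variables.

    Hypothetical helper for illustration only; mavedb-api's real
    configuration code may differ.
    """
    env = os.environ if env is None else env
    return (
        f"postgresql://{env['DB_USERNAME']}:{env['DB_PASSWORD']}"
        f"@{env['DB_HOST']}:{env['DB_PORT']}/{env['DB_DATABASE_NAME']}"
    )

# Example with placeholder values:
example = {
    "DB_HOST": "db",
    "DB_PORT": "5432",
    "DB_DATABASE_NAME": "mavedb",
    "DB_USERNAME": "postgres",
    "DB_PASSWORD": "change-me",
}
print(database_url(example))  # postgresql://postgres:change-me@db:5432/mavedb
```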

### Running the API server in Docker for development

A similar procedure can be followed to run the API server in development mode on your local machine. There are a couple

Before using either of these methods, configure the environment variables described above.

If you use PyCharm, the first method can be used in a Python run configuration, but the second method supports PyCharm's
FastAPI run configuration.

### Running the API server for production

We maintain deployment configuration and steps within a [private repository](https://github.com/VariantEffect/mavedb-deployment) used for deploying this source code to
the production MaveDB environment. The main difference between the production setup and these local setups is that
the worker and API services are split into distinct environments, allowing them to scale up or down individually
depending on need.