Commit 182a4f7

Merge pull request #4 from nsidc/add-automated-testing
Add automated testing
2 parents a87b8e7 + 776502b commit 182a4f7

24 files changed: +797 −474 lines

.github/workflows/pre-commit.yml (new file, 14 additions)

```yaml
name: pre-commit

on:
  pull_request:
  push:
    branches: [main]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v3
      - uses: pre-commit/action@v3.0.1
```

.github/workflows/test.yml (new file, 27 additions)

```yaml
name: tests

on:
  pull_request:
  push:
    branches:
      - main

jobs:
  test:
    name: "Run backend tests"
    runs-on: "ubuntu-latest"
    steps:
      - name: "Check out repository"
        uses: "actions/checkout@v4"

      - name: "Build container image"
        run: |
          docker build --tag "test" .

      # mypy
      - name: "Run mypy"
        run: "docker run test mypy dat_backend/ test/"

      # Unit tests
      - name: "Run unit tests"
        run: "docker run test pytest test/unit"
```

.gitignore (1 addition, 1 deletion)

```diff
@@ -1,2 +1,2 @@
 **/__pycache__/
-nginx/logs/*
+nginx/logs/*
```

(The removed and added lines are textually identical; the change is most likely a trailing newline added at end of file, consistent with the `end-of-file-fixer` hook introduced in this PR.)

.pre-commit-config.yaml (new file, 49 additions)

```yaml
default_language_version:
  python: "python3.12"

repos:
  - repo: "https://github.com/pre-commit/pre-commit-hooks"
    rev: "v5.0.0"
    hooks:
      - id: "check-added-large-files"
      - id: "check-vcs-permalinks"
      - id: "end-of-file-fixer"

  - repo: "https://github.com/charliermarsh/ruff-pre-commit"
    rev: "v0.9.9"
    hooks:
      - id: "ruff"
        # NOTE: "--exit-non-zero-on-fix" is important for CI to function
        # correctly!
        args:
          ["--fix", "--exit-non-zero-on-fix", "--verbose", "--line-length=88"]

  - repo: "https://github.com/psf/black"
    rev: "25.1.0"
    hooks:
      - id: "black"

  - repo: "https://github.com/jendrikseipp/vulture"
    rev: "v2.14"
    hooks:
      - id: "vulture"
        args:
          # Ignore flask routes that we define with the `@api.route` decorator.
          - "--ignore-decorators"
          - "@api.route"
          # Ignore names of attrs in a ctx object in python_script.py that
          # vulture incorrectly believes are unused.
          - "--ignore-names"
          - "check_hostname,verify_mode"

  - repo: https://github.com/codespell-project/codespell
    rev: "v2.4.1"
    hooks:
      - id: codespell

  - repo: https://github.com/rbubley/mirrors-prettier
    rev: "v3.5.2"
    hooks:
      - id: prettier
        types_or: [yaml, markdown, html, css, scss, javascript, json]
        args: [--prose-wrap=always]
```
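The `--ignore-decorators "@api.route"` argument above is needed because route handlers are registered with Flask's dispatcher but never called by name, so dead-code detection would otherwise flag them as unused. A minimal sketch of that registration pattern, using a hypothetical stand-in for a Flask blueprint (the real project uses Flask's own `Blueprint`):

```python
# Sketch of why vulture needs --ignore-decorators "@api.route": the
# decorator registers the handler in a routing table, and dispatch later
# happens through that table rather than through the function's name.
class Api:
    """Hypothetical stand-in for a Flask blueprint's routing behavior."""

    def __init__(self):
        self.routes = {}

    def route(self, rule):
        def register(func):
            self.routes[rule] = func
            return func

        return register


api = Api()


@api.route("/api/get-links/")
def get_links():
    # Only the dispatcher ever invokes this; no direct call by name exists,
    # which is what makes it look "unused" to static dead-code analysis.
    return {"links": []}


# Dispatch goes through the registry, not the name:
handler = api.routes["/api/get-links/"]
print(handler())  # → {'links': []}
```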

Dockerfile (4 additions, 2 deletions)

```diff
@@ -5,12 +5,14 @@ COPY --chown=$MAMBA_USER:$MAMBA_USER environment.yml /tmp/env.yaml
 RUN micromamba install -y -n base -f /tmp/env.yaml && \
     micromamba clean --all --yes

-COPY src/* .
+COPY src/dat_backend/ ./dat_backend/
+COPY test/ ./test/
+COPY pyproject.toml .

 EXPOSE 5000

 # TODO: handle ssl w/ gunicorn? Self-signed cert on this image easiest?
 # CMD /bin/bash -c "gunicorn --bind 0.0.0.0:5000 --workers 3 app:app"

 # TODO this might not work
-CMD /bin/bash -c "PYTHONPATH=./ python dat_backend/app.py"
+CMD /bin/bash -c "PYTHONPATH=./ python dat_backend/app.py"
```

(The identical `CMD` removal/addition is most likely a trailing newline added at end of file.)

README.md (52 additions, 23 deletions)

````diff
@@ -1,37 +1,68 @@
 # Data Access Tool (DAT) Backend

-Backend services for the [Data Access Tool UI](https://github.com/nsidc/data-access-tool-ui).
+Backend services for the
+[Data Access Tool UI](https://github.com/nsidc/data-access-tool-ui).

 A Flask-based API provides:

-* The `python_script.py` download script for data granules matching a user's filters.
-* The
+- The `python_script.py` download script for data granules matching a user's
+  filters.
+- The
   [getLinks](https://github.com/nasa/earthdata-download/blob/main/docs/GET_LINKS.md)
-  service required for DAT integration with the [NASA Earthdata
-  Downloader](https://github.com/nasa/earthdata-download).
-
+  service required for DAT integration with the
+  [NASA Earthdata Downloader](https://github.com/nasa/earthdata-download).

 Note that this service was originally a part of the
 [hermes-api](https://bitbucket.org/nsidc/hermes-api/src). It was moved to a
-standalone service to support the decomissioning of ECS and the rest of the
+standalone service to support the decommissioning of ECS and the rest of the
 hermes stack planned for July 2026.

-
 ## Dev

-### Testing the download script
+### Pre-commit
+
+This project uses [pre-commit](https://pre-commit.com/) to run pre-commit hooks
+that check and format this project's code for stylistic consistency (e.g., using
+`ruff` and `black`).
+
+The pre-commit configuration for this project can be found in
+`.pre-commit-config.yaml`. Configuration for specific tools (e.g., `vulture`) is
+given in the included `pyproject.toml`.
+
+For more information about using `pre-commit`, please see the
+[Scientific Python Library Development Guide's section on pre-commit](https://learn.scientific-python.org/development/guides/gha-basic/#pre-commit).
+
+To install pre-commit to run checks for each commit you make:
+
+```
+$ pre-commit install
+```
+
+To manually run the pre-commit hooks without a commit:
+
+```
+$ pre-commit run --all-files
+```
+
+> [!NOTE] GitHub actions are configured to run pre-commit for all PRs and pushes
+> to the `main` branch. See
+> [.github/workflows/pre-commit.yml](.github/workflows/pre-commit.yml).

-The tests for the download script assumes a .netrc file is setup for the current
-user to login to earthdata. Setup a `.netrc` with credentials for earthdata
-login by e.g., [using the earthaccess
-library](https://earthaccess.readthedocs.io/en/latest/howto/authenticate/)
+### Running tests

-Once a `.netrc` file is setup:
+Before manually running tests, set up the `EARTHDATA_USERNAME` and
+`EARTHDATA_PASSWORD` environment variables, which are necessary for integration
+tests.
+
+Next, to run all tests:

 ```
-PYTHONPATH=./src/ pytest test/
+scripts/run_tests.sh
 ```

+> [!NOTE] GitHub actions are configured to run unit tests that do not require
+> Earthdata login credentials for all PRs and pushes to the `main` branch. See
+> [.github/workflows/test.yml](.github/workflows/test.yml).
+
 ### Testing the EDD integration

 An example deep-link to initiate EDD downloads:
@@ -46,16 +77,14 @@ that looks like the above.
 The GET request to `earthdata-download://startDownload` should include the
 following query parameters:

-* `getLinks`: URI for `/api/get-links/`. This URI will specify the
+- `getLinks`: URI for `/api/get-links/`. This URI will specify the
   `cmr_request_params` query-parameter, which is a string representing the CMR
   query parameters mapping to a user's selections in the DAT.
-* `downloadId`: The dataset ID and version for the current order (e.g., ATL06 v6
+- `downloadId`: The dataset ID and version for the current order (e.g., ATL06 v6
   is `atl06_06`)
-* `authUrl`: URI for `/api/earthdata/auth/`. EDD will
-  use this to initiate a token exchange with URS to authenticate user
-  downloads. This URL must include
+- `authUrl`: URI for `/api/earthdata/auth/`. EDD will use this to initiate a
+  token exchange with URS to authenticate user downloads. This URL must include
   `eddRedirect=earthdata-download%3A%2F%2FauthCallback` as a query parameter.

-> [!WARNING]
-> As of this writing, the CMR query parameters are hard-coded to always return a
-> small subset of ATL06 v6 data.
+> [!WARNING] As of this writing, the CMR query parameters are hard-coded to
+> always return a small subset of ATL06 v6 data.
````
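The three query parameters the README documents can be assembled into a `startDownload` deep link with `urllib.parse`. This is only a sketch: the backend host (`example.nsidc.org`) and the CMR query string are hypothetical placeholders, not values from the real deployment.

```python
# Sketch of building the earthdata-download://startDownload deep link from
# the getLinks, downloadId, and authUrl parameters described above. Host
# and CMR parameters are placeholders.
from urllib.parse import quote, urlencode

base = "https://example.nsidc.org"  # hypothetical backend host

params = {
    # getLinks embeds the cmr_request_params query string, percent-encoded:
    "getLinks": f"{base}/api/get-links/?cmr_request_params="
    + quote("short_name=ATL06&version=006"),  # hypothetical CMR query
    # Dataset ID and version, per the README (ATL06 v6 -> atl06_06):
    "downloadId": "atl06_06",
    # authUrl must carry the eddRedirect query parameter:
    "authUrl": f"{base}/api/earthdata/auth/"
    "?eddRedirect=earthdata-download%3A%2F%2FauthCallback",
}

# urlencode percent-encodes each embedded URL again, which is what keeps
# the nested query strings intact inside the deep link.
deep_link = "earthdata-download://startDownload?" + urlencode(params)
print(deep_link)
```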

docker-compose-test.yml (new file, 12 additions)

```yaml
services:
  # TODO: rename this.
  downloader-script:
    # TODO: dev stuff extracted to dev env yaml
    build: .
    volumes:
      - "./src/dat_backend:/tmp/dat_backend/"
      - "./test:/tmp/test/"
    command: 'bash -c "mypy dat_backend/ test/ && pytest test/"'
    environment:
      - EARTHDATA_USERNAME=${EARTHDATA_USERNAME:?EARTHDATA_USERNAME must be set}
      - EARTHDATA_PASSWORD=${EARTHDATA_PASSWORD:?EARTHDATA_PASSWORD must be set}
```
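The `${VAR:?message}` expansions in the compose file make the test run fail fast with a clear message when credentials are missing, rather than failing obscurely inside an integration test. The same fail-fast guard sketched in Python (the `require_env` helper is illustrative, not part of the project):

```python
# Mirrors the ${VAR:?message} behavior from docker-compose-test.yml:
# abort immediately with a clear message if a required variable is unset
# or empty, instead of failing later inside a test.
import os


def require_env(name: str) -> str:
    """Return the named environment variable, or raise if unset/empty."""
    value = os.environ.get(name, "")
    if not value:
        raise RuntimeError(f"{name} must be set")
    return value


os.environ["EARTHDATA_USERNAME"] = "demo-user"  # placeholder for the demo
print(require_env("EARTHDATA_USERNAME"))  # → demo-user
```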

environment.yml (3 additions, 1 deletion)

```diff
@@ -13,4 +13,6 @@ dependencies:

   # Dev deps
   - pytest ~=8.2
-  - pyopenssl # required for test server w/ ssl
+  - pyopenssl # required for test server w/ ssl
+  - pre-commit ~=4.1
+  - mypy ~=1.15
```

(The identical `pyopenssl` removal/addition is most likely a trailing newline added at end of file.)

nginx/Dockerfile (1 addition, 1 deletion)

```diff
@@ -12,4 +12,4 @@ RUN useradd --uid 1000 --user-group vagrant && \
     touch /var/run/nginx.pid && \
     chown -R vagrant:vagrant /var/run/nginx.pid && \
     chown -R vagrant:vagrant /var/cache/nginx && \
-    chown -R vagrant:vagrant /etc/nginx
+    chown -R vagrant:vagrant /etc/nginx
```

(The change is most likely a trailing newline added at end of file.)

nginx/dat.conf (1 addition, 1 deletion)

```diff
@@ -46,4 +46,4 @@ server {
     proxy_set_header X-Script-Name $http_x_script_name;
   }

-}
+}
```

(The change is most likely a trailing newline added at end of file.)
