
Commit 1ac623b

doc: added docs for performance testing
1 parent 19eb99b commit 1ac623b


2 files changed: +81 −0 lines changed

docs/performance.md

Lines changed: 80 additions & 0 deletions

# Performance Testing

This repository includes tools to execute performance tests for the **APEx Dispatch API**.
Performance testing is useful for analyzing the impact of code changes, database updates, and platform modifications on the system's behavior and responsiveness.

## Prerequisites

Before running the performance tests, ensure the following prerequisites are met:

* **Python environment** (Python 3.10+ recommended)
* **Docker** and **Docker Compose** installed on your system
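
A quick way to check these prerequisites from a shell (exact version output will differ per system):

```bash
python --version          # should report Python 3.10 or newer (python3 --version on some systems)
docker --version          # confirms Docker is installed
docker compose version    # confirms the Docker Compose plugin is available
```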

## Setting Up the Environment

Performance tests require both the API and a database to be running locally. Follow these steps to set up your environment:

1. **Create a `.env` file** in the root of the repository with the following variables (a hypothetical example is sketched after these steps):

   * `OPENEO_BACKENDS_PERFORMANCE` → Configuration for the openEO backend authentication. See the [configuration guide](./configuration.md#openeo-backend-configuration).
   * `KEYCLOAK_CLIENT_PERFORMANCE_ID` → Client ID used for executing performance tests.
   * `KEYCLOAK_CLIENT_PERFORMANCE_SECRET` → Client secret used for executing performance tests.
22+
23+
2. **Start the services using Docker Compose**:
24+
25+
```bash
26+
docker compose -f docker-compose.perf.yml up -d db
27+
```
28+
29+
Starts a local database instance.
30+
31+
```bash
32+
docker compose -f docker-compose.perf.yml up -d migrate
33+
```
34+
35+
Executes database migrations to ensure all required tables are created.
36+
37+
```bash
38+
docker compose -f docker-compose.perf.yml up -d app
39+
```
40+
41+
Starts the API locally.
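
For step 1, a minimal `.env` sketch (all values below are placeholders, not real credentials; the expected format of `OPENEO_BACKENDS_PERFORMANCE` is described in the [configuration guide](./configuration.md#openeo-backend-configuration)):

```bash
OPENEO_BACKENDS_PERFORMANCE='...'             # backend auth configuration, see the configuration guide
KEYCLOAK_CLIENT_PERFORMANCE_ID=perf-client    # hypothetical client ID
KEYCLOAK_CLIENT_PERFORMANCE_SECRET=change-me  # placeholder secret; never commit real secrets
```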

> **Tip:** You can check the logs of each service with `docker compose -f docker-compose.perf.yml logs -f <service_name>`.
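
You can also confirm that all three services are up before running any tests:

```bash
docker compose -f docker-compose.perf.yml ps   # lists each service with its current status
```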

## Executing Performance Tests

The performance tests are implemented using **[Locust](https://locust.io/)**. Test scenarios are located in `tests/performance/locustfile.py`.
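
To see which scenarios (Locust `User` classes) the locustfile defines without starting a test run, you can use Locust's built-in listing option:

```bash
locust -f tests/performance/locustfile.py --list
```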

### Running Tests with a Web Dashboard

To execute the performance tests and monitor them in a browser dashboard:

```bash
locust -f tests/performance/locustfile.py -u 10 --host http://localhost:8000 --run-time 1m
```

* `-u 10` → Number of simulated concurrent users
* `--host http://localhost:8000` → URL of the API to test
* `--run-time 1m` → Duration of the test

After starting, open your browser at [http://localhost:8089](http://localhost:8089) to monitor real-time performance metrics, including response times, failure rates, and throughput.
62+
63+
### Running Tests in Headless Mode
64+
65+
To execute tests without a web interface (useful for CI/CD pipelines):
66+
67+
```bash
68+
locust -f tests/performance/locustfile.py -u 10 --host http://localhost:8000 --run-time 1m --headless
69+
```
70+
71+
You can also export the results to a CSV file for further analysis:
72+
73+
```bash
74+
locust -f tests/performance/locustfile.py -u 10 --host http://localhost:8000 --run-time 1m --headless --csv=perf_test_results
75+
```
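
With `--csv=perf_test_results`, Locust writes several files, including `perf_test_results_stats.csv` (aggregated statistics per request type), `perf_test_results_stats_history.csv` (metrics over time), and `perf_test_results_failures.csv`. One quick way to eyeball the aggregated stats in a terminal (assuming the `column` utility is available):

```bash
column -s, -t < perf_test_results_stats.csv | less -S   # render the CSV as an aligned, scrollable table
```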

### Recommended Practices

* Start with a small number of users to validate test scripts before scaling up.
* Combine performance testing with monitoring tools to detect resource bottlenecks.

mkdocs.yml

Lines changed: 1 addition & 0 deletions

```diff
@@ -10,6 +10,7 @@ nav:
   - Home: index.md
   - Getting Started: getting_started.md
   - Configuration: configuration.md
+  - Performance Testing: performance.md
   - Contributing: contributing.md
   - Architecture: architecture.md
```
