This folder contains the main backend server for NavigaTUM.
For getting started, there are some system dependencies which you will need. Please follow the system dependencies docs before trying to run this part of our project.
We have a few API endpoints which require additional dependencies.
As a general rule of thumb, you need to do the database and MeiliSearch setup.
The preview endpoint is the only endpoint which additionally requires the tileserver.
Because of the amount of data it would need to download and how non-essential this part is, the tileserver is only provided via the production instance.
At the beginning of the main API we set up both MeiliSearch and the database. This ensures that the database and the MeiliSearch index are created.
This requires MeiliSearch to be online. To set up MeiliSearch, either follow their installation instructions or use

```shell
docker run -it --rm -p 7700:7700 getmeili/meilisearch:latest
```

MeiliSearch provides an interactive interface at http://localhost:7700.
To set up PostGIS, run the following command:

```shell
docker run -it --rm -e POSTGRES_PASSWORD=CHANGE_ME -p 5432:5432 postgis/postgis:latest
```

Run `cargo run` to start the server.
The server should now be available on localhost:8080 if you have configured the correct environment.
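Putting the pieces together, a minimal environment matching the docker commands above might look like the following sketch. The values mirror the defaults from those commands; the exact format of `POSTGRES_URL` is an assumption here, so check the environment-variables table below and adjust to your setup:

```shell
# Sketch: environment variables matching the docker commands above.
# CHANGE_ME is the placeholder password from the PostGIS command.
# The host:port format for POSTGRES_URL is an assumption.
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=CHANGE_ME
export POSTGRES_URL=localhost:5432
export POSTGRES_DB=postgres
export MIELI_URL=http://localhost:7700
```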
> **Note**
> `cargo run --release` is used to start the server for an optimised production build (use this if you want to profile the search or preview functions, it makes quite a difference).
The server now serves all static files (images, maps, sitemaps, data files) directly at the /cdn endpoint, eliminating the need for a separate nginx CDN container.
The server can load setup data files from the local filesystem or download them from the CDN. This provides:
- Faster startup times in production (all files are baked into the Docker image)
- Self-contained deployment (no separate CDN container needed)
- Use of local files during development, if available
- A fallback to downloading files during development when local files aren't available
The server looks for data files in the following locations (in order):
1. `/cdn/` (Docker production)
2. `data/output/` (relative to the current working directory)
3. `../data/output/` (one level up - useful when running from the `server/` directory)
4. `../../data/output/` (two levels up)
If files are not found locally, they will be downloaded from the source specified by the CDN_URL environment variable (or GitHub for some files).
The following files are loaded during server setup:
- `alias_data.parquet` - Alias mappings for locations (baked into Docker image)
- `api_data.json` - Main location data for the API (baked into Docker image)
- `status_data.parquet` - Status information for locations (baked into Docker image)
- `search_data.json` - Search index data for MeiliSearch (baked into Docker image)
- `public_transport.parquet` - Public transportation station data (baked into Docker image)
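To illustrate the lookup order, here is a rough shell sketch of the resolution logic (the server does this internally in Rust; `api_data.json` stands in for any of the files listed above):

```shell
# Sketch of the data-file lookup order described above:
# check each candidate directory in turn, else fall back to the CDN.
find_data_dir() {
  for dir in /cdn data/output ../data/output ../../data/output; do
    if [ -f "$dir/api_data.json" ]; then
      echo "$dir"
      return 0
    fi
  done
  return 1  # nothing found locally -> the server downloads from CDN_URL
}

if dir=$(find_data_dir); then
  echo "using local data from $dir"
else
  echo "no local data found, falling back to a CDN download"
fi
```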
| variable | module | required? | usage/description |
|---|---|---|---|
| `POSTGRES_{USER,PASSWORD,URL,DB}` | all | required | Used to connect to the db |
| `GIT_COMMIT_SHA` | main | optional | Shown in the status endpoint (also set at build time in docker) |
| `LOG_LEVEL` | main | optional | Controls what is being logged (default=`info` in release and `debug` in development mode) |
| `GITHUB_TOKEN` | feedback | | A GitHub token with write access to the repo. This is used to create issues/PRs on the repository. |
| `JWT_KEY` | feedback | | A key used to sign JWTs. This is used to authenticate that feedback tokens were given out by us. |
| `MIELI_{URL,MASTER_KEY}` | search | | Allows searching via MeiliSearch |
| `CDN_URL` | setup | optional (fallback only) | Fallback URL for downloading data files if not found locally (usually not needed in production) |
For the database connector we use sqlx.
Migrations can be run with the sqlx-cli tool, which can be installed with:

```shell
cargo install sqlx-cli
```

Migrations can be added using:

```shell
cargo sqlx migrate add -r <migration-name>
```

Our queries are checked at compile time by sqlx. To add/edit a query, you will need to run the following commands:

```shell
cargo sqlx migrate run --database-url postgres://postgres:CHANGE_ME@localhost:5432/postgres
```
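As a convenience, both sqlx-cli and the sqlx query macros also read the `DATABASE_URL` environment variable, so exporting it once lets you drop the repeated `--database-url` flag. A sketch, with credentials assuming the PostGIS container from the setup section:

```shell
# DATABASE_URL is read by sqlx-cli as well as the sqlx query macros.
# The credentials below match the PostGIS docker command from the setup section.
export DATABASE_URL=postgres://postgres:CHANGE_ME@localhost:5432/postgres
# With it set, the flags can be dropped:
#   cargo sqlx migrate run
#   cargo sqlx prepare
```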
```shell
cargo sqlx prepare --database-url postgres://postgres:CHANGE_ME@localhost:5432/postgres
```

If you have made changes to the API, you need to update the API documentation.
There are two editors for the API documentation (both are imperfect):
Of course, documentation is only one part of the process. Run the API fuzz test (schemathesis) against the API server to ensure the specification is up to date and has no holes. To do so, run the following commands against the API server:

```shell
python -m venv venv
source venv/bin/activate
pip install schemathesis
st run --workers=auto --base-url=http://localhost:3003 --checks=all https://nav.tum.de/api/openapi.json
```

Some fuzzing goals may not be achievable for you locally, as they require prefix routing (e.g. `/cdn` to the CDN); such goals are automatically tested in our CI.
You can exchange `--base-url=http://localhost:3003` for `--base-url=https://nav.tum.de` to fuzz the full public API, or restrict your scope using an option like `--endpoint=/api/search`.
Some of our tests are approval tests. Please install insta (e.g. via `cargo install cargo-insta`) to have a working environment.
You can then run `cargo insta test` instead of `cargo test` to review the needed snapshot changes.
If you don't want to do this, using the version we provide via CI is fine, but the DX is much better with the correct tooling.
This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program. If not, see https://www.gnu.org/licenses/.