Find information in Munich's political information system (RIS) with the help of AI.
The quickest way to try RIS-KI locally is to start the shared database with Compose, seed it with example data via the extractor, and (optionally) run the document pipeline.
- Docker or Podman (the repo ships a root-level `compose.yaml`)
- Python toolchain (we recommend `uv`)
- A populated `.env` in the repo root (see `.env.example` for the variables used by all services)
As a one-time setup (required before the first backend start), create the Kafka topics:

```shell
podman compose --profile init up topic-init
```

From the repository root you can start everything (DB, Adminer, Kafka, backend, gateway, frontend) with:
```shell
podman compose up -d
```

Defaults for Postgres (from `compose.yaml`): user `postgres`, password `password`, database `example_db`, exposed on port 5432.
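For quick manual checks, the Postgres defaults above can be assembled into a standard connection URL (values taken from `compose.yaml`; adjust them if you changed the defaults in your `.env`):

```shell
# Postgres connection URL built from the compose.yaml defaults listed above.
db_url="postgresql://postgres:password@localhost:5432/example_db"
echo "$db_url"
```

Any Postgres client can consume this URL, for example `psql "$db_url"`.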
To populate the database with sample data from Munich's RIS and process documents with OCR, run the initialization services:
```shell
podman compose --profile init up
```

This will:
- Create necessary Kafka topics
- Extract sample entities and files from the configured RIS endpoint (honors your `.env`, e.g., date range and base URL)
- Run OCR on documents and store the extracted markdown (configure OCR variables like `RISKI__DOCUMENTS__MAX_DOCUMENTS_TO_PROCESS` and `RISKI__DOCUMENTS__OCR_MODEL_NAME` in your `.env`)
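The two OCR variables mentioned above live in your `.env`. The values below are illustrative placeholders only, not defaults shipped by the repo; check `.env.example` for the real ones:

```shell
# Illustrative values only – consult .env.example for the actual defaults.
RISKI__DOCUMENTS__MAX_DOCUMENTS_TO_PROCESS=10
RISKI__DOCUMENTS__OCR_MODEL_NAME=some-ocr-model
```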
Once the stack is running, visit:
- Frontend: http://localhost:8083/ (via the refarch gateway)
- Database UI: http://localhost:8080 (Adminer for inspecting database contents)
See the open issues for a full list of proposed features (and known issues).
Use the tag-version.ps1 helper to create semantic tags that trigger the
GitHub Actions workflows responsible for building the container images.
```shell
./tag-version.ps1
```

- Select the service (`backend`, `extractor`, `frontend`, or `document-pipeline`).
- Choose the version bump (`major`, `minor`, or `patch`).
- When prompted, decide whether the script should also bump the detected manifest (for example `riski-backend/pyproject.toml`, `riski-extractor/pyproject.toml`, `riski-document-pipeline/pyproject.toml`, or `riski-frontend/package.json`) to the same version. This keeps the package metadata, container tags, and badges in sync.
- Confirm the suggested tag (for example `backend-1.2.0`, `extractor-1.2.0`, `document-pipeline-1.2.0`, or `riski-frontend-1.2.0`).
- Confirm pushing the tag to `origin` to start the corresponding Docker release workflow.
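The tag names follow a simple `<service>-<version>` scheme, so the confirmation step is easy to sanity-check by hand. A minimal sketch (the script itself derives the version interactively, this just illustrates the naming):

```shell
# Tags are named "<service>-<major.minor.patch>", e.g. backend-1.2.0.
service="backend"
version="1.2.0"
tag="${service}-${version}"
echo "$tag"   # → backend-1.2.0
```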
After a successful push, the workflow builds and publishes the image to GitHub Container Registry.
The Python services (`riski-backend`, `riski-document-pipeline`, `riski-extractor`) import the local `riski-core` package via path dependencies.
Whenever you change riski-core, rerun the dependency install inside each consumer so uv rebuilds the local package:
```shell
uv sync --reinstall-package core
```

Run the command from the respective service folder (for example `riski-backend`) so `uv` picks up the correct `pyproject.toml`.
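If you touched `riski-core` and want to refresh all three consumers in one go, a small loop from the repo root works. A sketch, assuming the service directory names described above; adjust if your checkout differs:

```shell
# Re-sync the local riski-core package in every consumer service directory.
services="riski-backend riski-extractor riski-document-pipeline"
for service in $services; do
  if [ -d "$service" ]; then
    (cd "$service" && uv sync --reinstall-package core)
  fi
done
```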
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Open an issue with the tag "enhancement"
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
More about this in the CODE_OF_CONDUCT file.
Distributed under the MIT License. See LICENSE file for more information.
it@M - kicc@muenchen.de