# Getting Started
DHTI is a modular reference architecture for bringing Generative AI into healthcare workflows. It is a quick prototyping platform that wires together an EMR (OpenMRS), a GenAI app server (LangServe), optional self-hosted LLMs (Ollama), a vector store (Redis), monitoring (LangFuse), an MCP tool server, and supporting utilities (Neo4j, HAPI/CQL) so you can develop, test, and share "elixirs" (LangServe apps) and "conches" (OpenMRS O3 UI modules or CDS-hook containers) without reengineering your stack.

To get started, clone the repo, then install the two required prerequisites: Node.js (LTS recommended) and Docker. Optionally install Python 3.9+ if you plan to author LangServe elixirs or run the examples. The demo stack provides everything you need to get started.
| Required | Optional |
|---|---|
| Node.js (LTS) | Python 3.9+ for custom elixirs |
| Docker Engine / Desktop | uv, cookiecutter for scaffolding |
| | AI agent (Claude Desktop or similar) for vibe-coding |
No API keys needed for the demo (uses a mock LLM).
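The prerequisite check can be sketched in a few lines of Python (an illustration only; the tool names come from the table above, and the helper name `check_tools` is my own):

```python
import shutil

# Tools from the prerequisites table; only node and docker are required.
REQUIRED = ["node", "docker"]
OPTIONAL = ["python3", "uv", "cookiecutter"]

def check_tools():
    """Return (missing_required, missing_optional) executable names."""
    missing_req = [t for t in REQUIRED if shutil.which(t) is None]
    missing_opt = [t for t in OPTIONAL if shutil.which(t) is None]
    return missing_req, missing_opt

if __name__ == "__main__":
    req, opt = check_tools()
    print("Missing required tools:", ", ".join(req) if req else "none")
    print("Missing optional tools:", ", ".join(opt) if opt else "none")
```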
git clone https://github.com/dermatologist/dhti.git
cd dhti
./demo.sh

If you have an AI agent (like Claude Desktop) with the start-dhti skill, you can scaffold and start the entire stack by simply asking:
"Start a new DHTI project with a healthcare chatbot."
npx dhti-cli help # list commands
npx dhti-cli compose add -m langserve # add LangServe to ~/dhti/docker-compose.yml
npx dhti-cli compose read # view generated compose
npx dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir.git -n dhti-elixir-schat -s packages/simple_chat
npx dhti-cli docker -n yourdockerhandle/genai-test:1.0 -t elixir
npx dhti-cli docker -u # start services from compose

Visit http://localhost/openmrs/spa/home (admin / Admin123) to see the conch.
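After bringing the services up, it can take a little while before OpenMRS answers. A small polling helper (a hedged sketch; the function name and retry settings are my own, only the URL comes from this guide) can tell you when the stack is reachable:

```python
import time
import urllib.error
import urllib.request

def wait_for(url, attempts=30, delay=2.0):
    """Poll `url` until it responds; return True if it does, else False."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except (urllib.error.URLError, OSError):
            time.sleep(delay)
    return False

if __name__ == "__main__":
    ok = wait_for("http://localhost/openmrs/spa/home")
    print("stack is up" if ok else "stack did not come up in time")
```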
Gemini, OpenAI, and OpenRouter are picked up automatically if the respective environment variables are set. Configure the LLM by editing ~/dhti/elixir/app/bootstrap.py (model name, provider, temperature, etc.). Ollama models can be pulled into the Ollama container and referenced there.
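The provider auto-detection described above can be sketched roughly as follows (an illustration only; the actual selection logic lives in ~/dhti/elixir/app/bootstrap.py, and the environment-variable names below are the commonly used ones for each provider, which may differ from what DHTI reads):

```python
import os

# Commonly used API-key variables per provider (assumed names).
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
}

def pick_provider(env=None):
    """Return the first provider whose API key is set, else the mock LLM."""
    if env is None:
        env = os.environ
    for provider, var in PROVIDER_ENV_VARS.items():
        if env.get(var):
            return provider
    return "mock"  # the demo default: no API keys needed
```

For example, `pick_provider({"OPENAI_API_KEY": "sk-..."})` would select OpenAI, while an empty environment falls back to the mock LLM used by the demo.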
See Hot Reload for copying source/dist into running containers for rapid feedback.
Stop and remove all containers:
npx dhti-cli docker -d

- Explore Architecture for system internals.
- Build new elixirs: Scaffolding an Elixir.
- Browse community modules: List of open-source elixirs and List of open-source conchs.
- Try advanced features: DOCKTOR, Medplum, Synthetic Data, and MCP Agent.
