
DHTI CLI

The DHTI CLI (dhti-cli) is your command-line Swiss Army knife for composing, installing, developing, packaging, and managing DHTI modules.

Installation

You don't need to install it globally. Use npx so you always run the latest published version:

npx dhti-cli help
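
If you want repeatable runs, npx also accepts an explicit version; the version number below is only an illustration, not a known release:

npx dhti-cli@1.2.3 help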

Core Concepts

  • Compose: Build a docker-compose file with selected modules.
  • Elixir: A LangServe app providing GenAI functionality.
  • Conch: An OpenMRS O3 ESM (or CDS Hooks client) providing UI & interaction.
  • Bootstrap: Runtime configuration (models, hyperparameters, etc.) injected without rebuilding.

Commands Overview

For detailed usage, see the specific wiki pages for each command group:

Compose Commands

  • npx dhti-cli compose add -m openmrs -m langserve [ ... ] Adds modules to ~/dhti/docker-compose.yml.
  • npx dhti-cli compose read Displays the generated compose file.
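
For example, to assemble a minimal GenAI stack and then inspect the generated file (the module names come from the catalog further down this page):

npx dhti-cli compose add -m openmrs -m langserve -m redis -m ollama
npx dhti-cli compose read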

Elixir Commands

  • npx dhti-cli elixir install -g <git_url> -n <name> Installs an elixir from Git.
  • npx dhti-cli elixir dev -d <path_to_elixir_src> -n <name> -c <container> Syncs source code (hot-reload style) into a running container.
  • npx dhti-cli elixir init Initializes a new elixir project.
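
A typical sequence, using the template elixir and the container name from the workflow below (adjust the path and container to your setup):

npx dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir-template.git -n dhti-elixir-template
npx dhti-cli elixir dev -d ../dhti-elixir-template -n dhti-elixir-template -c dhti-langserve-1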

Conch Commands

  • npx dhti-cli conch install -g <git_url> -n <name> Installs a conch (OpenMRS O3 ESM or CDS Hooks client).
  • npx dhti-cli conch dev -d <path_to_conch_dist> -n <name> -c <container> Copies the built dist into the running frontend container.
  • npx dhti-cli conch start -n <name> Starts a conch.
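
The equivalent sequence for the template conch, again borrowing names from the workflow below:

npx dhti-cli conch install -g https://github.com/dermatologist/openmrs-esm-dhti-template.git -n openmrs-esm-dhti-template
npx dhti-cli conch dev -d ../openmrs-esm-dhti-template -n openmrs-esm-dhti-template -c dhti-frontend-1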

Docker Commands

  • npx dhti-cli docker -u Starts (up) all services in the compose file.
  • npx dhti-cli docker -d Stops and removes the containers.
  • npx dhti-cli docker -n <image_name> -t elixir|conch Builds an image from an installed module.
  • npx dhti-cli docker bootstrap -f <bootstrap.py> -c <container> Injects a new bootstrap configuration into a running container.
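
For example, to bring up the stack, build a shareable elixir image, and inject a bootstrap file (the image tag and bootstrap path are placeholders; the container name follows the dhti-<service>-1 pattern used below):

npx dhti-cli docker -u
npx dhti-cli docker -n yourhandle/genai-elixir:1.0.0 -t elixir
npx dhti-cli docker bootstrap -f ./bootstrap.py -c dhti-langserve-1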

Other Commands

  • docktor: Manage ML inference containers. See DOCKTOR.
  • synthetic: Generate synthetic data using LLM prompts. See Synthetic Data.
  • mimic: Submit FHIR requests/tests.
  • plugins: Manage CLI plugins.

Typical Workflow

  1. Create compose: npx dhti-cli compose add -m openmrs -m langserve -m redis -m ollama
  2. Install elixir template: npx dhti-cli elixir install -g https://github.com/dermatologist/dhti-elixir-template.git -n dhti-elixir-template
  3. Install conch template: npx dhti-cli conch install -g https://github.com/dermatologist/openmrs-esm-dhti-template.git -n openmrs-esm-dhti-template
  4. Start stack: npx dhti-cli docker -u
  5. Develop elixir with hot reload: npx dhti-cli elixir dev -d ../dhti-elixir-template -n dhti-elixir-template -c dhti-langserve-1
  6. Develop conch UI: npx dhti-cli conch dev -d ../openmrs-esm-dhti-template -n openmrs-esm-dhti-template -c dhti-frontend-1
  7. Build images to share: npx dhti-cli docker -n yourhandle/genai-elixir:1.0 -t elixir, and repeat with -t conch for the corresponding conch image.
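
After step 4, a quick sanity check is to review the generated compose file and list the running containers (docker ps is standard Docker tooling, not part of dhti-cli; the name filter assumes containers are prefixed with dhti, as the container names above suggest):

npx dhti-cli compose read
docker ps --filter "name=dhti"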

Modules Catalog

Module      Purpose
----------  ----------------------------------------
openmrs     EMR (patient records, UI host)
langserve   Elixir runtime (serves LangChain apps)
redis       Vector store for RAG
ollama      Local LLM hosting
langfuse    Observability & tracing
neo4j       Graph analytics & relationships
cqlFhir     HAPI FHIR server with CQL support
mcpFhir     MCP tool server exposing FHIR utilities
mcpx        MCP Gateway for connecting tools
docktor     Traditional ML model containers
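
Any of these module names can be passed to compose add; for instance, a stack with observability and a graph store might be composed as follows (a sketch, not a tested combination):

npx dhti-cli compose add -m openmrs -m langserve -m langfuse -m neo4j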

Tips

  • Use small, focused elixirs—each with a single responsibility.
  • Keep bootstrap overrides versioned so experiments are reproducible.
  • Prefer semantic version tags for built images (1.0.0, 1.1.0, etc.).
  • Share elixirs & conches via Git to promote reuse.
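
For example, when cutting a new release of an elixir image, bump the semantic version in the tag (the handle and image name are placeholders):

npx dhti-cli docker -n yourhandle/genai-elixir:1.1.0 -t elixir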

Troubleshooting

  • Container fails to start: Run docker logs <container>; check for port conflicts.
  • Elixir not updated after dev sync: Confirm the path is correct and that no dependency changes require a rebuild.
  • Conch assets not visible: Clear the browser cache and ensure the correct container name was used.
  • Ollama model missing: Run docker exec -it dhti-ollama-1 ollama pull <model>, then update the bootstrap.
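
For the first check, the log commands look like this with the container names used elsewhere on this page (substitute your own containers):

docker logs dhti-langserve-1
docker logs dhti-frontend-1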

Next Steps

See Getting Started for a guided quickstart and Architecture for a system diagram.


The DHTI CLI accelerates the path from ideas → validated prototypes → production handoff.
