Commit a2eca98: readme update (parent c32f70e)

File tree: 1 file changed (+19, -2 lines)


README.md

Lines changed: 19 additions & 2 deletions
````diff
@@ -128,8 +128,25 @@ cd hanerma
 pip install -e .
 
 # 2. Configure Credentials (.env)
-HF_TOKEN="your_huggingface_token"
-HANERMA_MODEL="Qwen/Qwen3-Coder-Next-FP8:together"
+HANERMA_MODEL="hf/Qwen/Qwen3-Coder-Next-FP8:together" # Example default
+
+# 3. Model Provider Configuration (Multi-Tenant)
+HANERMA supports three primary provider tiers. Use the prefixes below to route requests:
+
+### ◈ Tier 1: Hugging Face (Cloud Hub)
+* **Prefix**: `hf/` or `huggingface/` (or any string containing `Qwen/` or `:`)
+* **Requirement**: `HF_TOKEN` in `.env`
+* **Example**: `hf/meta-llama/Llama-3.1-405B-Instruct`
+
+### ◈ Tier 2: OpenRouter (Cloud Gateway)
+* **Prefix**: `openrouter/` or `gpt-` or `claude-`
+* **Requirement**: `OPENROUTER_API_KEY` in `.env`
+* **Example**: `openrouter/anthropic/claude-3.5-sonnet`
+
+### ◈ Tier 3: Local Reasoning (Edge)
+* **Prefix**: `local-` (or no prefix)
+* **Requirement**: `OLLAMA_ENDPOINT` (Default: `http://localhost:11434/v1`)
+* **Example**: `local-llama3.1`
 ```
 
 ## 📜 License
````
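The prefix rules described in the three tiers can be sketched as a small resolver. This is an illustrative reconstruction from the documented prefixes only, not HANERMA's actual code: the function name `resolve_provider` and the returned tier labels are assumptions.

```python
def resolve_provider(model: str) -> str:
    """Map a model string to a provider tier by its prefix (per the README)."""
    # Tier 1: Hugging Face -- "hf/" or "huggingface/" prefixes, or any string
    # containing "Qwen/" or ":" (e.g. the ":together" inference-provider suffix)
    if model.startswith(("hf/", "huggingface/")) or "Qwen/" in model or ":" in model:
        return "huggingface"
    # Tier 2: OpenRouter -- "openrouter/", "gpt-", or "claude-" prefixes
    if model.startswith(("openrouter/", "gpt-", "claude-")):
        return "openrouter"
    # Tier 3: Local (Ollama endpoint) -- "local-" prefix, or no prefix at all
    return "ollama"
```

Note that order matters in such a scheme: the Tier 1 check runs first, so a model string containing `:` is routed to Hugging Face even if it also carries another tier's prefix.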

0 commit comments
