```
# Optional: Add this if you want to use OpenRouter models
OPENROUTER_API_KEY=your_openrouter_api_key

# Optional: Add Ollama host if not local. default: http://localhost:11434
OLLAMA_HOST=your_ollama_host
```
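Before starting the backend, you can optionally check that the configured Ollama server is reachable. A minimal Python sketch, assuming a standard Ollama server (whose `GET /api/tags` endpoint lists locally installed models); the `ollama_reachable` helper name is illustrative, not part of this project:

```python
import os
import urllib.error
import urllib.request


def ollama_reachable(host: str = "", timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at the given host."""
    # Fall back to the same default as the .env example above
    host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    try:
        # /api/tags lists locally available models on an Ollama server
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    print("Ollama reachable:", ollama_reachable())
```

If this prints `False`, double-check `OLLAMA_HOST` before starting the backend.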
#### Step 2: Start the Backend
```
OPENROUTER_API_KEY=your_openrouter_api_key # Required for OpenRouter models

# OpenAI API Base URL Configuration
OPENAI_BASE_URL=https://custom-api-endpoint.com/v1 # Optional, for custom OpenAI API endpoints

# Ollama host
OLLAMA_HOST=your_ollama_host # Optional, if Ollama is not local. default: http://localhost:11434

# Configuration Directory
DEEPWIKI_CONFIG_DIR=/path/to/custom/config/dir # Optional, for custom config file location
```
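The optional variables above all have documented defaults. A minimal sketch of how a backend might resolve them from the environment (variable names match the `.env` file; the `Settings` dataclass and the OpenAI default URL are illustrative assumptions, not this project's actual code):

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    # Each field falls back to its documented default when the env var is unset
    openai_base_url: str = field(
        default_factory=lambda: os.environ.get(
            "OPENAI_BASE_URL", "https://api.openai.com/v1"))
    ollama_host: str = field(
        default_factory=lambda: os.environ.get(
            "OLLAMA_HOST", "http://localhost:11434"))
    config_dir: str = field(
        default_factory=lambda: os.environ.get("DEEPWIKI_CONFIG_DIR", ""))


settings = Settings()
print(settings.ollama_host)
```

Reading the environment at instantiation time (via `default_factory`) means the values reflect whatever was exported when the app starts, not when the module was imported.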
### Environment Variables

| Variable | Description | Required | Note |
|----------|-------------|----------|------|
| `GOOGLE_API_KEY` | Google Gemini API key for AI generation | No | Required only if you want to use Google Gemini models |
| `OPENAI_API_KEY` | OpenAI API key for embeddings | Yes | Required even if you're not using OpenAI models, as it's used for embeddings |
| `OPENROUTER_API_KEY` | OpenRouter API key for alternative models | No | Required only if you want to use OpenRouter models |
| `OLLAMA_HOST` | Ollama host (default: http://localhost:11434) | No | Required only if you want to use an external Ollama server |
| `PORT` | Port for the API server (default: 8001) | No | If you host the API and frontend on the same machine, make sure to change the port in `SERVER_BASE_URL` accordingly |
| `SERVER_BASE_URL` | Base URL for the API server (default: http://localhost:8001) | No | |
If you're not using Ollama mode, you need to configure an OpenAI API key for embeddings. Other API keys are only required when configuring and using models from the corresponding providers.
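Because the embedding path always needs `OPENAI_API_KEY` outside of Ollama mode, a fail-fast check at startup can turn a confusing runtime failure into a clear error. A sketch; the `ollama_mode` flag and error message are illustrative, not this project's actual API:

```python
import os


def validate_env(ollama_mode: bool = False) -> None:
    """Raise early if the required embedding API key is missing."""
    if not ollama_mode and not os.environ.get("OPENAI_API_KEY"):
        raise RuntimeError(
            "OPENAI_API_KEY is required for embeddings unless running in Ollama mode")


# In Ollama mode, no OpenAI key is needed, so this passes silently
validate_env(ollama_mode=True)
```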