Commit 2f4351e

doc: add env instructions for Ollama if not local (#141)

* add env instructions for Ollama if not local
* add OLLAMA_HOST instructions to api Readme
* simplify examples

1 parent: acfdc27

File tree

3 files changed (+25, -7 lines)

Ollama-instruction.md

Lines changed: 4 additions & 0 deletions
````diff
@@ -47,6 +47,8 @@ Create a `.env` file in the project root:
 ```
 # No need for API keys when using Ollama locally
 PORT=8001
+# Optionally, provide OLLAMA_HOST if Ollama is not local
+OLLAMA_HOST=your_ollama_host # (default: http://localhost:11434)
 ```
 
 Start the backend:
@@ -78,11 +80,13 @@ npm run dev
 # For regular use
 docker run -p 3000:3000 -p 8001:8001 --name deepwiki \
   -v ~/.adalflow:/root/.adalflow \
+  -e OLLAMA_HOST=your_ollama_host \
   deepwiki:ollama-local
 
 # For local repository analysis
 docker run -p 3000:3000 -p 8001:8001 --name deepwiki \
   -v ~/.adalflow:/root/.adalflow \
+  -e OLLAMA_HOST=your_ollama_host \
   -v /path/to/your/repo:/app/local-repos/repo-name \
   deepwiki:ollama-local
 ```
````
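Before starting the backend against a non-local instance, it helps to confirm the host you plan to put in `OLLAMA_HOST` is actually reachable. A minimal sketch, not part of the commit: it reproduces the documented fallback to `http://localhost:11434` and probes Ollama's `/api/tags` endpoint, which lists installed models.

```shell
# Fall back to the documented default when OLLAMA_HOST is unset or empty
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"

# An Ollama server lists its installed models at /api/tags; curl's
# --fail flag turns an HTTP error into a non-zero exit code, so a
# failure here means the host is unreachable or not serving Ollama
curl --fail --silent "$OLLAMA_HOST/api/tags" || echo "unreachable: $OLLAMA_HOST"
```

If the probe prints a JSON model list, the same value will work in `.env` or as a `-e OLLAMA_HOST=...` docker flag.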

README.md

Lines changed: 18 additions & 7 deletions
````diff
@@ -41,6 +41,8 @@ echo "GOOGLE_API_KEY=your_google_api_key" > .env
 echo "OPENAI_API_KEY=your_openai_api_key" >> .env
 # Optional: Add OpenRouter API key if you want to use OpenRouter models
 echo "OPENROUTER_API_KEY=your_openrouter_api_key" >> .env
+# Optional: Add Ollama host if not local. Defaults to http://localhost:11434
+echo "OLLAMA_HOST=your_ollama_host" >> .env
 
 # Run with Docker Compose
 docker-compose up
@@ -63,6 +65,8 @@ GOOGLE_API_KEY=your_google_api_key
 OPENAI_API_KEY=your_openai_api_key
 # Optional: Add this if you want to use OpenRouter models
 OPENROUTER_API_KEY=your_openrouter_api_key
+# Optional: Add Ollama host if not local. Default: http://localhost:11434
+OLLAMA_HOST=your_ollama_host
 ```
 
 #### Step 2: Start the Backend
@@ -190,6 +194,9 @@ OPENROUTER_API_KEY=your_openrouter_api_key # Required for OpenRouter models
 # OpenAI API Base URL Configuration
 OPENAI_BASE_URL=https://custom-api-endpoint.com/v1 # Optional, for custom OpenAI API endpoints
 
+# Ollama host
+OLLAMA_HOST=your_ollama_host # Optional, if Ollama is not local. Default: http://localhost:11434
+
 # Configuration Directory
 DEEPWIKI_CONFIG_DIR=/path/to/custom/config/dir # Optional, for custom config file location
 ```
@@ -238,13 +245,14 @@ The OpenAI Client's base_url configuration is designed primarily for enterprise
 
 ### Environment Variables
 
-| Variable | Description | Required | Note |
-|----------|-------------|----------|------|
-| `GOOGLE_API_KEY` | Google Gemini API key for AI generation | No | Required only if you want to use Google Gemini models
-| `OPENAI_API_KEY` | OpenAI API key for embeddings | Yes | Note: This is required even if you're not using OpenAI models, as it's used for embeddings. |
-| `OPENROUTER_API_KEY` | OpenRouter API key for alternative models | No | Required only if you want to use OpenRouter models |
-| `PORT` | Port for the API server (default: 8001) | No | If you host API and frontend on the same machine, make sure change port of `SERVER_BASE_URL` accordingly |
-| `SERVER_BASE_URL` | Base URL for the API server (default: http://localhost:8001) | No |
+| Variable             | Description                                                  | Required | Note                                                                                                                |
+|----------------------|--------------------------------------------------------------|----------|---------------------------------------------------------------------------------------------------------------------|
+| `GOOGLE_API_KEY`     | Google Gemini API key for AI generation                      | No       | Required only if you want to use Google Gemini models                                                               |
+| `OPENAI_API_KEY`     | OpenAI API key for embeddings                                | Yes      | Required even if you're not using OpenAI models, as it's used for embeddings.                                       |
+| `OPENROUTER_API_KEY` | OpenRouter API key for alternative models                    | No       | Required only if you want to use OpenRouter models                                                                  |
+| `OLLAMA_HOST`        | Ollama host (default: http://localhost:11434)                | No       | Required only if you want to use an external Ollama server                                                          |
+| `PORT`               | Port for the API server (default: 8001)                      | No       | If you host the API and frontend on the same machine, make sure to change the port of `SERVER_BASE_URL` accordingly |
+| `SERVER_BASE_URL`    | Base URL for the API server (default: http://localhost:8001) | No       |                                                                                                                     |
 
 If you're not using ollama mode, you need to configure an OpenAI API key for embeddings. Other API keys are only required when configuring and using models from the corresponding providers.
 
@@ -261,6 +269,7 @@ docker run -p 8001:8001 -p 3000:3000 \
   -e GOOGLE_API_KEY=your_google_api_key \
   -e OPENAI_API_KEY=your_openai_api_key \
   -e OPENROUTER_API_KEY=your_openrouter_api_key \
+  -e OLLAMA_HOST=your_ollama_host \
   -v ~/.adalflow:/root/.adalflow \
   ghcr.io/asyncfuncai/deepwiki-open:latest
 ```
@@ -290,6 +299,7 @@ You can also mount a .env file to the container:
 echo "GOOGLE_API_KEY=your_google_api_key" > .env
 echo "OPENAI_API_KEY=your_openai_api_key" >> .env
 echo "OPENROUTER_API_KEY=your_openrouter_api_key" >> .env
+echo "OLLAMA_HOST=your_ollama_host" >> .env
 
 # Run the container with the .env file mounted
 docker run -p 8001:8001 -p 3000:3000 \
@@ -322,6 +332,7 @@ docker run -p 8001:8001 -p 3000:3000 \
   -e GOOGLE_API_KEY=your_google_api_key \
   -e OPENAI_API_KEY=your_openai_api_key \
   -e OPENROUTER_API_KEY=your_openrouter_api_key \
+  -e OLLAMA_HOST=your_ollama_host \
   deepwiki-open
 ```
 
````

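One pitfall with the container commands above: inside a container, `localhost` refers to the container itself, so setting `OLLAMA_HOST` to `http://localhost:11434` will not reach an Ollama instance running on the host machine. A hedged sketch of the workaround (image name from the README above; `host.docker.internal` is provided by Docker Desktop, and the `--add-host` mapping supplies it on Linux):

```shell
# Ollama on the Docker host, DeepWiki in a container: map
# host.docker.internal to the host gateway (needed on Linux; Docker
# Desktop defines the name automatically) and point OLLAMA_HOST at it
docker run -p 8001:8001 -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  -v ~/.adalflow:/root/.adalflow \
  ghcr.io/asyncfuncai/deepwiki-open:latest
```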
api/README.md

Lines changed: 3 additions & 0 deletions
````diff
@@ -34,6 +34,9 @@ OPENROUTER_API_KEY=your_openrouter_api_key # Required only if using OpenRouter
 # OpenAI API Configuration
 OPENAI_BASE_URL=https://custom-api-endpoint.com/v1 # Optional, for custom OpenAI API endpoints
 
+# Ollama host
+OLLAMA_HOST=https://your_ollama_host # Optional: Add Ollama host if not local. Default: http://localhost:11434
+
 # Server Configuration
 PORT=8001 # Optional, defaults to 8001
 ```
````
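A malformed `OLLAMA_HOST` value (a missing scheme, or a stray quote picked up from copy/paste) only surfaces later as a connection error, so a quick shell check can catch it up front. A minimal sketch; `check_ollama_host` is a hypothetical helper, not part of the project:

```shell
# Reject OLLAMA_HOST values that contain quote characters or lack an
# http(s):// scheme before the backend ever reads them
check_ollama_host() {
  case "$1" in
    *\"*|*\'*) echo "invalid: contains a quote character"; return 1 ;;
    http://*|https://*) echo "ok: $1" ;;
    *) echo "invalid: missing http(s):// scheme"; return 1 ;;
  esac
}

check_ollama_host "http://192.168.1.50:11434"  # → ok: http://192.168.1.50:11434
check_ollama_host 'https://my_host"' || true   # → invalid: contains a quote character
```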
