
Commit caebf7d

Merge pull request #38 from LikiosSedo/docs/mintlify-site
docs: add Mintlify documentation site
2 parents 2d5d427 + 9c16ba3 commit caebf7d

31 files changed (+1677 −6 lines)

docs/assets/favicon.svg

Lines changed: 4 additions & 0 deletions

docs/assets/logo-dark.svg

Lines changed: 5 additions & 0 deletions

docs/assets/logo-light.svg

Lines changed: 5 additions & 0 deletions

docs/configuration/mcp.mdx

Lines changed: 175 additions & 0 deletions
---
title: "MCP Servers"
sidebarTitle: "MCP Servers"
description: "Connect external data sources to Siclaw via Model Context Protocol."
---

[Model Context Protocol (MCP)](https://modelcontextprotocol.io/) lets you extend Siclaw with external tools and data sources. During an investigation, the agent discovers and uses MCP tools automatically — querying Prometheus metrics, searching GitHub issues, reading files, or calling any custom API.

## Supported Transports

| Transport | Use Case | Config |
|-----------|----------|--------|
| **stdio** | Local processes (npx packages, Python scripts) | `command` + `args` |
| **SSE** | Remote servers (Server-Sent Events) | `url` |
| **streamable-http** | Remote servers (bidirectional HTTP) | `url` |

## Configuration

### Via Web UI (Recommended)

In Gateway mode, go to **Settings** > **MCP Servers**:

1. Click **New Server**
2. Select a transport type
3. Enter a unique name and an optional description
4. Fill in the transport-specific fields (command/args or URL)
5. Add environment variables or HTTP headers as needed
6. Save

Changes take effect immediately — all active sessions reload automatically.

<Note>
Creating and managing MCP servers requires the **admin** role. All users can use the tools they provide.
</Note>

### Via settings.json (CLI Mode)

For TUI / single-user mode, add MCP servers to `~/.siclaw/config/settings.json`:

```json
{
  "mcpServers": {
    "prometheus": {
      "transport": "stdio",
      "command": "npx",
      "args": ["-y", "@prom-mcp/server"],
      "env": {
        "PROMETHEUS_URL": "http://prometheus:9090"
      }
    }
  }
}
```

### Environment Variable Substitution

Use `${VAR_NAME}` in env values — Siclaw resolves them from the process environment at spawn time:

```json
{
  "mcpServers": {
    "github": {
      "transport": "streamable-http",
      "url": "http://localhost:8000/mcp",
      "headers": {
        "Authorization": "Bearer ${GITHUB_TOKEN}"
      }
    }
  }
}
```
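The substitution rule can be sketched in Python. This is a hypothetical re-implementation for illustration, not Siclaw's actual resolver; the behavior for unset variables is an assumption:

```python
import os
import re

def resolve_env_refs(value: str) -> str:
    """Replace ${VAR_NAME} placeholders with values from the process environment.

    Unset variables are left untouched here — an assumption for this sketch,
    not confirmed Siclaw behavior.
    """
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: os.environ.get(m.group(1), m.group(0)),
        value,
    )

os.environ["GITHUB_TOKEN"] = "ghp_example"
print(resolve_env_refs("Bearer ${GITHUB_TOKEN}"))  # → Bearer ghp_example
```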
## Examples

### Prometheus Metrics

```json
{
  "prometheus": {
    "transport": "stdio",
    "command": "npx",
    "args": ["-y", "@prom-mcp/server"],
    "env": {
      "PROMETHEUS_URL": "http://prometheus.monitoring:9090"
    }
  }
}
```

The agent can then query metrics during an investigation:

```
Phase 1: Context Gathering
[mcp:prometheus] rate(http_request_duration_seconds_sum{service="payment"}[5m])
[kubectl] get pods -n payments
```

### Filesystem Access

```json
{
  "filesystem": {
    "transport": "stdio",
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"]
  }
}
```

### GitHub

```json
{
  "github": {
    "transport": "stdio",
    "command": "npx",
    "args": ["-y", "@github/mcp-server"],
    "env": {
      "GITHUB_TOKEN": "${GITHUB_TOKEN}"
    }
  }
}
```

### Custom HTTP Service

Any service that implements the MCP protocol can be connected:

```json
{
  "my-service": {
    "transport": "streamable-http",
    "url": "https://mcp.internal.example.com/v1",
    "headers": {
      "Authorization": "Bearer ${MY_SERVICE_TOKEN}"
    }
  }
}
```

## How It Works

### Tool Discovery

When a session starts, Siclaw connects to all enabled MCP servers and discovers their tools. MCP tools appear alongside built-in tools with an `mcp__` prefix:

```
mcp__prometheus__query       ← from Prometheus MCP server
mcp__github__search_issues   ← from GitHub MCP server
mcp__filesystem__read_file   ← from Filesystem MCP server
```

The agent decides which tools to use based on the investigation context — no manual tool selection is needed.
### Config Sync (Gateway Mode)

In multi-user deployments, MCP configuration syncs automatically:

```
Admin creates/edits MCP server in Web UI
→ Saved to database
→ Gateway notifies all active AgentBoxes
→ Each AgentBox fetches merged config
→ Active sessions reload with new tools
```

The merge strategy: DB entries (managed via Web UI) override local seed entries with the same name. Disabling a DB entry removes it from the merged config.
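That merge rule can be sketched as follows. The `enabled` field name and the dict shapes are assumptions for illustration, not Siclaw's actual schema:

```python
def merge_mcp_config(seed: dict, db: dict) -> dict:
    """Overlay DB-managed MCP entries on top of local seed entries.

    DB entries win on name collisions; a disabled DB entry removes the
    server from the merged view entirely.
    """
    merged = dict(seed)
    for name, entry in db.items():
        if entry.get("enabled", True):
            merged[name] = entry
        else:
            merged.pop(name, None)  # disabling drops it from the merged config
    return merged

seed = {"prometheus": {"transport": "stdio", "command": "npx"}}
db = {
    "prometheus": {"transport": "sse", "url": "http://mcp:8080/sse", "enabled": True},
    "github": {"transport": "stdio", "enabled": False},
}
print(merge_mcp_config(seed, db))
# prometheus now uses the DB entry; github is absent because it is disabled
```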
### Kubernetes Considerations

<Warning>
In Kubernetes mode, MCP server binaries must be available in the AgentBox container image. For `stdio` transport servers that use `npx`, ensure the package can be installed inside the pod (network access required) or pre-install it in the image.
</Warning>

For HTTP-based transports (`sse`, `streamable-http`), the MCP server runs externally — the AgentBox pod only needs network access to the URL.

docs/configuration/providers.mdx

Lines changed: 147 additions & 0 deletions
---
title: "LLM Providers"
sidebarTitle: "LLM Providers"
description: "Configure Siclaw to use Anthropic, OpenAI, Ollama, or any compatible LLM."
---

Siclaw needs an LLM to power its investigation engine. Configure one via `~/.siclaw/config/settings.json` or environment variable overrides.

## Anthropic (Recommended)

```json
{
  "providers": {
    "default": {
      "baseUrl": "https://api.anthropic.com/v1",
      "apiKey": "sk-ant-...",
      "api": "anthropic",
      "authHeader": true,
      "models": [{
        "id": "claude-sonnet-4-20250514",
        "name": "Claude Sonnet 4",
        "contextWindow": 200000,
        "maxTokens": 16000
      }]
    }
  }
}
```

Recommended models: `claude-sonnet-4-20250514` (best balance) or `claude-opus-4-20250514` (highest quality).

## OpenAI

```json
{
  "providers": {
    "default": {
      "baseUrl": "https://api.openai.com/v1",
      "apiKey": "sk-...",
      "api": "openai-completions",
      "authHeader": true,
      "models": [{
        "id": "gpt-4o",
        "name": "GPT-4o",
        "contextWindow": 128000,
        "maxTokens": 16384
      }]
    }
  }
}
```

## OpenAI-Compatible Providers

Any API that implements the OpenAI chat completions format works with Siclaw — Ollama, vLLM, LiteLLM, Azure OpenAI, Moonshot, DeepSeek, and many others.

```json
{
  "providers": {
    "default": {
      "baseUrl": "http://localhost:11434/v1",
      "apiKey": "ollama",
      "api": "openai-completions",
      "authHeader": true,
      "models": [{
        "id": "llama3.1:70b",
        "name": "Llama 3.1 70B",
        "contextWindow": 131072,
        "maxTokens": 8192
      }]
    }
  }
}
```

### Common Providers

| Provider | `baseUrl` | Notes |
|----------|-----------|-------|
| **Ollama** | `http://localhost:11434/v1` | Local, free. Use 70B+ models for best results. |
| **vLLM** | `http://localhost:8000/v1` | Self-hosted GPU inference |
| **Moonshot (Kimi)** | `https://api.moonshot.cn/v1` | `moonshot-v1-128k` |
| **DeepSeek** | `https://api.deepseek.com/v1` | `deepseek-chat` |
| **Qwen (DashScope)** | `https://dashscope.aliyuncs.com/compatible-mode/v1` | `qwen-plus` |

<Tip>
See [`settings.example.json`](https://github.com/scitix/siclaw/blob/main/settings.example.json) for a complete example with all fields.
</Tip>

## Environment Variable Overrides

These override the default provider's settings at runtime (highest priority):

```bash
SICLAW_LLM_API_KEY=sk-...        # Override the default provider's API key
SICLAW_LLM_BASE_URL=https://...  # Override the default provider's base URL
SICLAW_LLM_MODEL=gpt-4o          # Override the default model ID
```
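The resulting precedence can be sketched as a small resolver (hypothetical; the real logic lives inside Siclaw, and the flat `model` key here is an illustration rather than the actual schema):

```python
import os

def effective_default_provider(settings: dict) -> dict:
    """Overlay SICLAW_LLM_* environment variables on the default provider."""
    provider = dict(settings.get("providers", {}).get("default", {}))
    if os.environ.get("SICLAW_LLM_API_KEY"):
        provider["apiKey"] = os.environ["SICLAW_LLM_API_KEY"]
    if os.environ.get("SICLAW_LLM_BASE_URL"):
        provider["baseUrl"] = os.environ["SICLAW_LLM_BASE_URL"]
    if os.environ.get("SICLAW_LLM_MODEL"):
        provider["model"] = os.environ["SICLAW_LLM_MODEL"]  # env vars win when set
    return provider

os.environ["SICLAW_LLM_MODEL"] = "gpt-4o"
settings = {"providers": {"default": {"apiKey": "sk-file", "baseUrl": "https://api.openai.com/v1"}}}
print(effective_default_provider(settings))
```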
API keys also support `$VAR` / `${VAR}` references in `settings.json`:

```json
{
  "providers": {
    "default": {
      "apiKey": "${MY_API_KEY}",
      "baseUrl": "https://api.openai.com/v1",
      "api": "openai-completions",
      "models": [{ "id": "gpt-4o", "name": "GPT-4o" }]
    }
  }
}
```

## Embedding Provider

<Warning>
Without an embedding provider, Investigation Memory semantic search is disabled. All other features work normally.
</Warning>

Embedding is used for memory search — matching current symptoms against past investigation records. Any OpenAI-compatible embedding API works:

```json
{
  "embedding": {
    "baseUrl": "https://api.example.com/v1",
    "apiKey": "sk-...",
    "model": "bge-m3",
    "dimensions": 1024
  }
}
```

| Provider | Model | Dimensions | Notes |
|----------|-------|------------|-------|
| **BGE-M3** (recommended) | `bge-m3` | 1024 | Multilingual, good for technical content |
| **OpenAI** | `text-embedding-3-small` | 1536 | Easy setup if you already have an OpenAI key |
| **Ollama** | `nomic-embed-text` | 768 | Local, free |
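For reference, an OpenAI-compatible embedding call with this config looks roughly like the sketch below. It only constructs the request (no network call); the `/embeddings` path and `{"model", "input"}` payload follow the OpenAI embeddings convention, which compatible servers are assumed to share:

```python
import json
import urllib.request

def build_embedding_request(cfg: dict, texts: list[str]) -> urllib.request.Request:
    """Construct a POST to the OpenAI-compatible /embeddings endpoint."""
    body = json.dumps({"model": cfg["model"], "input": texts}).encode()
    return urllib.request.Request(
        cfg["baseUrl"].rstrip("/") + "/embeddings",
        data=body,
        headers={
            "Authorization": f"Bearer {cfg['apiKey']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

cfg = {"baseUrl": "https://api.example.com/v1", "apiKey": "sk-...", "model": "bge-m3"}
req = build_embedding_request(cfg, ["payment pods CrashLoopBackOff after deploy"])
print(req.full_url)  # → https://api.example.com/v1/embeddings
```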
## Model Recommendations

| Use Case | Recommended | Notes |
|----------|-------------|-------|
| **Production investigations** | Claude Sonnet 4 / GPT-4o | Best quality-to-speed ratio |
| **Complex root cause analysis** | Claude Opus 4 | Highest reasoning capability |
| **Cost-sensitive / air-gapped** | Llama 3.1 70B+ via Ollama | Local, no API costs |
| **Testing / development** | Any available model | Smaller models work for basic checks |

docs/design/decisions.md

Lines changed: 6 additions & 0 deletions
---
title: "Architecture Decision Records"
sidebarTitle: "ADRs"
description: "Key architectural decisions with context, rationale, and consequences."
---

# Architecture Decision Records (ADR)

> **Format**: Context → Decision → Consequences
