
Commit 5b9f3d3

feat(docs): added additional self-hosting documentation (#2237)
* feat(docs): added additional self-hosting documentation
* added more
1 parent 05022e3 commit 5b9f3d3

9 files changed: +724 -1 lines changed

README.md

Lines changed: 70 additions & 0 deletions
@@ -89,6 +89,36 @@ Wait for the model to download, then visit [http://localhost:3000](http://localh
docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b
```

#### Using an External Ollama Instance

If you already have Ollama running on your host machine (outside Docker), you need to set `OLLAMA_URL` to use `host.docker.internal` instead of `localhost`:

```bash
# Docker Desktop (macOS/Windows)
OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

# Linux (use your host's IP, or add extra_hosts; see below)
OLLAMA_URL=http://192.168.1.100:11434 docker compose -f docker-compose.prod.yml up -d
```

**Why?** When running inside Docker, `localhost` refers to the container itself, not your host machine. `host.docker.internal` is a special DNS name that resolves to the host.

For Linux users, you can either:

- Use your host machine's actual IP address (e.g., `http://192.168.1.100:11434`)
- Add `extra_hosts: ["host.docker.internal:host-gateway"]` to the `simstudio` service in your compose file (see the sketch below)

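On Linux, a small compose override is one way to wire this up. The sketch below assumes the override file name `docker-compose.override.yml` and the `simstudio` service name used by the production compose file; adjust to match your setup:

```bash
# Create a compose override that maps host.docker.internal to the host gateway
cat > docker-compose.override.yml << 'EOF'
services:
  simstudio:
    extra_hosts:
      - "host.docker.internal:host-gateway"
EOF

# Merge the override with the production compose file
OLLAMA_URL=http://host.docker.internal:11434 \
  docker compose -f docker-compose.prod.yml -f docker-compose.override.yml up -d
```
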
#### Using vLLM

Sim also supports [vLLM](https://docs.vllm.ai/) for self-hosted models with an OpenAI-compatible API:

```bash
# Set these environment variables
VLLM_BASE_URL=http://your-vllm-server:8000
VLLM_API_KEY=your_optional_api_key # Only if your vLLM instance requires auth
```

When running with Docker, use `host.docker.internal` if vLLM is on your host machine (same as Ollama above).

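To confirm the server is reachable before wiring it into Sim, you can query the OpenAI-compatible models route directly. This is a quick sketch that assumes the standard `/v1/models` path and reuses the placeholder URL from above:

```bash
# List the models served by the vLLM instance
curl http://your-vllm-server:8000/v1/models

# If the instance requires auth, pass the key as a bearer token
curl -H "Authorization: Bearer $VLLM_API_KEY" http://your-vllm-server:8000/v1/models
```
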
### Self-hosted: Dev Containers

1. Open VS Code with the [Remote - Containers extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers)
@@ -190,6 +220,46 @@ Copilot is a Sim-managed service. To use Copilot on a self-hosted instance:
- Go to https://sim.ai → Settings → Copilot and generate a Copilot API key
- Set the `COPILOT_API_KEY` environment variable in your self-hosted `apps/sim/.env` file to that value

## Environment Variables

Key environment variables for self-hosted deployments (see `apps/sim/.env.example` for the full list):

| Variable | Required | Description |
|----------|----------|-------------|
| `DATABASE_URL` | Yes | PostgreSQL connection string with pgvector |
| `BETTER_AUTH_SECRET` | Yes | Auth secret (`openssl rand -hex 32`) |
| `BETTER_AUTH_URL` | Yes | Your app URL (e.g., `http://localhost:3000`) |
| `NEXT_PUBLIC_APP_URL` | Yes | Public app URL (same as above) |
| `ENCRYPTION_KEY` | Yes | Encryption key (`openssl rand -hex 32`) |
| `OLLAMA_URL` | No | Ollama server URL (default: `http://localhost:11434`) |
| `VLLM_BASE_URL` | No | vLLM server URL for self-hosted models |
| `COPILOT_API_KEY` | No | API key from sim.ai for Copilot features |

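Putting the required rows together, a minimal `.env` for a local Docker deployment might look like the sketch below. The `DATABASE_URL` matches the value used in the production setup guide; replace the placeholders with generated secrets:

```bash
DATABASE_URL=postgresql://postgres:postgres@db:5432/simstudio
BETTER_AUTH_SECRET=<openssl rand -hex 32>
BETTER_AUTH_URL=http://localhost:3000
NEXT_PUBLIC_APP_URL=http://localhost:3000
ENCRYPTION_KEY=<openssl rand -hex 32>
```
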
## Troubleshooting

### Ollama models not showing in dropdown (Docker)

If you're running Ollama on your host machine and Sim in Docker, change `OLLAMA_URL` from `localhost` to `host.docker.internal`:

```bash
OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d
```

See [Using an External Ollama Instance](#using-an-external-ollama-instance) for details.

### Database connection issues

Ensure PostgreSQL has the pgvector extension installed. When using Docker, wait for the database to be healthy before running migrations.

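One quick check, sketched here against the default `db` service and `postgres` credentials from `docker-compose.prod.yml`, is to ask PostgreSQL for the extension directly:

```bash
# Verify (or create) the pgvector extension inside the simstudio database
docker compose -f docker-compose.prod.yml exec db psql -U postgres -d simstudio -c "CREATE EXTENSION IF NOT EXISTS vector;"
```
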

### Port conflicts

If ports 3000, 3002, or 5432 are in use, configure alternatives:

```bash
# Custom ports
NEXT_PUBLIC_APP_URL=http://localhost:3100 POSTGRES_PORT=5433 docker compose up -d
```

## Tech Stack

- **Framework**: [Next.js](https://nextjs.org/) (App Router)

apps/docs/content/docs/en/meta.json

Lines changed: 2 additions & 1 deletion
@@ -13,7 +13,8 @@
```diff
 "variables",
 "execution",
 "permissions",
-"sdks"
+"sdks",
+"self-hosting"
 ],
 "defaultOpen": false
 }
```
Lines changed: 150 additions & 0 deletions
@@ -0,0 +1,150 @@
---
title: Docker
description: Deploy Sim Studio with Docker Compose
---

import { Tab, Tabs } from 'fumadocs-ui/components/tabs'
import { Callout } from 'fumadocs-ui/components/callout'

## Quick Start

```bash
# Clone and start
git clone https://github.com/simstudioai/sim.git && cd sim
docker compose -f docker-compose.prod.yml up -d
```

Open [http://localhost:3000](http://localhost:3000)

## Production Setup

### 1. Configure Environment

```bash
# Generate secrets
cat > .env << EOF
DATABASE_URL=postgresql://postgres:postgres@db:5432/simstudio
BETTER_AUTH_SECRET=$(openssl rand -hex 32)
ENCRYPTION_KEY=$(openssl rand -hex 32)
INTERNAL_API_SECRET=$(openssl rand -hex 32)
NEXT_PUBLIC_APP_URL=https://sim.yourdomain.com
BETTER_AUTH_URL=https://sim.yourdomain.com
NEXT_PUBLIC_SOCKET_URL=https://sim.yourdomain.com
EOF
```

### 2. Start Services

```bash
docker compose -f docker-compose.prod.yml up -d
```

### 3. Set Up SSL

<Tabs items={['Caddy (Recommended)', 'Nginx + Certbot']}>
<Tab value="Caddy (Recommended)">

Caddy automatically handles SSL certificates.

```bash
# Install Caddy
sudo apt install -y debian-keyring debian-archive-keyring apt-transport-https curl
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' | sudo gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' | sudo tee /etc/apt/sources.list.d/caddy-stable.list
sudo apt update && sudo apt install caddy
```

Create `/etc/caddy/Caddyfile`:
```
sim.yourdomain.com {
    reverse_proxy localhost:3000

    handle /socket.io/* {
        reverse_proxy localhost:3002
    }
}
```

```bash
sudo systemctl restart caddy
```

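If Caddy refuses to start after this change, validating the file is a quick optional check that surfaces syntax errors:

```bash
# Check the Caddyfile for errors without restarting the service
caddy validate --config /etc/caddy/Caddyfile
```
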
</Tab>
<Tab value="Nginx + Certbot">

```bash
# Install
sudo apt install nginx certbot python3-certbot-nginx -y
```

Create `/etc/nginx/sites-available/sim`:
```
server {
    listen 80;
    server_name sim.yourdomain.com;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /socket.io/ {
        proxy_pass http://127.0.0.1:3002;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

```bash
# Enable and get certificate
sudo ln -s /etc/nginx/sites-available/sim /etc/nginx/sites-enabled/
sudo certbot --nginx -d sim.yourdomain.com
```

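If the site does not come up after these steps, checking the configuration and reloading manually usually pinpoints the problem:

```bash
# Test the nginx configuration and reload it if the test passes
sudo nginx -t && sudo systemctl reload nginx
```
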
</Tab>
</Tabs>
## Ollama

```bash
# With GPU
docker compose -f docker-compose.ollama.yml --profile gpu --profile setup up -d

# CPU only
docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d
```

Pull additional models:
```bash
docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.2
```

### External Ollama

If Ollama runs on your host machine (not in Docker):

```bash
# macOS/Windows
OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

# Linux - use your host IP
OLLAMA_URL=http://192.168.1.100:11434 docker compose -f docker-compose.prod.yml up -d
```

<Callout type="warning">
Inside Docker, `localhost` refers to the container, not your host. Use `host.docker.internal` or your host's IP.
</Callout>

## Commands

```bash
# View logs
docker compose -f docker-compose.prod.yml logs -f simstudio

# Stop
docker compose -f docker-compose.prod.yml down

# Update
docker compose -f docker-compose.prod.yml pull && docker compose -f docker-compose.prod.yml up -d

# Backup database
docker compose -f docker-compose.prod.yml exec db pg_dump -U postgres simstudio > backup.sql
```
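
To restore a dump created this way, the SQL can be replayed through the same `db` service; a sketch assuming the default `postgres` user and `simstudio` database:

```bash
# Replay the SQL dump into the simstudio database
docker compose -f docker-compose.prod.yml exec -T db psql -U postgres simstudio < backup.sql
```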
Lines changed: 87 additions & 0 deletions
@@ -0,0 +1,87 @@
---
title: Environment Variables
description: Configuration reference for Sim Studio
---

import { Callout } from 'fumadocs-ui/components/callout'

## Required

| Variable | Description |
|----------|-------------|
| `DATABASE_URL` | PostgreSQL connection string |
| `BETTER_AUTH_SECRET` | Auth secret: generate with `openssl rand -hex 32` |
| `BETTER_AUTH_URL` | Your app URL |
| `ENCRYPTION_KEY` | Encryption key: generate with `openssl rand -hex 32` |
| `INTERNAL_API_SECRET` | Internal API secret: generate with `openssl rand -hex 32` |
| `NEXT_PUBLIC_APP_URL` | Public app URL |
| `NEXT_PUBLIC_SOCKET_URL` | WebSocket URL (default: `http://localhost:3002`) |

## AI Providers

| Variable | Provider |
|----------|----------|
| `OPENAI_API_KEY` | OpenAI |
| `ANTHROPIC_API_KEY_1` | Anthropic Claude |
| `GEMINI_API_KEY_1` | Google Gemini |
| `MISTRAL_API_KEY` | Mistral |
| `OLLAMA_URL` | Ollama (default: `http://localhost:11434`) |

<Callout type="info">
For load balancing, add multiple keys with `_1`, `_2`, `_3` suffixes (e.g., `OPENAI_API_KEY_1`, `OPENAI_API_KEY_2`). Works with OpenAI, Anthropic, and Gemini.
</Callout>

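As a concrete illustration of the suffix scheme (the key values are placeholders):

```bash
# Two OpenAI keys; requests are load-balanced across them
OPENAI_API_KEY_1=sk-...
OPENAI_API_KEY_2=sk-...
```
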
<Callout type="info">
In Docker, use `OLLAMA_URL=http://host.docker.internal:11434` for host-machine Ollama.
</Callout>

### Azure OpenAI

| Variable | Description |
|----------|-------------|
| `AZURE_OPENAI_API_KEY` | Azure OpenAI API key |
| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI endpoint URL |
| `AZURE_OPENAI_API_VERSION` | API version (e.g., `2024-02-15-preview`) |

### vLLM (Self-Hosted)

| Variable | Description |
|----------|-------------|
| `VLLM_BASE_URL` | vLLM server URL (e.g., `http://localhost:8000/v1`) |
| `VLLM_API_KEY` | Optional bearer token for vLLM |

## OAuth Providers

| Variable | Description |
|----------|-------------|
| `GOOGLE_CLIENT_ID` | Google OAuth client ID |
| `GOOGLE_CLIENT_SECRET` | Google OAuth client secret |
| `GITHUB_CLIENT_ID` | GitHub OAuth client ID |
| `GITHUB_CLIENT_SECRET` | GitHub OAuth client secret |

## Optional

| Variable | Description |
|----------|-------------|
| `API_ENCRYPTION_KEY` | Encrypts stored API keys: generate with `openssl rand -hex 32` |
| `COPILOT_API_KEY` | API key for Copilot features |
| `ADMIN_API_KEY` | Admin API key for GitOps operations |
| `RESEND_API_KEY` | Email service for notifications |
| `ALLOWED_LOGIN_DOMAINS` | Restrict signups to domains (comma-separated) |
| `ALLOWED_LOGIN_EMAILS` | Restrict signups to specific emails (comma-separated) |
| `DISABLE_REGISTRATION` | Set to `true` to disable new user signups |

## Example .env

```bash
DATABASE_URL=postgresql://postgres:postgres@db:5432/simstudio
BETTER_AUTH_SECRET=<openssl rand -hex 32>
BETTER_AUTH_URL=https://sim.yourdomain.com
ENCRYPTION_KEY=<openssl rand -hex 32>
INTERNAL_API_SECRET=<openssl rand -hex 32>
NEXT_PUBLIC_APP_URL=https://sim.yourdomain.com
NEXT_PUBLIC_SOCKET_URL=https://sim.yourdomain.com
OPENAI_API_KEY=sk-...
```

See `apps/sim/.env.example` for all options.
Lines changed: 50 additions & 0 deletions
@@ -0,0 +1,50 @@
---
title: Self-Hosting
description: Deploy Sim Studio on your own infrastructure
---

import { Card, Cards } from 'fumadocs-ui/components/card'
import { Callout } from 'fumadocs-ui/components/callout'

Deploy Sim Studio on your own infrastructure with Docker or Kubernetes.

## Requirements

| Resource | Minimum | Recommended |
|----------|---------|-------------|
| CPU | 2 cores | 4+ cores |
| RAM | 12 GB | 16+ GB |
| Storage | 20 GB SSD | 50+ GB SSD |
| Docker | 20.10+ | Latest |

## Quick Start

```bash
git clone https://github.com/simstudioai/sim.git && cd sim
docker compose -f docker-compose.prod.yml up -d
```

Open [http://localhost:3000](http://localhost:3000)

## Deployment Options

<Cards>
  <Card title="Docker" href="/self-hosting/docker">
    Deploy with Docker Compose on any server
  </Card>
  <Card title="Kubernetes" href="/self-hosting/kubernetes">
    Deploy with Helm on Kubernetes clusters
  </Card>
  <Card title="Cloud Platforms" href="/self-hosting/platforms">
    Railway, DigitalOcean, AWS, Azure, GCP guides
  </Card>
</Cards>

## Architecture

| Component | Port | Description |
|-----------|------|-------------|
| simstudio | 3000 | Main application |
| realtime | 3002 | WebSocket server |
| db | 5432 | PostgreSQL with pgvector |
| migrations | - | Database migrations (runs once) |
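
After a Docker Compose deployment, one quick way to confirm these components are up (a sketch assuming the `docker-compose.prod.yml` setup from the Quick Start):

```bash
# List service status; everything except the one-shot migrations service should be running
docker compose -f docker-compose.prod.yml ps
```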
