
Commit 589259a

Merge pull request #44 from thefirehacker/3.5.0_Playground
3.5.0 playground
2 parents 6114489 + 99840fe commit 589259a

File tree: 9 files changed, +1683 -46 lines changed

.dockerignore

Lines changed: 80 additions & 0 deletions
@@ -0,0 +1,80 @@

```
# Version control
.git
.gitignore
.gitattributes

# Docker files
Dockerfile
docker-compose.yml
.dockerignore

# Documentation
README.md
*.md
docs/
doc/

# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Environment files
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# IDE and editor files
.vscode/
.idea/
*.swp
*.swo
*~

# OS generated files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db

# Logs
logs
*.log

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Coverage directory used by tools like istanbul
coverage/

# Dependency directories
jspm_packages/

# Optional npm cache directory
.npm

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# Build outputs
dist/
build/

# Temporary files
tmp/
temp/
```

DOCKER.md

Lines changed: 309 additions & 0 deletions
@@ -0,0 +1,309 @@

# 🐳 Docker Support for TimeCapsule-SLM

Run TimeCapsule-SLM in a containerized environment with full AI support including Ollama integration.

## 🚀 Quick Start

### Option 1: Docker Run (Simple)
```bash
# Pull and run the latest image
docker run -d -p 3000:80 --name timecapsule-slm timecapsule-slm:latest

# Access at http://localhost:3000
```

### Option 2: Docker Compose (Recommended)
```bash
# Clone the repository
git clone https://github.com/thefirehacker/TimeCapsule-SLM.git
cd TimeCapsule-SLM

# Start TimeCapsule-SLM only
docker-compose up -d

# Start with Ollama AI support
docker-compose --profile ai-enabled up -d

# Access at http://localhost:3000
```

---

## 📋 Prerequisites

- **Docker**: Version 20.10 or higher
- **Docker Compose**: Version 2.0 or higher
- **RAM**: 4GB+ (8GB+ recommended with AI services)
- **Storage**: 2GB+ free space (10GB+ with AI models)
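
A quick sanity check for these prerequisites; the commands below are standard Docker and Linux tooling (adjust the RAM/disk checks for your OS):

```bash
# Confirm Docker and Compose versions meet the minimums above
docker --version
docker compose version   # or: docker-compose --version

# Check available RAM and free disk space (Linux)
free -h
df -h .
```
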
---

## 🛠️ Build from Source

```bash
# Clone the repository
git clone https://github.com/thefirehacker/TimeCapsule-SLM.git
cd TimeCapsule-SLM

# Build the Docker image
docker build -t timecapsule-slm:latest .

# Run the container
docker run -d -p 3000:80 --name timecapsule-slm timecapsule-slm:latest
```
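
To confirm the build succeeded and the container is serving, a few standard checks (image and container names follow the commands above):

```bash
# Verify the image was built
docker image ls timecapsule-slm:latest

# Verify the container is running and responding on port 3000
docker ps --filter "name=timecapsule-slm"
curl -I http://localhost:3000
```
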
---

## 🤖 AI Integration Options

### Option 1: With Ollama (Local AI)
```bash
# Start TimeCapsule-SLM with Ollama
docker-compose --profile ai-enabled up -d

# Pull a model in the Ollama container
docker exec timecapsule-ollama ollama pull qwen2.5:0.5b

# Verify Ollama is running
curl http://localhost:11434/api/version
```
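
Once the model is pulled, you can confirm it is available and responding; `/api/tags` is Ollama's model-listing endpoint, and the one-shot prompt is just a smoke test:

```bash
# List models known to the Ollama container
curl http://localhost:11434/api/tags
docker exec timecapsule-ollama ollama list

# Optional: quick generation smoke test
docker exec timecapsule-ollama ollama run qwen2.5:0.5b "Say hello in one sentence."
```
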
### Option 2: External AI Services
```bash
# Start without AI containers (use external Ollama/LM Studio)
docker-compose up -d timecapsule-slm

# Configure external AI in TimeCapsule-SLM UI:
# - Ollama: http://host.docker.internal:11434
# - LM Studio: http://host.docker.internal:1234
# - OpenAI API: Enter your API key
```
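
Before pointing the UI at an external service, it helps to confirm the service is reachable from the host; the LM Studio path below assumes its OpenAI-compatible local server is enabled:

```bash
# External Ollama on the host
curl http://localhost:11434/api/version

# External LM Studio local server (OpenAI-compatible API)
curl http://localhost:1234/v1/models
```
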
---

## ⚙️ Configuration

### Environment Variables
Create a `.env` file for customization:

```bash
# .env
TIMECAPSULE_PORT=3000
OLLAMA_PORT=11434
OLLAMA_ORIGINS=http://localhost:3000
RESTART_POLICY=unless-stopped
```
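
Compose reads a `.env` file from the project directory automatically; to see the values actually substituted into the configuration (assuming the compose file references these variables), render it with:

```bash
# Print the fully resolved configuration with .env values applied
docker-compose config
```
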
### Custom docker-compose Override
```yaml
# docker-compose.override.yml
version: '3.8'
services:
  timecapsule-slm:
    ports:
      - "8080:80"  # Custom port
    environment:
      - CUSTOM_VAR=value
```
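
`docker-compose.override.yml` is merged on top of `docker-compose.yml` automatically; the explicit form below does the same thing and makes the merge order visible:

```bash
# Equivalent explicit merge of the base file and the override
docker-compose -f docker-compose.yml -f docker-compose.override.yml up -d
```
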
---

## 📊 Service Management

### Basic Commands
```bash
# Start services
docker-compose up -d

# Stop services
docker-compose down

# View logs
docker-compose logs -f

# Restart services
docker-compose restart

# Update to latest
docker-compose pull && docker-compose up -d
```

### Health Checks
```bash
# Check service status
docker-compose ps

# Check container health
docker inspect --format='{{.State.Health.Status}}' timecapsule-slm

# View health check logs
docker inspect --format='{{range .State.Health.Log}}{{.Output}}{{end}}' timecapsule-slm
```

---

## 💾 Data Persistence

### Ollama Models
Models are automatically persisted in Docker volumes:
```bash
# List Ollama models
docker exec timecapsule-ollama ollama list

# Backup Ollama data
docker run --rm -v timecapsule-slm_ollama_data:/data -v $(pwd):/backup alpine tar czf /backup/ollama-backup.tar.gz -C /data .

# Restore Ollama data
docker run --rm -v timecapsule-slm_ollama_data:/data -v $(pwd):/backup alpine tar xzf /backup/ollama-backup.tar.gz -C /data
```
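
The volume name in the backup commands above follows Compose's `<project>_ollama_data` naming; if your project directory differs, confirm the actual name first:

```bash
# Find the actual Ollama volume name before backing it up
docker volume ls | grep ollama
docker volume inspect timecapsule-slm_ollama_data
```
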
### User Data
TimeCapsule-SLM stores data in browser localStorage. For enterprise use, consider:
- External database integration
- Shared volume mounts
- Network storage solutions
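
As a minimal sketch of the shared-volume option: the bind mount below is hypothetical — `/usr/share/nginx/html/shared` is not a path the application is documented to use, only an illustration of exposing host storage inside the container.

```bash
# Hypothetical example: expose a host directory inside the container
# (the in-container path is illustrative, not an application requirement)
docker run -d -p 3000:80 \
  -v "$(pwd)/timecapsule-data:/usr/share/nginx/html/shared" \
  --name timecapsule-slm timecapsule-slm:latest
```
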
---

## 🌐 Production Deployment

### Reverse Proxy (Nginx)
```nginx
# nginx.conf
server {
    listen 80;
    server_name timecapsule.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
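
After adding this server block, a standard validate-and-reload plus a host-header check confirms the proxy path (assuming a systemd-managed Nginx):

```bash
# Validate and reload Nginx, then test the proxied route
sudo nginx -t && sudo systemctl reload nginx
curl -I -H "Host: timecapsule.yourdomain.com" http://localhost/
```
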
### Traefik Labels (Already included)
```yaml
labels:
  - "traefik.enable=true"
  - "traefik.http.routers.timecapsule.rule=Host(`timecapsule.yourdomain.com`)"
  - "traefik.http.services.timecapsule.loadbalancer.server.port=80"
```

### Docker Swarm
```bash
# Deploy to swarm
docker stack deploy -c docker-compose.yml timecapsule-stack

# Scale services
docker service scale timecapsule-stack_timecapsule-slm=3
```
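
Stack deploys require swarm mode to be active; the commands below initialize it if needed and show the state of the deployed services:

```bash
# One-time: enable swarm mode on this host
docker swarm init

# Inspect the deployed stack and its replicas
docker stack services timecapsule-stack
docker service ps timecapsule-stack_timecapsule-slm
```
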
---

## 🔧 Troubleshooting

### Common Issues

**Port Already in Use**
```bash
# Check what's using the port
sudo lsof -i :3000

# Use a different host port (e.g. TIMECAPSULE_PORT in .env or the
# docker-compose.override.yml example above), then restart
docker-compose up -d
```
**Ollama Connection Failed**
```bash
# Check Ollama service
docker-compose logs ollama

# Test Ollama API
curl http://localhost:11434/api/version

# Restart Ollama
docker-compose restart ollama
```

**CORS Issues**
```bash
# Verify OLLAMA_ORIGINS environment
docker-compose exec ollama env | grep OLLAMA_ORIGINS

# Update origins in docker-compose.yml
environment:
  - OLLAMA_ORIGINS=http://localhost:3000,https://yourdomain.com
```
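
Environment changes in `docker-compose.yml` only take effect once the container is recreated, so after editing `OLLAMA_ORIGINS`:

```bash
# Recreate the Ollama container with the updated environment
docker-compose up -d --force-recreate ollama
docker-compose exec ollama env | grep OLLAMA_ORIGINS
```
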
### Logs and Debugging
```bash
# View all logs
docker-compose logs

# Follow specific service logs
docker-compose logs -f timecapsule-slm

# Debug container
docker-compose exec timecapsule-slm /bin/sh

# Check container resource usage
docker stats
```

---

## 🚀 Performance Optimization

### Resource Limits
```yaml
# docker-compose.yml
services:
  timecapsule-slm:
    deploy:
      resources:
        limits:
          memory: 512M
          cpus: '0.5'
        reservations:
          memory: 256M
          cpus: '0.25'
```
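
Note that `deploy.resources` is honored by swarm deployments and by newer Docker Compose releases (older `docker-compose` may ignore it without the `--compatibility` flag); `docker stats` shows whether the memory limit is actually applied:

```bash
# Confirm the memory limit is in effect (see the MEM USAGE / LIMIT column)
docker stats --no-stream timecapsule-slm
```
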
### Caching
```dockerfile
# Multi-stage build for better caching
FROM nginx:alpine AS base
# ... optimization steps
```

---

## 📚 Additional Resources

- **Docker Hub**: [timecapsule-slm](https://hub.docker.com/r/firehacker/timecapsule-slm)
- **GitHub**: [TimeCapsule-SLM](https://github.com/thefirehacker/TimeCapsule-SLM)
- **Documentation**: [Main README](README.md)
- **Issues**: [GitHub Issues](https://github.com/thefirehacker/TimeCapsule-SLM/issues)

---

## 💡 Tips for Success

1. **Start Simple**: Use `docker-compose up -d` first
2. **Add AI Gradually**: Enable Ollama with `--profile ai-enabled`
3. **Monitor Resources**: Use `docker stats` to monitor usage
4. **Backup Data**: Take regular backups of Ollama models and user data
5. **Update Regularly**: Pull the latest images for security updates

---

## 💬 **Need Help?**

🎧 **Discord Community**: [discord.gg/ExQ8fCv9](https://discord.gg/ExQ8fCv9) - Get real-time help with Docker setup
📧 **Email Support**: [[email protected]](mailto:[email protected]) - Technical support and questions
🐛 **Report Issues**: [GitHub Issues](https://github.com/thefirehacker/TimeCapsule-SLM/issues) - Bug reports and feature requests

*Our community is here to help you get TimeCapsule-SLM running smoothly!*
