Releases: anthonyfoust/ai-stack-homelab
v2.0.1 - housekeeping
Changes:
- Flattened repository structure: moved all project files to the repo root; removed the nested ai-stack/ directory.
- Ignored and removed workspace/editor files from version control: .DS_Store, *.code-workspace, .vscode/, .idea/, .claude/, .clauderc.
Upgrade notes:
- git pull --tags
- No runtime changes; only housekeeping and structure updates.
major update - optimized scripts and service configurations
Highlights:
- Overhauled Docker-based architecture for services.
- Optimized configurations for traefik, mcp, redis, postgres, searxng, open-webui.
- Revamped docker-compose.yml with clearer service definitions and env handling.
- Expanded documentation: README.md, TESTING.md, ARCHITECTURE.md, CHANGELOG.md.
- New setup/init scripts; legacy scripts removed.
Breaking changes:
- Legacy scripts and docs removed or replaced.
- Updated directory structure under configs/ and docs/.
Upgrade notes:
- git pull --tags
- docker compose pull && docker compose up -d --remove-orphans
- Back up Postgres/Redis volumes before upgrading if used.
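The volume backup step above can be sketched as follows. The volume names (`ai-stack_postgres_data`, `ai-stack_redis_data`) are assumptions; check yours with `docker volume ls` first:

```shell
#!/bin/sh
# Back up named Docker volumes to timestamped tarballs before upgrading.
# Volume names below are assumptions; list yours with `docker volume ls`.
STAMP=$(date +%Y%m%d-%H%M%S)
for VOL in ai-stack_postgres_data ai-stack_redis_data; do
  # Mount the volume read-only in a throwaway container and tar its contents
  # into the current directory on the host.
  docker run --rm \
    -v "${VOL}:/data:ro" \
    -v "$(pwd):/backup" \
    alpine tar czf "/backup/${VOL}-${STAMP}.tar.gz" -C /data .
done
```

Stop the stack (`docker compose down`) before backing up so the databases are not written to mid-archive.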
Note:
- Follow-up housekeeping in v2.0.1:
- Flattened repository structure to the repo root (removed nested ai-stack/).
- Ignored and removed workspace/editor files from version control (.DS_Store, *.code-workspace, .vscode/, .idea/, .claude/, .clauderc).
- See v2.0.1 release for details.
AI Stack v1.0.0-beta
🔬 Testing Status
- Initial setup script verification
- All services start correctly
- Llama 3.2 model download and inference
- Backup/restore functionality
- Resource optimization validation
- Production security testing
🤝 Want to Help Test?
If you have a Mac Mini M4 and want to help validate this configuration:
- Try the setup process
- Report any issues via GitHub Issues
- Share your hardware specs and experience
Stable v1.0.0 release coming soon!
✨ What's Included
- n8n Workflow Automation - Visual workflow builder with AI integration
- Ollama with Llama 3.2 - Local AI models (1B and 3B parameters)
- Open WebUI - ChatGPT-like interface for local AI
- LiteLLM Proxy - Unified API for multiple AI providers
- MCP Integration - Model Context Protocol support
- PostgreSQL & Redis - Persistent storage and caching
- Automated Backups - Encrypted backup system with retention
- Production Security - Network isolation, authentication, encryption
🖥️ Optimized For
- Mac Mini M4 (2024) with 16GB+ RAM
- Llama 3.2 Models (1B and 3B parameters)
- Family-Safe Usage with appropriate content filters
- Personal Production environments
🚀 Quick Start
- Download and extract this release
- Copy .env.example to .env and configure passwords
- Run ./scripts/setup.sh
- Run ./scripts/start.sh
- Access your AI at http://localhost:8080
🔧 System Requirements
- Mac Mini M4 (2024) or similar Apple Silicon Mac
- macOS 14.0+ (Sonoma)
- Docker Desktop 4.25+
- 16GB+ RAM (32GB recommended)
- 100GB+ free disk space
📚 Documentation
🛡️ Security Features
- Encrypted backups with AES-256
- Network segmentation (frontend/backend/ai networks)
- No privileged containers
- Password protection on all services
- Family-safe mode enabled by default
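The AES-256 backup encryption mentioned above can be illustrated with `openssl enc`; this is a minimal sketch, not the release's actual backup script, and the filenames and passphrase are examples:

```shell
#!/bin/sh
# Sketch: AES-256 encryption/decryption of a backup archive with OpenSSL.
# The actual backup script in this release may use a different tool or mode.
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in backup.tar.gz -out backup.tar.gz.enc -pass pass:example-passphrase

# Decrypt during a restore:
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in backup.tar.gz.enc -out restored.tar.gz -pass pass:example-passphrase
```

`-pbkdf2` derives the key with a proper KDF instead of OpenSSL's weak legacy derivation; in practice, read the passphrase from a file or prompt rather than the command line.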
🔄 What's New in v1.0.0
- Initial release with complete AI stack
- Mac Mini M4 optimized resource allocation
- Automated setup and management scripts
- Production-ready security configuration
- Comprehensive backup and restore system
- Full documentation suite
⚠️ Important Notes
- Update all passwords in .env before first use
- Back up your .env file securely after configuration
- Default models will download ~4GB during setup
- Services start automatically in dependency order
💬 Support
- Check the troubleshooting guide
- Review logs with docker compose logs [service_name]
- For issues, please open a GitHub issue
🎉 Happy AI automation!