Releases: anthonyfoust/ai-stack-homelab

v2.0.1 - housekeeping

31 Oct 03:20

Changes:

  • Flattened repository structure: moved all project files to repo root; removed nested ai-stack/ directory.
  • Ignored and removed workspace/editor files from version control: .DS_Store, *.code-workspace, .vscode/, .idea/, .claude/, .clauderc.
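The ignore rules above correspond to a .gitignore fragment along these lines (a sketch; the exact entries in the repo may differ):

```gitignore
# macOS and editor/workspace files kept out of version control
.DS_Store
*.code-workspace
.vscode/
.idea/
.claude/
.clauderc
```

Note that files already tracked before this release also had to be removed from the index once (e.g. `git rm -r --cached .vscode/`), since .gitignore alone does not untrack existing files.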

Upgrade notes:

  • git pull --tags
  • No runtime changes; only housekeeping and structure updates.

Major update - optimized scripts and service configurations

31 Oct 03:12

Highlights:

  • Overhauled Docker-based architecture for services.
  • Optimized configurations for traefik, mcp, redis, postgres, searxng, open-webui.
  • Revamped docker-compose.yml with clearer service definitions and env handling.
  • Expanded documentation: README.md, TESTING.md, ARCHITECTURE.md, CHANGELOG.md.
  • New setup/init scripts; legacy scripts removed.

Breaking changes:

  • Legacy scripts and docs removed or replaced.
  • Updated directory structure under configs/ and docs/.

Upgrade notes:

  • git pull --tags
  • docker compose pull && docker compose up -d --remove-orphans
  • Back up Postgres/Redis volumes before upgrading if used.
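The backup step above can be sketched as follows. The service names `postgres` and `redis`, the `postgres` superuser, and the Redis data path are assumptions, not confirmed by the release; with `DRY_RUN=1` (the default) the script only prints the commands it would run.

```shell
#!/bin/sh
# Hypothetical pre-upgrade backup sketch -- service names are assumptions.
set -eu

BACKUP_DIR="${BACKUP_DIR:-./backups}"
STAMP="$(date +%Y%m%d-%H%M%S)"
DRY_RUN="${DRY_RUN:-1}"   # set DRY_RUN=0 to actually execute

# Print the command in dry-run mode; execute it otherwise.
run() {
  if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi
}

run mkdir -p "$BACKUP_DIR"

# Dump all Postgres databases to a timestamped file.
PG_DUMP_FILE="$BACKUP_DIR/postgres-$STAMP.sql"
run sh -c "docker compose exec -T postgres pg_dumpall -U postgres > $PG_DUMP_FILE"

# Snapshot Redis by forcing a save, then copying the RDB file out.
RDB_FILE="$BACKUP_DIR/redis-$STAMP.rdb"
run docker compose exec -T redis redis-cli SAVE
run sh -c "docker compose cp redis:/data/dump.rdb $RDB_FILE"
```

Review the printed commands, then rerun with `DRY_RUN=0` to perform the backup.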

Note:

  • Follow-up housekeeping in v2.0.1:
    • Flattened repository structure to the repo root (removed nested ai-stack/).
    • Ignored and removed workspace/editor files from version control (.DS_Store, *.code-workspace, .vscode/, .idea/, .claude/, .clauderc).
    • See v2.0.1 release for details.

AI Stack v1.0.0-beta

31 Jul 13:55
95b9773

Pre-release
🤖 AI Stack v1.0.0-beta

⚠️ This is a pre-release version, currently being tested on Mac Mini M4 hardware.

🔬 Testing Status

  • Initial setup script verification
  • All services start correctly
  • Llama 3.2 model download and inference
  • Backup/restore functionality
  • Resource optimization validation
  • Production security testing

🤝 Want to Help Test?

If you have a Mac Mini M4 and want to help validate this configuration:

  1. Try the setup process
  2. Report any issues via GitHub Issues
  3. Share your hardware specs and experience

Stable v1.0.0 release coming soon!

✨ What's Included

  • n8n Workflow Automation - Visual workflow builder with AI integration
  • Ollama with Llama 3.2 - Local AI models (1B and 3B parameters)
  • Open WebUI - ChatGPT-like interface for local AI
  • LiteLLM Proxy - Unified API for multiple AI providers
  • MCP Integration - Model Context Protocol support
  • PostgreSQL & Redis - Persistent storage and caching
  • Automated Backups - Encrypted backup system with retention
  • Production Security - Network isolation, authentication, encryption

🖥️ Optimized For

  • Mac Mini M4 (2024) with 16GB+ RAM
  • Llama 3.2 Models (1B and 3B parameters)
  • Family-Safe Usage with appropriate content filters
  • Personal Production environments

🚀 Quick Start

  1. Download and extract this release
  2. Copy .env.example to .env and configure passwords
  3. Run ./scripts/setup.sh
  4. Run ./scripts/start.sh
  5. Access your AI at http://localhost:8080

🔧 System Requirements

  • Mac Mini M4 (2024) or similar Apple Silicon Mac
  • macOS 14.0+ (Sonoma)
  • Docker Desktop 4.25+
  • 16GB+ RAM (32GB recommended)
  • 100GB+ free disk space

🛡️ Security Features

  • Encrypted backups with AES-256
  • Network segmentation (frontend/backend/ai networks)
  • No privileged containers
  • Password protection on all services
  • Family-safe mode enabled by default
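The network segmentation above typically looks like the following compose fragment. The service and network wiring shown here is an illustrative assumption, not copied from the repo:

```yaml
# Sketch: three isolated networks; only the reverse proxy joins "frontend".
networks:
  frontend:
  backend:
    internal: true   # no outbound internet access
  ai:
    internal: true

services:
  traefik:
    networks: [frontend, backend]
  open-webui:
    networks: [backend, ai]
  ollama:
    networks: [ai]   # reachable only from services on the "ai" network
```

Marking a network `internal` prevents containers on it from reaching the outside world, so only the proxy-facing network carries external traffic.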

🔄 What's New in v1.0.0-beta

  • Initial release with complete AI stack
  • Mac Mini M4 optimized resource allocation
  • Automated setup and management scripts
  • Production-ready security configuration
  • Comprehensive backup and restore system
  • Full documentation suite

⚠️ Important Notes

  • Update all passwords in .env before first use
  • Backup your .env file securely after configuration
  • Default models will download ~4GB during setup
  • Services start automatically in dependency order
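Dependency-ordered startup like this is usually expressed with `depends_on` plus health checks; a rough sketch (service names and images assumed, not taken from the repo):

```yaml
services:
  postgres:
    image: postgres:16
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      retries: 10
  open-webui:
    depends_on:
      postgres:
        condition: service_healthy   # wait until Postgres passes its health check
```

With `condition: service_healthy`, dependent services are held back until the database reports ready, rather than merely started.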

💬 Support

  • Check the troubleshooting guide
  • Review logs with docker compose logs [service_name]
  • For issues, please open a GitHub issue

🎉 Happy AI automation!