2.3.63 Satellite AstrBot
Handle: `astrbot`
URL: http://localhost:34661

AstrBot is an open-source, all-in-one agentic chatbot platform and development framework. It enables you to deploy and develop LLM-powered chatbots across multiple messaging platforms with a comprehensive web-based configuration interface.
Key Features:
- Multi-LLM Support: Integrates with OpenAI, Anthropic, Google Gemini, DeepSeek, Ollama, LM Studio, vLLM, and 10+ other providers
- Multi-Platform: Connect to QQ, Telegram, Discord, Lark, DingTalk, WeChat, WeCom, KOOK, and more
- Agent Capabilities: Multi-turn tool calling, sandboxed code interpreter, web search, and custom tool integration
- Plugin System: Deep plugin mechanism with a thriving community ecosystem
- Knowledge Base: Native knowledge base with RAG support for document-based Q&A
- WebUI: Feature-rich visual configuration and management interface
- LLMOps Integration: Connect to Dify, Coze, and Alibaba Cloud Bailian platforms
- Speech Services: TTS (Text-to-Speech) and STT (Speech-to-Text) support
- Rate Limiting: Built-in rate limiting and whitelisting for production use
```bash
# Pull the image
harbor pull astrbot

# Start AstrBot
harbor up astrbot --open
```
- Access the WebUI at http://localhost:34661
- Default credentials: `astrbot` / `astrbot`
- You will be prompted to change the password on first login
- You can connect AstrBot to LLM/Embedding backends in Harbor
- The most useful part of the service is creating functional chatbots on Discord, Slack, Telegram, etc., with your local LLM and AstrBot as the orchestrator
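Harbor's CLI can also resolve and open the WebUI for you. A minimal sketch, assuming the standard `harbor url` and `harbor open` helpers:

```bash
# Print the resolved WebUI URL (useful if you changed the port)
harbor url astrbot

# Open the WebUI in the default browser
harbor open astrbot
```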
The following options can be set via `harbor config`:
```
# Main WebUI port
HARBOR_ASTRBOT_HOST_PORT      34661

# Container image
HARBOR_ASTRBOT_IMAGE          soulter/astrbot
HARBOR_ASTRBOT_VERSION        latest
```
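These can be changed from the CLI with `harbor config set`. A minimal sketch, assuming Harbor's usual mapping of `HARBOR_ASTRBOT_HOST_PORT` to the dotted field `astrbot.host.port`:

```bash
# Move the WebUI to a different host port
# (field name is an assumption based on Harbor's env-var -> dotted-field convention)
harbor config set astrbot.host.port 34700

# Confirm the change, then (re)start the service
harbor config get astrbot.host.port
harbor up astrbot --open
```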
All configuration is done through the WebUI after starting the service:
- First Login: Navigate to http://localhost:34661 and log in with `astrbot` / `astrbot`
- Change Password: Go to Settings → Security to update credentials
- Configure LLM Provider: Navigate to "Service Provider Management" to add your LLM backends
AstrBot persists all configuration and data in:
- `astrbot/data/` - Database, configuration, plugins, and uploaded files
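Because everything lives in that single directory, backing it up is straightforward. A hedged sketch, assuming the `harbor home` helper resolves your Harbor workspace and the `astrbot/data` layout shown above:

```bash
# Stop services so the database isn't written mid-copy, archive the
# data directory, then bring AstrBot back up (paths are assumptions)
harbor down
tar -czf astrbot-data-backup.tar.gz -C "$(harbor home)/astrbot" data
harbor up astrbot
```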
AstrBot works seamlessly with Harbor's inference backends. When running alongside Ollama, llama.cpp, or vLLM, you can configure them directly in the WebUI.
When running both AstrBot and Ollama:
```bash
harbor up astrbot ollama --open
```
Configuration in AstrBot WebUI:
- Navigate to Service Provider Management
- Click Add Provider → Select Ollama
- Configure:
  - API Base URL: `http://ollama:11434/v1`
  - API Key: `ollama` (or leave default)
- Click Save
- Pull models using Harbor: `harbor ollama pull llama3.2:3b`
- Select the model in AstrBot's chat interface
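To confirm the endpoint AstrBot will call is actually serving models, here is a quick hedged check from the host. It uses Harbor's `harbor url` helper to resolve the published port; inside the compose network AstrBot talks to `http://ollama:11434/v1` directly:

```bash
# List models through Ollama's OpenAI-compatible API
curl "$(harbor url ollama)/v1/models"

# If the list is empty, pull a model first
harbor ollama pull llama3.2:3b
```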
When running both AstrBot and llama.cpp:
```bash
harbor up astrbot llamacpp --open
```
Configuration in AstrBot WebUI:
- Navigate to Service Provider Management
- Click Add Provider → Select OpenAI (llama.cpp is OpenAI-compatible)
- Configure:
  - Provider Name: `llama.cpp` (custom name)
  - API Base URL: `http://llamacpp:8080/v1`
  - API Key: Any value (llama.cpp doesn't require authentication)
- Click Save
- Load a model in llama.cpp: `harbor llamacpp load <model-file>`
- Select the provider in AstrBot's chat interface
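llama.cpp's server exposes a health endpoint, which is handy for confirming a model is loaded before pointing AstrBot at it. A hedged check from the host:

```bash
# Reports OK once the server is up and a model is loaded
curl "$(harbor url llamacpp)/health"

# The OpenAI-compatible model listing also works
curl "$(harbor url llamacpp)/v1/models"
```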
When running both AstrBot and vLLM:
```bash
harbor up astrbot vllm --open
```
Configuration in AstrBot WebUI:
- Navigate to Service Provider Management
- Click Add Provider → Select OpenAI (vLLM is OpenAI-compatible)
- Configure:
  - Provider Name: `vLLM` (custom name)
  - API Base URL: `http://vllm:8000/v1`
  - API Key: Any value (vLLM doesn't require authentication)
- Click Save
- vLLM should already have a model loaded
- Select the provider in AstrBot's chat interface
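A hedged end-to-end smoke test of the same OpenAI-compatible endpoint AstrBot will use (the model id must match whatever vLLM was started with):

```bash
# See which model vLLM is serving
curl "$(harbor url vllm)/v1/models"

# Send a one-shot chat completion using that model id
curl "$(harbor url vllm)/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{"model": "<model-id-from-above>", "messages": [{"role": "user", "content": "Hello"}]}'
```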
- Configure LLM Provider (see Backend Integration above)
- Configure Messaging Platform:
  - Navigate to Platform Configuration
  - Select your messaging platform (Telegram, Discord, etc.)
  - Follow platform-specific setup instructions
  - Enable the platform
- Optional: Configure Persona:
  - Navigate to Persona Management
  - Create custom personas with system prompts
  - Assign personas to different platforms or conversations
- Optional: Enable Plugins:
  - Navigate to Plugin Market
  - Browse and install community plugins
  - Configure plugin settings as needed
AstrBot includes a built-in knowledge base system:
- Navigate to Knowledge Base in the WebUI
- Create a new knowledge base
- Upload documents (PDF, TXT, Markdown, etc.)
- Configure embedding provider (OpenAI, local embeddings, etc.)
- Enable knowledge base for specific conversations or globally
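If the embeddings should stay local as well, Ollama can serve an embedding model for the knowledge base. A hedged sketch; the model choice and the internal URL are assumptions based on the Ollama setup above:

```bash
# Pull a small embedding model into Harbor's Ollama
harbor ollama pull nomic-embed-text

# Then, in AstrBot's embedding provider settings, point the API Base URL
# at http://ollama:11434/v1 and select nomic-embed-text as the model
```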
Configure multiple messaging platforms simultaneously:
- Telegram: Create bot via @BotFather, add token in AstrBot
- Discord: Create application, add bot token
- QQ: Use NapCat or other QQ protocol adapters
- Lark/Feishu: Configure app credentials
- Custom Webhook: Use built-in webhook server for custom integrations
Problem: WebUI not loading at http://localhost:34661
Solutions:
```bash
# Check if service is running
harbor ps astrbot

# View logs
harbor logs astrbot

# Restart the service
harbor restart astrbot
```
Problem: Cannot connect to Ollama/llama.cpp/vLLM
Solutions:
- Verify both services are running: `harbor ps`
- Check network connectivity: Use correct internal URLs (e.g., `http://ollama:11434/v1`)
- For Ollama, ensure model is pulled: `harbor ollama pull <model>`
- Check provider configuration in WebUI matches Harbor's service endpoints
Problem: Plugin download or installation errors
Solutions:
- Check internet connectivity from container
- Verify plugin compatibility with current AstrBot version
- Review plugin logs in Plugin Management → Plugin Details
- Try manual plugin installation by cloning into `astrbot/data/plugins/` (see the sketch below)
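A hedged sketch of that manual install; the plugin repository URL is a placeholder, and the path assumes Harbor's default workspace layout:

```bash
# Clone the plugin into AstrBot's plugin directory, then restart
cd "$(harbor home)/astrbot/data/plugins"
git clone https://github.com/<author>/<plugin-repo>.git
harbor restart astrbot
```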
Problem: High memory usage or slow responses
Solutions:
- Reduce concurrent conversation limits in Settings
- Enable conversation history pruning
- Disable unused plugins
- Use lighter LLM models
- Configure response streaming for better UX
- Official Documentation
- GitHub Repository
- Community Forum
- Plugin Development Guide
- Ollama Configuration Guide
- Current Version: 4.10.6 (as of Jan 2026)
- Docker Image: `soulter/astrbot:latest`
- License: AGPL-v3
- Requires: Python 3.11+, Node.js (included in Docker image)
For more information about Harbor's CLI capabilities, see the CLI Reference.