A Slack bot that integrates multiple AI models (OpenAI GPT-5.2, Google Gemini 3 Flash Preview, X.AI Grok 3, ByteDance Doubao Seed 1.8) to provide multi-perspective responses in a single thread. Built with Python and the Slack Bolt framework.
### Multi-Model Integration 🔌
- Support for OpenAI (GPT-5.2), Google Gemini (3 Flash Preview), X.AI (Grok 3), and ByteDance (Doubao Seed 1.8)
- Adapter pattern for easy extensibility (Claude, etc.)
- Unified interface for all AI models
### Dynamic Identity Override 🎭
- Single Slack bot token with multiple AI personas
- Custom username and emoji for each model
- Requires the `chat:write.customize` permission
### Context Isolation & Filtering 🔒
- In Compare mode, each AI model only sees:
  - User messages
  - Its own previous responses
- Other AI models' responses are filtered out
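A minimal sketch of this filtering rule (the real logic lives in context_filter.py; the `model` field on each message dict is an assumption for illustration):

```python
from typing import Dict, List


def filter_messages_for_model(
    messages: List[Dict[str, str]], model_key: str
) -> List[Dict[str, str]]:
    """Keep only user messages and this model's own previous replies."""
    kept = []
    for msg in messages:
        if msg["role"] == "user":
            kept.append(msg)
        elif msg.get("model") == model_key:  # this model's own history
            kept.append(msg)
        # replies from other models are dropped
    return kept
```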
### Async Concurrent Responses ⚡
- Multiple AI models respond simultaneously
- Non-blocking execution for fast responses
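In Python this concurrency boils down to `asyncio.gather`; a sketch with a stand-in coroutine, not the bot's actual call:

```python
import asyncio


async def ask_model(name: str, question: str) -> str:
    await asyncio.sleep(0.01)  # stand-in for a real API call
    return f"{name}: answer to {question!r}"


async def ask_all(models: list[str], question: str) -> list[str]:
    # gather() starts every call at once, so total latency is roughly
    # the slowest model rather than the sum of all of them
    return await asyncio.gather(*(ask_model(m, question) for m in models))
```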
### Operation Modes 🎮
- Compare Mode (default): All models respond concurrently to user questions
- Debate Mode: Models respond sequentially, seeing each other's responses
### Interactive Follow-up Questions 💬
- Each AI response includes a follow-up button
- Click to open a modal and ask follow-up questions to a specific model
- Follow-up questions are posted to the thread for visibility
- Only the selected model responds to follow-up questions
```
slack-ai-council/
├── app.py              # Main Slack bot application
├── llm_manager.py      # LLM adapter pattern implementation
├── context_filter.py   # Message filtering for context isolation
├── mode_manager.py     # Compare vs Debate mode management
├── pyproject.toml      # Project metadata and dependencies (uv)
├── uv.lock             # Locked dependencies (uv)
├── .env.example        # Environment variables template
└── README.md           # This file
```
- Adapter Pattern: `llm_manager.py` provides a unified interface for different AI APIs
- Strategy Pattern: `mode_manager.py` handles different execution strategies (Compare vs Debate)
- Filter Pattern: `context_filter.py` filters messages based on model context needs
- Python 3.10 or higher
- Slack workspace with admin access
- API keys for desired AI models:
- OpenAI API key
- Google API key (for Gemini)
- X.AI API key (for Grok)
- ByteDance API key (for Doubao)
```bash
git clone https://github.com/coolestowl/slack-ai-council.git
cd slack-ai-council
```

This project uses uv for fast, reliable dependency management.

```bash
# Install uv if you haven't already
pip install uv

# Install project dependencies
uv sync
```

Alternatively, you can still use pip with the legacy requirements.txt:

```bash
pip install -r requirements.txt
```

- Go to Slack API Apps
- Click "Create New App" → "From scratch"
- Name it "AI Council" and select your workspace
In OAuth & Permissions, add these Bot Token Scopes:
- `app_mentions:read` - Read mentions
- `channels:history` - Read channel messages
- `chat:write` - Send messages
- `chat:write.customize` - Customize message appearance (username/icon)
- `im:history` - Read DM history
- `im:write` - Send DMs
- Go to Socket Mode and enable it
- Generate an app-level token with the `connections:write` scope
- Save the token (starts with `xapp-`)
- Go to Event Subscriptions and enable events
- Subscribe to bot events: `app_mention` and `message.im`
- Go to Install App
- Install to your workspace
- Copy the Bot User OAuth Token (starts with `xoxb-`)
```bash
cp .env.example .env
```

Edit .env and add your tokens and API keys:

```bash
# Slack Configuration
SLACK_BOT_TOKEN=xoxb-your-bot-token-here
SLACK_APP_TOKEN=xapp-your-app-token-here

# AI Model API Keys
OPENAI_API_KEY=sk-your-openai-key
GOOGLE_API_KEY=your-google-api-key
XAI_API_KEY=your-xai-key
DOUBAO_API_KEY=your-doubao-api-key

# Optional Configuration
DEFAULT_MODE=compare  # compare or debate
```

Note: You need at least one AI API key for the bot to work. Missing API keys will be skipped with a warning.
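The graceful-skip behaviour can be pictured like this (the function name and key mapping are illustrative, not the bot's actual code):

```python
import os


def available_models(required_keys: dict[str, str]) -> list[str]:
    """Return model names whose API key env var is set; warn on the rest."""
    active = []
    for model, env_var in required_keys.items():
        if os.getenv(env_var):
            active.append(model)
        else:
            print(f"Warning: {env_var} not set; skipping {model}")
    return active
```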
Using uv (recommended):

```bash
uv run python app.py
```

Or directly with Python (if dependencies are already installed):

```bash
python app.py
```

You should see:

```
✓ Initialized openai adapter
✓ Initialized gemini adapter
✓ Initialized grok adapter
✓ Initialized doubao adapter
==================================================
Slack AI Council Bot Starting
==================================================
Configured AI Models: openai, gemini, grok, doubao
Default Mode: COMPARE
Mode Description: Compare Mode: All AI models respond concurrently
==================================================
```
```
@AI Council What's the best approach to learn machine learning?
```
All configured AI models will respond in a thread with their perspectives using Compare mode (concurrent responses, default).
Specify debate mode for sequential AI responses:

```
@AI Council mode=debate What is the future of artificial intelligence?
```

Or explicitly use compare mode:

```
@AI Council mode=compare Explain quantum computing
```
Modes:
- Compare mode (default): All AI models respond concurrently, each seeing only user messages and their own history
- Debate mode: AI models respond sequentially, seeing all previous responses
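The difference between the two modes is essentially gather-vs-loop. A sketch of the sequential debate flow (stand-in coroutine, not the bot's real handler):

```python
import asyncio


async def ask_model(name: str, history: list[str]) -> str:
    await asyncio.sleep(0.01)  # stand-in for a real API call
    return f"{name} (saw {len(history)} earlier replies)"


async def debate_round(models: list[str], question: str) -> list[str]:
    # Sequential: each model's prompt includes every earlier answer,
    # so later models can rebut or build on the earlier ones
    replies: list[str] = []
    for model in models:
        replies.append(await ask_model(model, replies))
    return replies
```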
Each AI model's response includes a "追问" (Follow-up) button. Click the button to:
- Open a modal dialog specific to that model
- Type your follow-up question
- Submit to get a response from only that model
- The follow-up question and response appear in the thread
This allows you to have focused conversations with individual models without triggering all models.
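The modal itself is just a Slack Block Kit view. A hedged sketch of what the bot might build when the button is clicked (the `callback_id`, `block_id`, and `action_id` names here are assumptions, not the app's actual identifiers):

```python
def build_followup_modal(model_key: str, thread_ts: str) -> dict:
    """Build a modal view payload for a follow-up question to one model."""
    return {
        "type": "modal",
        "callback_id": f"followup_{model_key}",
        "private_metadata": thread_ts,  # lets the submit handler find the thread
        "title": {"type": "plain_text", "text": "Follow-up"},
        "submit": {"type": "plain_text", "text": "Ask"},
        "blocks": [
            {
                "type": "input",
                "block_id": "question",
                "label": {"type": "plain_text", "text": "Your question"},
                "element": {"type": "plain_text_input", "action_id": "text"},
            }
        ],
    }
```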
Adding a new AI model is simple - just create a new adapter class in llm_manager.py:

```python
class ClaudeAdapter(LLMAdapter):
    adapter_key = "claude"  # Unique identifier for this adapter

    def __init__(self):
        super().__init__(
            model_name="claude-3-5-sonnet-20241022",
            username="Claude-3.5",
            icon_emoji=":brain:"
        )
        self.api_key = os.getenv("ANTHROPIC_API_KEY")
        if not self.api_key:
            raise ValueError("ANTHROPIC_API_KEY not found in environment variables")

    async def generate_response(self, messages: List[Dict[str, str]]) -> str:
        # Implementation here
        pass
```

That's it! The adapter will be automatically discovered and registered. No need to modify other files.
Edit the `create_default_system_prompt()` function in `context_filter.py` to customize how each model behaves.
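For instance, a customized prompt might look like this (shape only; match the real function's signature in context_filter.py):

```python
def create_default_system_prompt(model_name: str) -> str:
    # Hypothetical signature; adapt to the actual one in context_filter.py
    return (
        f"You are {model_name}, one voice on an AI council in Slack. "
        "Answer concisely and flag any points where you disagree."
    )
```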
- app.py: Main entry point, Slack event handlers
- llm_manager.py: AI model adapters and management
- context_filter.py: Message filtering logic
- mode_manager.py: Operation mode management
- `handle_app_mention()`: Processes @mentions in channels
- `handle_message()`: Processes direct messages
- `handle_compare_mode()`: Concurrent AI responses
- `handle_debate_mode()`: Sequential AI responses
- `filter_messages_for_model()`: Context isolation per model
- The bot uses Socket Mode, so it doesn't require a public URL
- Each AI model only sees user messages and its own history (in Compare mode)
- In Debate mode, models see all previous messages including other AI responses
- Missing API keys are handled gracefully - those models are skipped
Check that at least one API key is correctly set in your .env file.
- Verify Socket Mode is enabled
- Check that Event Subscriptions are configured
- Ensure the bot is invited to the channel: `/invite @AI Council`
- Check the console for error messages
Ensure your Slack app has the `chat:write.customize` permission.
See LICENSE file for details.
Contributions welcome! Feel free to:
- Add support for more AI models
- Improve context filtering algorithms
- Enhance debate mode functionality
- Add unit tests
- Advanced debate mode with rounds
- Vote/rating system for best responses
- Per-channel mode configuration
- Response caching
- Streaming responses
- Web dashboard for analytics