# .env.example (forked from FradSer/mcp-server-mas-sequential-thinking)
# --- LLM Configuration ---
# Select the LLM provider: "deepseek" (default), "groq", "openrouter", "github", or "ollama"
LLM_PROVIDER="deepseek"
# Provide the API key for the chosen provider:
# GROQ_API_KEY="your_groq_api_key"
DEEPSEEK_API_KEY="your_deepseek_api_key"
# OPENROUTER_API_KEY="your_openrouter_api_key"
# GITHUB_TOKEN="ghp_your_github_personal_access_token"
# Note: Ollama requires no API key but needs local installation
# Note: GitHub Models requires a GitHub Personal Access Token with appropriate scopes

# Optional: Base URL override (e.g., for custom or self-hosted endpoints).
# Leave commented out unless you actually need it — a placeholder value here
# would be sent as a real URL if this file is copied to .env unchanged.
# LLM_BASE_URL="your_base_url_if_needed"

# Optional: Specify different models for the Team Coordinator and Specialist Agents.
# Defaults are set within the code based on the provider if these are not set.

# Example for Groq:
# GROQ_TEAM_MODEL_ID="llama3-70b-8192"
# GROQ_AGENT_MODEL_ID="llama3-8b-8192"

# Example for DeepSeek:
# DEEPSEEK_TEAM_MODEL_ID="deepseek-chat"
# DEEPSEEK_AGENT_MODEL_ID="deepseek-coder"

# Example for GitHub Models:
# GITHUB_TEAM_MODEL_ID="openai/gpt-5"
# GITHUB_AGENT_MODEL_ID="openai/gpt-5-min"

# Example for OpenRouter:
# OPENROUTER_TEAM_MODEL_ID="anthropic/claude-3-haiku-20240307"
# OPENROUTER_AGENT_MODEL_ID="google/gemini-flash-1.5"

# Example for Ollama:
# OLLAMA_TEAM_MODEL_ID="devstral:24b"
# OLLAMA_AGENT_MODEL_ID="devstral:24b"

# --- External Tools ---
# Required ONLY if the Researcher agent is used and needs the Exa search API;
# leave commented out otherwise.
# EXA_API_KEY="your_exa_api_key"
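
# --- Usage ---
# A minimal sketch (assumes the server loads .env from its working directory,
# e.g. via python-dotenv; adjust for your setup):
#   cp .env.example .env
#   # then edit .env and replace the placeholder values with real keys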