This guide covers the post-installation configuration steps needed to get Expanto fully operational with your data warehouse and AI services.
Legend:
- ⭐ REQUIRED — must be configured for the system to work
- 🔹 OPTIONAL — has sensible defaults, customize as needed
After running `make setup`, you'll have configuration templates in the `.streamlit/` directory that need to be customized for your environment:
- `.streamlit/secrets.toml` — sensitive credentials (DWH, API keys)
- `.streamlit/expanto.toml` — application settings
- `.streamlit/config.toml` — Streamlit UI configuration
Configure at least one data warehouse in `.streamlit/secrets.toml`:
```toml
[bigquery]
file_path = "path/to/your-service-account.json"
project_name = "your-project-id"
connection_type = "service_account"  # or "application_default"
```

Steps:
- Create a service account in Google Cloud Console
- Download the JSON key file
- Grant BigQuery permissions: `BigQuery Data Viewer`, `BigQuery Job User`
- Update `file_path` with the absolute path to the JSON key file
- Set `connection_type = "service_account"`
```toml
[snowflake]
account = "your-account.region.cloud"
user = "your-username"
password = "your-password"
warehouse = "your-warehouse"
database = "your-database"
schema = "your-schema"
```

Steps:
- Ensure your Snowflake user has access to the target database/schema
- Test connection with the provided credentials
- Verify the account identifier format (include region and cloud provider)
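A quick way to catch the most common mistake — a bare account locator without region and cloud provider — is a loose format check. The pattern below is an illustrative assumption based on the `"your-account.region.cloud"` template above, not an official Snowflake rule:

```python
# Sketch: loose format check for the Snowflake account identifier.
# The pattern is an assumption matching the "account.region.cloud" template.
import re

ACCOUNT_PATTERN = re.compile(
    r"^[\w-]+"    # account locator, e.g. xy12345
    r"\.[\w-]+"   # region, e.g. us-east-1
    r"\.[\w-]+$"  # cloud provider, e.g. aws, azure, gcp
)

def looks_like_full_account(account: str) -> bool:
    """True if the identifier includes region and cloud provider parts."""
    return bool(ACCOUNT_PATTERN.match(account))

print(looks_like_full_account("xy12345.us-east-1.aws"))  # True
print(looks_like_full_account("xy12345"))                # False: region/cloud missing
```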
Add your AI provider API key to `.streamlit/secrets.toml`:
```toml
[api_keys]
PROVIDER_API_KEY = "your-together-ai-api-key"  # ⭐ REQUIRED
TAVILY_API_KEY = "your-tavily-key"             # 🔹 OPTIONAL - for web search
LOGFIRE_TOKEN = "your-logfire-token"           # 🔹 OPTIONAL - for observability
```

Steps:
- ⭐ REQUIRED: Sign up at Together.ai and get an API key
- ⭐ REQUIRED: Add the key to `PROVIDER_API_KEY`
- 🔹 Optional: Get a Tavily API key for web search capabilities
- 🔹 Optional: Get Logfire token for observability
Customize AI models in `.streamlit/expanto.toml` (the default models work fine):
```toml
[assistant.models]
fast = "deepseek-ai/DeepSeek-V3"
tool_thinker = "Qwen/Qwen3-235B-A22B-Thinking-2507"
agentic = "moonshotai/Kimi-K2-Instruct"
```

Set which data warehouse to use for calculations in `.streamlit/expanto.toml`:
```toml
[precompute_db]
name = "snowflake"  # or "bigquery"
```

The SQLite configuration is pre-configured and works out of the box. Only customize it if needed:
```toml
[internal_db]
engine_str = "sqlite:///expanto.db"
async_engine_str = "sqlite+aiosqlite:///expanto.db"

[internal_db.connect_args]
pool_size = 5
max_overflow = 5
pool_timeout = 10
pool_recycle = 900
connect_args = { check_same_thread = false }
```
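The one setting worth understanding here is `check_same_thread = false`. By default, Python's `sqlite3` refuses to use a connection from any thread other than the one that created it; multi-threaded servers like Streamlit need that check relaxed. A minimal sketch of what the flag changes, using the standard `sqlite3` module directly:

```python
# Sketch: effect of check_same_thread=False at the sqlite3 level.
# With the default (True), using this connection from the worker thread
# below would raise sqlite3.ProgrammingError.
import sqlite3
import threading

conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE t (x INTEGER)")

def insert_from_worker():
    # Cross-thread use of the shared connection: only legal because
    # check_same_thread=False was passed at connect time.
    conn.execute("INSERT INTO t VALUES (1)")

worker = threading.Thread(target=insert_from_worker)
worker.start()
worker.join()
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1
```

Note that the flag only disables the safety check; concurrent access still needs external coordination, which is what the connection pool settings above provide.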