docs/api-reference/llm/config-manager.md (17 additions, 18 deletions)
@@ -2,6 +2,8 @@
The `ConfigurationManager` handles loading, validation, and management of LLM provider configurations from various sources including environment variables, configuration files, and runtime settings.

> **Note (Nov 2025):** The core SDK now defaults to environment-driven configuration. Use the `spoon-cli` configuration manager (or set environment variables manually) to sync `config.json` values before instantiating `ConfigurationManager()`.

## Class Definition
@@ -57,8 +59,13 @@ Load configuration from a JSON or TOML file.
**Example:**

```python
import os
from spoon_ai.llm import ConfigurationManager

# Populate required environment variables before instantiating the manager
# (illustrative variable name following the *_API_KEY convention; the value
# below is a placeholder, not a real credential)
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")

config_manager = ConfigurationManager()
```
@@ -78,6 +85,13 @@ Load configuration from environment variables.
```python
config = config_manager.load_from_env()
```

Environment overrides also control provider priority:

- `DEFAULT_LLM_PROVIDER` selects the preferred provider (e.g. `anthropic`).
- `LLM_FALLBACK_CHAIN` lists comma-separated providers for cascading retries (e.g. `anthropic,openai,gemini`).

When using `spoon-cli`, these variables are exported automatically after `config.json` loads. If you instantiate the SDK directly, set them yourself before calling `ConfigurationManager()`.
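The priority rules above can be sketched with plain `os.environ` reads. This is a minimal illustration, not SDK code: `resolve_provider_order` is a hypothetical helper, and the values mirror what `spoon-cli` would export after loading `config.json`.

```python
import os

# Example values, as spoon-cli would export them from config.json
os.environ["DEFAULT_LLM_PROVIDER"] = "anthropic"
os.environ["LLM_FALLBACK_CHAIN"] = "anthropic,openai,gemini"

def resolve_provider_order() -> list[str]:
    """Preferred provider first, then fallbacks, deduplicated in order."""
    primary = os.environ.get("DEFAULT_LLM_PROVIDER", "openai")
    chain = os.environ.get("LLM_FALLBACK_CHAIN", "")
    order = [primary] + [p.strip() for p in chain.split(",") if p.strip()]
    return list(dict.fromkeys(order))  # drop duplicates, keep first occurrence

print(resolve_provider_order())  # ['anthropic', 'openai', 'gemini']
```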
docs/api-reference/llm/index.md (27 additions, 12 deletions)
@@ -6,6 +6,8 @@ The LLM (Large Language Model) system in SpoonOS provides a unified, provider-agnostic
SpoonOS's LLM system offers:

> **Note (Nov 2025):** The core Python SDK reads provider settings from environment variables. The `spoon-cli` toolchain loads `config.json` and exports those values into the environment automatically. When using the SDK directly, set the relevant `*_API_KEY`, `*_BASE_URL`, and related environment variables before creating `ConfigurationManager()`.

- **Provider Agnosticism**: Unified API across all providers
- **Automatic Fallback**: Intelligent provider switching on failures
- **Load Balancing**: Distribute requests across multiple providers
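The automatic-fallback behavior listed above can be sketched as a cascade over the provider chain. This is an illustrative stand-in, not SDK code: `call_with_fallback` and `fake_request` are hypothetical names.

```python
def call_with_fallback(providers, make_request):
    """Try providers in order; return the first success, re-raise the last failure."""
    last_error = None
    for provider in providers:
        try:
            return make_request(provider)
        except RuntimeError as err:
            last_error = err  # remember the failure and move on to the next provider
    raise last_error

# Simulated transport: the primary provider is rate limited, the fallback succeeds
def fake_request(provider):
    if provider == "anthropic":
        raise RuntimeError("rate limited")
    return f"response from {provider}"

print(call_with_fallback(["anthropic", "openai"], fake_request))  # response from openai
```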
@@ -59,10 +61,14 @@ Handles configuration loading, validation, and management from multiple sources.
After setting the variables, simply instantiate `ConfigurationManager()` as usual; no code changes are needed. The `spoon-cli` configuration workflow writes these variables for you whenever it loads `config.json`.

### Streaming Responses
```python
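# The original streaming example is truncated in this diff. As a minimal,
# hypothetical stand-in (fake_stream is not an SDK function), an iterator-style
# stream of text chunks is consumed like so:
from typing import Iterator

def fake_stream() -> Iterator[str]:
    """Stand-in for a provider's streamed response chunks."""
    yield from ["Hello", ", ", "world", "!"]

# Accumulate chunks as they arrive instead of waiting for the full response
parts = []
for chunk in fake_stream():
    parts.append(chunk)

print("".join(parts))  # Hello, world!
```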
@@ -240,7 +255,7 @@ LLM_RETRY_ATTEMPTS=3
```python
from spoon_ai.llm import ConfigurationManager

config_manager = ConfigurationManager()  # uses environment variables by default

# Configure providers
config_manager.set_provider_config("openai", {
    # provider-specific fields (truncated in this diff)
})
```
@@ -515,7 +530,7 @@ llm_manager.set_primary_provider("gemini")  # Generally faster

```python
from spoon_ai.llm import ConfigurationManager

config_manager = ConfigurationManager()  # refreshes from environment variables
```