This repository was archived by the owner on Mar 19, 2026. It is now read-only.
docs/guides/llms.mdx (48 additions, 54 deletions)
@@ -19,38 +19,72 @@ Every ControlFlow agent can be assigned a specific LLM. When instantiating an ag
 ControlFlow agents can use any LangChain LLM class that supports chat-based APIs and tool calling. For a complete list of available models, settings, and instructions, please see LangChain's [LLM provider documentation](https://python.langchain.com/docs/integrations/chat/).
 
 <Tip>
-ControlFlow includes OpenAI and Azure OpenAI models by default. To use other models, you'll need to first install the corresponding LangChain package and supply any required credentials. See the model's [documentation](https://python.langchain.com/docs/integrations/chat/) for more information.
+ControlFlow includes the required packages for OpenAI, Azure OpenAI, and Anthropic models by default. To use other models, you'll need to first install the corresponding LangChain package and supply any required credentials. See the model's [documentation](https://python.langchain.com/docs/integrations/chat/) for more information.
 </Tip>
 
 
+### Automatic configuration
+
+ControlFlow can automatically load LLMs from certain providers, based on a parameter. The model parameter must have the form `{provider key}/{model name}`.
@@ ... @@
+Note that loading a model from a string is convenient, but does not allow you to configure all of the model's parameters. For full control, see the docs on [manual configuration](#manual-configuration).
+
+At this time, supported providers for automatic configuration include:
@@ ... @@
+If the required dependencies are not installed, ControlFlow will be unable to load the model and will raise an error.
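The `{provider key}/{model name}` format added above can be illustrated with a short sketch; `split_model_string` is a hypothetical helper for illustration only, not part of ControlFlow's API, and ControlFlow's real parsing may differ:

```python
def split_model_string(model: str) -> tuple[str, str]:
    # Hypothetical helper; ControlFlow performs its own parsing internally.
    provider, sep, name = model.partition("/")
    if not sep or not provider or not name:
        raise ValueError(
            f"expected '{{provider key}}/{{model name}}', got {model!r}"
        )
    return provider, name

print(split_model_string("anthropic/claude-3-opus-20240229"))
# → ('anthropic', 'claude-3-opus-20240229')
```

A bare model name without a provider key (for example `"gpt-4o"`) would not match the format and raises a `ValueError` in this sketch.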
+
+
+### Manual configuration
+
+
 To configure a different LLM, follow these steps:
 <Steps>
 <Step title="Install required packages">
-To use an LLM, first make sure you have installed the appropriate provider package. ControlFlow only includes `langchain_openai` by default. For example, to use an Anthropic model, first run:
-```
-pip install langchain_anthropic
+To use an LLM, first make sure you have installed the appropriate [provider package](https://python.langchain.com/docs/integrations/chat/). For example, to use a Google model, run:
+
+```bash
+pip install langchain_google_genai
 ```
 </Step>
 <Step title="Configure API keys">
-You must provide the correct API keys and configuration for the LLM you want to use. These can be provided as environment variables or when you create the model in your script. For example, to use an Anthropic model, set the `ANTHROPIC_API_KEY` environment variable:
+You must provide the correct API keys and configuration for the LLM you want to use. These can be provided as environment variables or when you create the model in your script. For example, to use an OpenAI model, you must set the `OPENAI_API_KEY` environment variable:
 
+```bash
+export OPENAI_API_KEY=<your-api-key>
 ```
-export ANTHROPIC_API_KEY=<your-api-key>
-```
-For model-specific instructions, please refer to the provider's documentation.
+For model-specific instructions, please refer to the provider's [documentation](https://python.langchain.com/docs/integrations/chat/).
 </Step>
+
 <Step title="Create the model">
-Begin by creating the LLM object in your script. For example, to use Claude 3 Opus:
+Create the LLM model in your script, including any additional parameters. For example, to use Claude 3 Opus:
 
 ```python
 from langchain_anthropic import ChatAnthropic
 
 # create the model
 model = ChatAnthropic(model='claude-3-opus-20240229')
 ```
+
 </Step>
 <Step title="Pass the model to an agent">
-Next, create an agent with the specified model:
+Finally, configure an agent with the model:
 
 ```python
 import controlflow as cf
@@ -59,40 +93,8 @@ import controlflow as cf
 agent = cf.Agent(model=model)
 ```
 </Step>
-<Step title='Assign the agent to a task'>
-Finally, assign your agent to a task:
-
-```python
-# assign the agent to a task
-task = cf.Task('Write a short poem about LLMs', agents=[agent])
-
-# (optional) run the task
-task.run()
-```
-</Step>
 </Steps>
 
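The hand-off in the steps above — build a chat-model object, then pass it to the agent's `model` parameter — can be sketched offline with stand-in classes; `FakeChatModel` and `FakeAgent` are illustrative names only, standing in for the real `ChatAnthropic` and `cf.Agent`:

```python
# Stand-ins for illustration; real code uses the LangChain and ControlFlow classes.
class FakeChatModel:
    def __init__(self, model: str, temperature: float = 0.7):
        self.model = model
        self.temperature = temperature

class FakeAgent:
    def __init__(self, model):
        # the agent simply holds the configured model object
        self.model = model

model = FakeChatModel(model="claude-3-opus-20240229", temperature=0.2)
agent = FakeAgent(model=model)
print(agent.model.model)  # → claude-3-opus-20240229
```

The point of the pattern is that any parameters set on the model object travel with it; the agent does not re-configure the LLM.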
-<Accordion title="Click here to copy the entire example script">
-
-```python
-import controlflow as cf
-from langchain_anthropic import ChatAnthropic
-
-# create the model
-model = ChatAnthropic(model='claude-3-opus-20240229')
-
-# provide the model to an agent
-agent = cf.Agent(model=model)
-
-# assign the agent to a task
-task = cf.Task('Write a short poem about LLMs', agents=[agent])
-
-# (optional) run the task
-task.run()
-```
-</Accordion>
-
-
 ### Model configuration
 
 In addition to choosing a specific model, you can also configure the model's parameters. For example, you can set the temperature for GPT-4o:
@@ ... @@
-You can also specify a default model using a string, which is convenient though it doesn't allow you to configure advanced model settings. The string must have the form `<provider>/<model name>`.
+You can also specify a default model using a string, which is convenient though it doesn't allow you to configure advanced model settings. This must be a string in the form `{provider key}/{model name}`, following the same guidelines as [automatic LLM configuration](#automatic-configuration).
 
 You can apply this setting either by using an environment variable before you import ControlFlow or in your script at runtime. For example, to use GPT 3.5 Turbo as the default model:
@@ ... @@
+The default model can only be set by environment variable before importing ControlFlow. Once ControlFlow is imported, it reads the `controlflow.settings.llm_model` value to create the default model object.
 </Note>
-
-
-At this time, setting the default model via string is only supported for the following providers:
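The read-at-import-time behavior described in the note above can be sketched as follows; the environment-variable name `CONTROLFLOW_LLM_MODEL` and the `Settings` class are assumptions for illustration, not ControlFlow's exact implementation:

```python
import os

# Must be set before the library is imported (assumed variable name).
os.environ["CONTROLFLOW_LLM_MODEL"] = "openai/gpt-3.5-turbo"

class Settings:
    """Stand-in for a settings object that captures its value at import time."""
    def __init__(self):
        self.llm_model = os.environ.get("CONTROLFLOW_LLM_MODEL", "openai/gpt-4o")

settings = Settings()  # analogous to what happens during `import controlflow`
print(settings.llm_model)  # → openai/gpt-3.5-turbo
```

Setting the variable after the settings object is created would have no effect, which is why the note restricts this mechanism to pre-import configuration.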