**`src/langgraph-platform/assistants.mdx`** — 2 additions & 1 deletion

```diff
@@ -17,7 +17,8 @@ The LangGraph Cloud API provides several endpoints for creating and managing assistants

 ## Configuration

-Assistants build on the LangGraph open source concept of [configuration](/oss/graph-api#configuration).
+Assistants build on the LangGraph open source concept of [configuration](/oss/graph-api#runtime-context).
+
 While configuration is available in the open source LangGraph library, assistants are only present in [LangGraph Platform](/langgraph-platform/index). This is due to the fact that assistants are tightly coupled to your deployed graph. Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings.

 In practice, an assistant is just an _instance_ of a graph with a specific configuration. Therefore, multiple assistants can reference the same graph but can contain different configurations (e.g. prompts, models, tools). The LangGraph Server API provides several endpoints for creating and managing assistants. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) and [this how-to](/langgraph-platform/configuration-cloud) for more details on how to create assistants.
```
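The "an assistant is just an instance of a graph with a specific configuration" idea can be sketched in plain Python, with no LangGraph Platform involved. The `Assistant` class and its fields here are illustrative stand-ins, not the SDK's actual types:

```python
from dataclasses import dataclass, field


@dataclass
class Assistant:
    """Illustrative stand-in for a platform assistant:
    a named (graph_id, context) pair. Not the real SDK type."""
    graph_id: str
    name: str
    context: dict = field(default_factory=dict)


# Two assistants referencing the SAME deployed graph,
# differing only in their context (e.g. which model to use).
openai_assistant = Assistant("agent", "Open AI Assistant", {"model_name": "openai"})
anthropic_assistant = Assistant("agent", "Anthropic Assistant", {"model_name": "anthropic"})

assert openai_assistant.graph_id == anthropic_assistant.graph_id
assert openai_assistant.context != anthropic_assistant.context
```

Both objects point at the same graph; only the attached context differs, which is what lets one deployment serve many differently-configured assistants.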
**`src/langgraph-platform/configuration-cloud.mdx`** — 25 additions & 32 deletions

```diff
@@ -4,21 +4,20 @@ sidebarTitle: Manage assistants
 ---

 In this guide we will show how to create, configure, and manage an [assistant](/langgraph-platform/assistants).

-First, as a brief refresher on the concept of configurations, consider the following simple `call_model` node and configuration schema. Observe that this node tries to read and use the `model_name` as defined by the `config` object's `configurable`.
+First, as a brief refresher on the concept of context, consider the following simple `call_model` node and context schema.
+Observe that this node tries to read and use the `model_name` as defined by the `context` object's `model_name` field.
 …
 // We return a list, because this will get added to the existing list
 return { messages: [response] };
```
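The `call_model` node the diff refers to is not fully shown in the source. A minimal Python sketch of the new context-based pattern the `+` lines describe might look like the following; the schema fields and placeholder responses are assumptions, not the guide's actual code:

```python
from dataclasses import dataclass


@dataclass
class ContextSchema:
    """Illustrative context schema; the guide's real schema may differ."""
    model_name: str = "anthropic"


def call_model(state: dict, context: ContextSchema) -> dict:
    # Read model_name from the runtime context object, rather than from
    # config["configurable"] as in the old pattern this diff removes.
    if context.model_name == "openai":
        response = "<response from an OpenAI model>"  # placeholder, no API call
    else:
        response = "<response from an Anthropic model>"  # placeholder
    # We return a list, because this will get added to the existing list
    return {"messages": [response]}
```

The point of the migration is only where `model_name` is read from: a typed context object instead of the `configurable` dict.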
```diff
@@ -55,7 +53,7 @@ For more information on configurations, [see here](/langgraph-platform/configura

 To create an assistant, use the [LangGraph SDK](/langgraph-platform/sdk) `create` method. See the [Python](/langgraph-platform/python-sdk#langgraph_sdk.client.AssistantsClient.create) and [JS](/langgraph-platform/js-ts-sdk#create) SDK reference docs for more information.

-This example uses the same configuration schema as above, and creates an assistant with `model_name` set to `openai`.
+This example uses the same context schema as above, and creates an assistant with `model_name` set to `openai`.

 <Tabs>
 <Tab title="Python">
@@ -65,7 +63,7 @@ This example uses the same configuration schema as above, and creates an assista
 client = get_client(url=<DEPLOYMENT_URL>)
 openai_assistant = await client.assistants.create(
     # "agent" is the name of a graph we deployed
-    "agent", config={"configurable": {"model_name": "openai"}}, name="Open AI Assistant"
+    "agent", context={"model_name": "openai"}, name="Open AI Assistant"
 )

 print(openai_assistant)
@@ -79,7 +77,7 @@ This example uses the same configuration schema as above, and creates an assista
```
```diff
@@ -239,7 +235,7 @@ To edit the assistant, use the `update` method. This will create a new version o

 <Note>
 **Note**
-You must pass in the ENTIRE config (and metadata if you are using it). The update endpoint creates new versions completely from scratch and does not rely on previous versions.
+You must pass in the ENTIRE context (and metadata if you are using it). The update endpoint creates new versions completely from scratch and does not rely on previous versions.
 </Note>

 For example, to update your assistant's system prompt:
@@ -249,11 +245,9 @@ For example, to update your assistant's system prompt:
```
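The "pass in the ENTIRE context" warning can be illustrated locally. This is a hypothetical stand-in for the update endpoint's replace-not-merge behavior, not the real SDK `update` call:

```python
def update_assistant(versions: list, new_context: dict) -> dict:
    """Hypothetical stand-in for the update endpoint: each call creates a
    brand-new version from exactly what was passed in; it does NOT merge
    with the previous version."""
    version = {"version": len(versions) + 1, "context": dict(new_context)}
    versions.append(version)
    return version


versions = []
update_assistant(versions, {"model_name": "openai", "system_prompt": "You are helpful."})

# Passing only the field you want to change silently drops everything else:
latest = update_assistant(versions, {"system_prompt": "You are terse."})
assert "model_name" not in latest["context"]

# Correct usage: resend the entire context with just the one field changed.
latest = update_assistant(versions, {"model_name": "openai", "system_prompt": "You are terse."})
assert latest["context"]["model_name"] == "openai"
```

Because versions are built from scratch, a partial payload is not an incremental patch; it becomes the complete configuration of the new version.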