Replies: 1 comment
-
Yes — see the docs on creating a configurable model with `init_chat_model`: https://python.langchain.com/docs/how_to/chat_models_universal_init/#creating-a-configurable-model
-
In LangGraph, is it possible to use configuration parameters (like model_name or system messages) at the graph level rather than only inside a specific function like call_model? My issue with the current approach is that every time we call the agent, the call_model function is triggered and the model gets loaded again, which is time-consuming. I’d like to load the model once outside the function and still use configuration — is that possible?