How to set the Cache? #1230
Replies: 3 comments 3 replies
-
Are you trying to cache LLM calls or ALL of LangGraph? If LLM calls, it should work the same (assuming you are using LangChain LLMs). If all of LangGraph: not currently supported, but it's on our roadmap. Out of curiosity, what is the motivation there?
-
Any update on the LangGraph cache? I am looking to cache all calls.
-
Same question. If I invoke the LLM directly, it uses the cache. However, if I compile a graph with the same LLM inside (I even pass the cache argument to compile()), the LLM does not use the cache across runs even when the input is the same. I wonder why that is.
-
In LangChain, we can set the LLM cache. But I see that this setting alone is not sufficient for LangGraph to use the same cache. Am I doing something wrong, or is this not implemented for LangGraph?
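For context, LangChain's global cache is typically set with `set_llm_cache(InMemoryCache())` (from `langchain.globals` and `langchain_community.cache`). The behavior being asked about can be sketched in plain Python; `CachedLLM` and `fake_llm` below are hypothetical names for illustration, not LangChain's actual implementation:

```python
# Minimal sketch of prompt-keyed LLM caching: identical prompts are
# served from an in-memory store instead of re-invoking the model.
class CachedLLM:
    def __init__(self, llm):
        self._llm = llm    # underlying "model" callable
        self._cache = {}   # maps prompt -> response
        self.calls = 0     # counts real model invocations

    def invoke(self, prompt: str) -> str:
        if prompt not in self._cache:
            self.calls += 1
            self._cache[prompt] = self._llm(prompt)
        return self._cache[prompt]

def fake_llm(prompt: str) -> str:
    # stand-in for a real model call
    return f"echo: {prompt}"

llm = CachedLLM(fake_llm)
first = llm.invoke("hi")    # first call reaches the "model"
second = llm.invoke("hi")   # identical call is served from the cache
print(first, second, llm.calls)  # the model ran only once
```

The issue reported in this thread is that this prompt-level cache is honored when the LLM is invoked directly, but calls made from inside a compiled LangGraph graph did not appear to go through it at the time.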