Standardize & Abstract Prompt/Context Caching #848 #30731
rvndbalaji announced in Ideas
- [x] Feature request
Hi,
I noticed that context caching is supported by both Gemini and Claude, but the syntax differs between the two providers.
Is it possible to abstract this away and expose a general-purpose `cache_options` parameter that caters to different LLM providers? LangChain is supposed to be the abstraction layer that handles exactly this kind of difference, and I was hoping it could be implemented.
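For illustration, here is roughly how the two providers' caching is invoked today. This is a sketch from memory, so exact parameter names may differ across integration versions; in particular, the `cached_content` keyword on `ChatVertexAI` is an assumption about that integration.

```python
# Anthropic: caching is opted into per content block via "cache_control",
# mirroring Anthropic's prompt-caching API as exposed through LangChain.
from langchain_anthropic import ChatAnthropic

claude = ChatAnthropic(model="claude-3-5-sonnet-latest")
claude_messages = [
    {
        "role": "system",
        "content": [
            {
                "type": "text",
                "text": "<long reference document>",
                # Anthropic-specific: mark this block as cacheable
                "cache_control": {"type": "ephemeral"},
            }
        ],
    },
    {"role": "user", "content": "Summarize the document."},
]
claude.invoke(claude_messages)

# Gemini: caching is a separate CachedContent resource created up front
# and then referenced by the model (shown here via the Vertex AI integration).
from langchain_google_vertexai import ChatVertexAI

gemini = ChatVertexAI(
    model_name="gemini-1.5-pro",
    cached_content="projects/.../cachedContents/...",  # pre-created cache handle
)
gemini.invoke("Summarize the document.")
```

Callers have to know both mechanisms today: one is declared inline on message content, the other is an out-of-band resource. That asymmetry is exactly what a shared abstraction could hide.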
Motivation
I need to implement a caching mechanism for both Gemini and Anthropic, but I want the implementation to be vendor-neutral.
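A vendor-neutral surface might look something like the following. This is a hypothetical sketch only: `CacheOptions` and the `cache_options` keyword do not exist in LangChain today; they just illustrate the requested abstraction.

```python
# Hypothetical sketch: a neutral options object that each provider
# integration translates into its native caching mechanism.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CacheOptions:
    enabled: bool = True
    ttl_seconds: Optional[int] = 300   # how long cached context should live
    min_tokens: Optional[int] = None   # skip caching for short prompts


# Each integration would map these fields to its own mechanism
# (Anthropic's cache_control blocks, Gemini's CachedContent resource),
# so caller code stays identical across vendors:
#
#   llm = ChatAnthropic(model="...", cache_options=CacheOptions(ttl_seconds=300))
#   llm = ChatVertexAI(model_name="...", cache_options=CacheOptions(ttl_seconds=300))
```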
FYI - @lkuligin @baskaryan @efriis
Proposal (If applicable)
No response