Add Prompt Caching Feature for Anthropic and OpenAI clients #29660
smartinezbragado announced in Ideas
Feature request
Add support for the prompt caching features offered by Anthropic and OpenAI.
Motivation
This feature is key to reducing the cost of LLM calls, since cached prompt tokens are billed at a discounted rate on repeated requests.
Proposal (If applicable)
No response
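No proposal was attached, so purely as an illustration of the vendor APIs involved (not a design for this integration): Anthropic exposes prompt caching explicitly through a `cache_control` field on content blocks, while OpenAI applies caching automatically to prompt prefixes of 1024+ tokens and reports hits in the usage object. Below is a minimal sketch against the `anthropic` Python SDK; `LONG_SYSTEM_PROMPT` and the model name are placeholders.

```python
# Sketch: explicit prompt caching with the Anthropic SDK.
import anthropic

# Stand-in for a large, stable prefix; caching requires a minimum prompt size.
LONG_SYSTEM_PROMPT = "You are a helpful assistant. " * 200

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
response = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=512,
    system=[
        {
            "type": "text",
            "text": LONG_SYSTEM_PROMPT,
            # Marks everything up to and including this block as a cacheable prefix.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarize the document."}],
)
# Usage metadata shows whether the prefix was written to or read from the cache.
print(response.usage.cache_creation_input_tokens,
      response.usage.cache_read_input_tokens)
```

On the OpenAI side no request change is needed; the useful part for a client wrapper is surfacing the hit count:

```python
# Sketch: OpenAI caches long prompt prefixes automatically;
# hits show up in the usage details.
from openai import OpenAI

LONG_SYSTEM_PROMPT = "You are a helpful assistant. " * 200

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": LONG_SYSTEM_PROMPT},
        {"role": "user", "content": "Summarize the document."},
    ],
)
print(resp.usage.prompt_tokens_details.cached_tokens)
```

A low-risk integration could be a client option that injects `cache_control` on the system/prefix blocks for Anthropic and passes the cached-token counts through in response metadata for both providers; that is an assumption about the integration surface, not a confirmed design.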