Supporting prompt caching in BedrockChatCompletion #12742
joshuamosesb started this conversation in Ideas
As posted in the Q&A item "support for Prompt Caching in BedrockChatCompletion": one approach could be, in `bedrock_chat_completion.py`, to have `_prepare_settings_for_request` check `self.ai_model_id` and whether that model supports prompt caching, and then update / overload the four functions mapped in the `MESSAGE_CONVERTERS` dict (either with an optional parameter or to support prompt caching by default) so the converted messages include the `cachePoint` marker.
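
To illustrate the idea, here is a minimal sketch (not the semantic-kernel implementation) of how one `MESSAGE_CONVERTERS` entry could be wrapped so its output ends with a `cachePoint` content block. The converter signature, the `_supports_prompt_caching` helper, and the model allow-list are assumptions for illustration; the `{"cachePoint": {"type": "default"}}` block is the Bedrock Converse API's prompt-caching marker.

```python
from typing import Any, Callable, Dict

# Hypothetical allow-list of model ids assumed to support prompt caching.
PROMPT_CACHING_MODELS = {
    "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "anthropic.claude-3-7-sonnet-20250219-v1:0",
}


def _supports_prompt_caching(ai_model_id: str) -> bool:
    """Assumed helper: decide from the model id whether caching applies."""
    return ai_model_id in PROMPT_CACHING_MODELS


def with_cache_point(
    converter: Callable[[Any], Dict[str, Any]],
    ai_model_id: str,
) -> Callable[[Any], Dict[str, Any]]:
    """Wrap one MESSAGE_CONVERTERS entry to append a cachePoint block."""

    def wrapped(message: Any) -> Dict[str, Any]:
        converted = converter(message)
        if _supports_prompt_caching(ai_model_id):
            # Append the Converse-API cache checkpoint after the message content.
            converted.setdefault("content", []).append(
                {"cachePoint": {"type": "default"}}
            )
        return converted

    return wrapped
```

With a wrapper like this, `_prepare_settings_for_request` could build the converter map per request, e.g. `{role: with_cache_point(fn, self.ai_model_id) for role, fn in MESSAGE_CONVERTERS.items()}`, leaving the existing converters untouched for models that do not support caching.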