AWS Bedrock Cached Tokens Info in Langfuse #10898
Unanswered
fabriciojoc asked this question in Q&A
Replies: 1 comment · 1 reply
-
Does Langfuse have fields we can send this to? We do track this already and use it for calculating costs.
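For anyone who reports usage to Langfuse themselves, one approach is to merge the Bedrock cache counters into the usage payload alongside the regular token counts. The Bedrock-side field names below match the AWS documentation; the `build_usage_details` helper and the outgoing key names are assumptions for illustration, not a confirmed Langfuse schema:

```python
# Hypothetical helper: merge Bedrock/Anthropic cache counters into the
# usage dict sent to an observability backend. The incoming field names
# match the AWS prompt-caching docs; the outgoing keys are assumptions.
def build_usage_details(bedrock_usage: dict) -> dict:
    details = {
        "input": bedrock_usage.get("input_tokens", 0),
        "output": bedrock_usage.get("output_tokens", 0),
    }
    # Only attach cache counters when the model actually reported them,
    # so responses without caching keep a minimal payload.
    for key in ("cache_creation_input_tokens", "cache_read_input_tokens"):
        if key in bedrock_usage:
            details[key] = bedrock_usage[key]
    return details


# Example: a response where 2048 prompt tokens were served from cache.
usage_details = build_usage_details({
    "input_tokens": 10,
    "output_tokens": 5,
    "cache_read_input_tokens": 2048,
})
print(usage_details)
```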
-
Hello there! First of all, congrats on this awesome project!

I've been using it connected to AWS Bedrock (Claude 3.7) and Langfuse 3.29.0. I recently enabled prompt caching, but I do not see any cache info in Langfuse (such as `cache_creation_input_tokens` or `cache_read_input_tokens`, as documented by AWS: https://aws.amazon.com/blogs/machine-learning/effectively-use-prompt-caching-on-amazon-bedrock/). Am I missing a configuration, or is it not supported by LiteLLM yet? I noticed in the integration code that you are getting only `prompt_tokens` and `completion_tokens`; shouldn't it include the cached tokens there? Thanks!