
Commit cdd9aac

fix tokens calculation for bedrock models when cache tokens are present in response
1 parent 00b527c commit cdd9aac

File tree

1 file changed: +4 -0 lines changed


src/providers/bedrock/chatComplete.ts

Lines changed: 4 additions & 0 deletions
@@ -696,6 +696,10 @@ export const BedrockChatCompleteStreamChunkTransform: (
       cacheWriteInputTokens,
       completion_tokens: parsedChunk.usage.outputTokens,
       total_tokens: parsedChunk.usage.totalTokens,
+      prompt_tokens_details: {
+        cached_tokens: cacheReadInputTokens,
+      },
+      // we only want to be sending this for anthropic models and this is not openai compliant
       ...((cacheReadInputTokens > 0 || cacheWriteInputTokens > 0) && {
         cache_read_input_tokens: parsedChunk.usage.cacheReadInputTokens,
         cache_creation_input_tokens:
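For context, the following is a minimal, self-contained TypeScript sketch of how the streamed usage object could be assembled after this change. The type names, the toOpenAIUsage helper, and the prompt_tokens arithmetic are illustrative assumptions rather than the repository's actual code; only the prompt_tokens_details.cached_tokens field and the conditional Anthropic-style cache fields mirror the diff above.

interface BedrockUsage {
  inputTokens: number;
  outputTokens: number;
  totalTokens: number;
  cacheReadInputTokens?: number;
  cacheWriteInputTokens?: number;
}

interface OpenAICompatibleUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  prompt_tokens_details: { cached_tokens: number };
  // Anthropic-style fields, included only when cache tokens are present;
  // not part of the OpenAI usage schema.
  cache_read_input_tokens?: number;
  cache_creation_input_tokens?: number;
}

// Hypothetical helper mirroring the logic shown in the diff.
function toOpenAIUsage(usage: BedrockUsage): OpenAICompatibleUsage {
  const cacheReadInputTokens = usage.cacheReadInputTokens ?? 0;
  const cacheWriteInputTokens = usage.cacheWriteInputTokens ?? 0;
  return {
    // Assumption: cached tokens count toward prompt_tokens; the real
    // calculation lives outside the hunk shown above.
    prompt_tokens:
      usage.inputTokens + cacheReadInputTokens + cacheWriteInputTokens,
    completion_tokens: usage.outputTokens,
    total_tokens: usage.totalTokens,
    // New in this commit: expose cached prompt tokens via the
    // OpenAI-compliant prompt_tokens_details block.
    prompt_tokens_details: {
      cached_tokens: cacheReadInputTokens,
    },
    // Anthropic-style cache fields are only sent when cache tokens are
    // present, since they are not OpenAI compliant.
    ...((cacheReadInputTokens > 0 || cacheWriteInputTokens > 0) && {
      cache_read_input_tokens: cacheReadInputTokens,
      cache_creation_input_tokens: cacheWriteInputTokens,
    }),
  };
}

// Example: a chunk that read 100 tokens from the prompt cache.
const usage = toOpenAIUsage({
  inputTokens: 20,
  outputTokens: 50,
  totalTokens: 170,
  cacheReadInputTokens: 100,
  cacheWriteInputTokens: 0,
});
console.log(usage.prompt_tokens_details.cached_tokens); // 100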
