Closed
Labels: bug, epic, invalid
Description
When tool calling is used, the chat model's ChatResponse metadata does not accumulate the usage metrics (prompt, completion, and total tokens) across all of the chat responses involved, including the intermediate tool-calling rounds; only the last invocation's usage is reported.
This epic tracks the fix needed for all supported models:
- AnthropicChatModel : Fix Anthropic chat model functioncalling token usage #1918
- AzureOpenAiChatModel : Fix Azure OpenAI chat model function calling token usage #1916
- BedrockProxyChatModel : Using BedrockProxyChatModel with tools, chatResponse.metadata.usage is returning ONLY the last model invocation usage #1743
- OpenAiChatModel : Fix OpenAI ChatResponse usage calculation when toolcalling is used #1872
- MiniMaxChatModel
- MistralAiChatModel : Fix Mistral AI Chat model function call usage calculation #1905
- MoonshotChatModel
- OllamaChatModel
- VertexAiGeminiChatModel
- ZhiPuAiChatModel
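The expected behavior can be sketched as follows: each round trip to the model (the tool-call request and the final answer) reports its own usage, and the metadata on the returned ChatResponse should be the element-wise sum of those counters. This is a minimal illustrative sketch, not the Spring AI API; the `Usage` record, its field names, and the token figures are assumptions for demonstration only.

```java
// Illustrative sketch of usage accumulation across tool-calling rounds.
// The Usage record here is hypothetical, not Spring AI's Usage interface.
record Usage(long promptTokens, long completionTokens, long totalTokens) {

    // Combine this usage with another round's usage by summing each counter.
    Usage plus(Usage other) {
        return new Usage(
                promptTokens + other.promptTokens,
                completionTokens + other.completionTokens,
                totalTokens + other.totalTokens);
    }
}

public class UsageAccumulationSketch {
    public static void main(String[] args) {
        // Round 1: the model decides to invoke a tool (example figures).
        Usage toolCallRound = new Usage(100, 20, 120);
        // Round 2: the model produces the final answer from the tool result.
        Usage finalRound = new Usage(150, 40, 190);

        // The bug: metadata reports only finalRound.
        // The fix: metadata should report the accumulated total.
        Usage accumulated = toolCallRound.plus(finalRound);
        System.out.println(accumulated.promptTokens()
                + " " + accumulated.completionTokens()
                + " " + accumulated.totalTokens());
    }
}
```

With the example figures above, the corrected metadata would report 250 prompt, 60 completion, and 310 total tokens, rather than the last round's 150/40/190.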