Token cost is counted even if LangChain's LLM cache is utilized #2111
Answered by marcklingen

arthurGrigo asked this question in Support
Not sure if this behaviour is intended, but when I use LangChain's LLM cache together with get_openai_callback(), it shows a total cost of $0 for prompts I executed before. In Langfuse, however, I see the cost of the prompt as if the request had actually been sent to the API. @marcklingen Bug or feature?
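For context, a minimal sketch that reproduces the reported behavior (assumptions: OpenAI and Langfuse credentials are set via environment variables; the model name, prompt, and exact import paths are illustrative and may differ across LangChain/Langfuse versions):

```python
# Minimal repro sketch: LangChain LLM cache + get_openai_callback() + Langfuse.
from langchain.globals import set_llm_cache
from langchain_community.cache import InMemoryCache
from langchain_community.callbacks import get_openai_callback
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler

set_llm_cache(InMemoryCache())  # enable LangChain's in-memory LLM cache

llm = ChatOpenAI(model="gpt-3.5-turbo")  # model name is a placeholder
langfuse_handler = CallbackHandler()  # reads LANGFUSE_* env vars

# First call: goes to the OpenAI API, so real token usage is recorded
# both by get_openai_callback() and in the Langfuse trace.
with get_openai_callback() as cb:
    llm.invoke("Tell me a joke", config={"callbacks": [langfuse_handler]})
    print(cb.total_cost)  # > 0

# Second, identical call: served from the cache. get_openai_callback()
# reports $0, but the Langfuse trace still shows the full prompt cost.
with get_openai_callback() as cb:
    llm.invoke("Tell me a joke", config={"callbacks": [langfuse_handler]})
    print(cb.total_cost)  # 0.0 here, yet Langfuse shows a nonzero cost
```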
Replies: 1 comment

marcklingen · May 21, 2024
I’d go for bug here, thanks for reporting! Tracking the desired behavior here: #2112
Answer selected by marcklingen