Tokens used by 1 conversation - anthropic #4368
Unanswered
luciannuta asked this question in Q&A

Hello, does having a longer conversation make it increasingly costly because the API always reads the previous messages? I feel like this is how the Anthropic API worked for me.

Replies: 2 comments · 1 reply
- Yes, but prompt caching should help with this.
  1 reply
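
For reference, a minimal sketch of what that can look like with the Anthropic Python SDK: the last block of the accumulated history is marked with `cache_control`, so on later turns the shared prefix can be billed as a cheaper cache read instead of full-price input tokens. The model name and message contents are illustrative, and `ANTHROPIC_API_KEY` is assumed to be set in the environment.

```python
import anthropic

# Sketch of prompt caching for a growing conversation.
# Assumes the `anthropic` package is installed and ANTHROPIC_API_KEY is set.
client = anthropic.Anthropic()

# The history that gets re-sent (and billed as input tokens) on every turn.
conversation = [
    {"role": "user", "content": "Here is a long document: ..."},  # illustrative
    {"role": "assistant", "content": "Got it. What would you like to know?"},
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Summarize the key points.",
                # Cache everything up to and including this block; later requests
                # that reuse this prefix are billed as cheaper cache reads.
                # Note: prefixes shorter than the model's minimum cacheable
                # length are not cached, so a real history must be long enough.
                "cache_control": {"type": "ephemeral"},
            }
        ],
    },
]

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative model name
    max_tokens=1024,
    messages=conversation,
)

# usage separates cached from uncached input tokens, so you can see how much
# of the growing history was served from cache on each turn.
print(
    response.usage.input_tokens,
    response.usage.cache_creation_input_tokens,
    response.usage.cache_read_input_tokens,
)
```

Caveats: only prefixes above the model's minimum cacheable length are actually cached, cache entries expire after a short time-to-live if unused, and older SDK versions required an explicit prompt-caching beta header, so behavior may vary with your client version.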