Conversation

bhouston
Member

@bhouston bhouston commented Mar 3, 2025

This PR implements token caching for the Vercel AI SDK with the Anthropic provider. It adds the `providerOptions.anthropic.cacheControl: 'ephemeral'` property to the last two messages in the conversation, which allows the conversation up to that point to be cached (with a ~5 minute window), reducing token consumption on repeated API calls.

Fixes #58
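The approach described above can be sketched roughly as follows. This is not the PR's actual code: the `Message` type is a simplified stand-in for the SDK's message shape, the helper name `markLastTwoForCaching` is hypothetical, and the `{ type: 'ephemeral' }` object form is an assumption about how the SDK expects the cache-control value to be structured.

```typescript
// Simplified stand-in for the Vercel AI SDK message type (assumption, not
// the SDK's real type definition).
type Message = {
  role: string;
  content: string;
  providerOptions?: {
    anthropic?: { cacheControl?: { type: string } };
  };
};

// Hypothetical helper: annotate the last two messages with Anthropic's
// ephemeral cache-control marker, so the provider can cache the
// conversation prefix up to those points (~5 minute window).
function markLastTwoForCaching(messages: Message[]): Message[] {
  return messages.map((message, index) =>
    index >= messages.length - 2
      ? {
          ...message,
          providerOptions: {
            anthropic: { cacheControl: { type: 'ephemeral' } },
          },
        }
      : message,
  );
}
```

On repeated calls, earlier messages are unchanged, so the cached prefix grows with the conversation and only the newest turns are billed at the full input-token rate.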

@bhouston bhouston force-pushed the feature/issue-58-token-caching branch from 15ac90d to 870cbee Compare March 3, 2025 21:18
@bhouston bhouston merged commit 73604de into main Mar 4, 2025
1 check failed
@bhouston bhouston deleted the feature/issue-58-token-caching branch March 12, 2025 02:08