
Add promptCacheKey for any provider with npm @ai-sdk/openai #4413

Closed

shantur wants to merge 1 commit into anomalyco:dev from shantur:openai-compatible-cache

Conversation

@shantur (Contributor) commented Nov 17, 2025

v2 of OpenAI caching for providers whose npm package is @ai-sdk/openai.
Includes a fix for #4386.

@rekram1-node (Collaborator)

Yeah, this one should be fine, but I'm curious: what is your need for this? @shantur, what provider are you using?

@shantur (Contributor, Author) commented Nov 17, 2025

@rekram1-node - OpenAI Codex via CLIProxyAPI

@shantur (Contributor, Author) commented Nov 17, 2025

Having said that, this is needed for any provider that supports the OpenAI APIs.

@rekram1-node (Collaborator)

@shantur Hmm, but this will still cause errors for certain people. I know people who have special proxies set up internally that use that openai provider but route through a proxy; it will error if you set this across the board.

Also, a lot of providers handle caching automatically; OpenRouter, for example, doesn't need that key AFAIK (could be wrong).

@rekram1-node (Collaborator)

BTW, you can set this using a plugin if you need to.

shantur force-pushed the openai-compatible-cache branch from 9c593ca to 78e74e7 on November 17, 2025 at 17:49
@shantur (Contributor, Author) commented Nov 17, 2025

@rekram1-node - As per the official OpenAI and @ai-sdk/openai docs, this is the expected behaviour: https://ai-sdk.dev/providers/ai-sdk-providers/openai#responses-models

If people are using it for their setups, then it's an incorrect setup; they should be using https://ai-sdk.dev/providers/openai-compatible-providers with "@ai-sdk/openai-compatible".
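The behaviour being argued for can be sketched in plain TypeScript (a minimal sketch; `buildProviderOptions` and the `Provider` shape are hypothetical, not part of opencode or the AI SDK — only the `@ai-sdk/openai` check mirrors this PR):

```typescript
// Sketch: per the linked docs, promptCacheKey is an OpenAI Responses API
// option, so it should only be sent when the provider's npm package is
// "@ai-sdk/openai". Other providers may reject the unknown field.
type Provider = { npm?: string }

function buildProviderOptions(provider: Provider, sessionID: string): Record<string, unknown> {
  const options: Record<string, unknown> = {}
  if (provider.npm === "@ai-sdk/openai") {
    // OpenAI can reuse cached prompt prefixes across requests sharing a key.
    options.promptCacheKey = sessionID
  }
  return options
}
```

Under this sketch a provider backed by a different package (e.g. Fireworks) gets no cache key at all, which is the outcome #4386 asked for.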

@rekram1-node (Collaborator)

The openai-compatible provider doesn't support the Responses API, so they can't.

@shantur (Contributor, Author) commented Nov 17, 2025

What would be your suggestion for how this could be handled?
Some flag in the provider config, like enablePromptCache, which is disabled by default and, when enabled, allows promptCacheKey?
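The opt-in flag proposed above could look roughly like this (a hypothetical sketch: `enablePromptCache` is a proposal in this thread, not an existing opencode config option):

```typescript
// Hypothetical provider-config shape with the proposed opt-in flag.
interface ProviderConfig {
  npm?: string
  enablePromptCache?: boolean // absent/false by default
}

function shouldSendPromptCacheKey(cfg: ProviderConfig): boolean {
  // Only send promptCacheKey when the user has explicitly opted in, so
  // proxies and OpenAI-compatible backends are never broken by default.
  return cfg.npm === "@ai-sdk/openai" && cfg.enablePromptCache === true
}
```

Defaulting to off keeps the #4386 fix intact while letting Codex-style setups opt back in.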

@rekram1-node (Collaborator)

@shantur Maybe not the perfect solution, but this should work:

import { Plugin } from "./index"

// Inject promptCacheKey only when the provider is backed by @ai-sdk/openai,
// leaving every other provider untouched.
export const SetCacheKey: Plugin = async (ctx) => {
  return {
    "chat.params": async (input, output) => {
      if (input.provider.npm === "@ai-sdk/openai") {
        output.options["promptCacheKey"] = input.sessionID
      }
    },
  }
}

@shantur (Contributor, Author) commented Nov 17, 2025

@rekram1-node Thanks for the workaround, but I feel this is such a core feature that it should be handled in the provider config, where we are defining the type of the provider.

@rekram1-node (Collaborator)

That's fair. Wanna add a setting, then, and we can do that?

@duggthangs

> If people are using for their setups then it's incorrect setup and they should be using - https://ai-sdk.dev/providers/openai-compatible-providers with "@ai-sdk/openai-compatible"

So "Fireworks.ai" is a provider that is supported by https://ai-sdk.dev/providers/ai-sdk-providers/fireworks, and it complains about this promptCacheKey being there.

I raised the original issue that called for the removal of that code in #4386, which fixed it. This change just adds it back.

@shantur Does this handle the use case for #4386?

@shantur (Contributor, Author) commented Nov 23, 2025

@rekram1-node - Closing this in favor of #4654

shantur closed this on Nov 23, 2025

3 participants