[gpt-5.*] 'max_tokens' is not supported with this model, when using azure BYOK #435
Open
Description
When routing to Azure via BYOK, OpenRouter forwards `max_tokens` as-is instead of translating it to `max_completion_tokens`. Since the GPT-5* endpoints have deprecated `max_tokens`, the call fails with:
```json
{
  "error": {
    "message": "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.",
    "type": "invalid_request_error",
    "param": "max_tokens",
    "code": "unsupported_parameter"
  }
}
```

Response metadata:
```json
{
  "provider_name": "Azure",
  "is_byok": true
}
```

Setup:
- Model: `openai/gpt-5.2-chat`
- Provider routing: `only: ["azure"]`
- Using `@openrouter/ai-sdk-provider@2.1.1` and `ai@6.0.49` (Vercel AI SDK)
- No `max_tokens` or `max_completion_tokens` explicitly set in the request
Expected behavior:
OpenRouter should translate `max_tokens` to `max_completion_tokens` when forwarding to Azure for GPT-5 models, as it does for non-BYOK requests.
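For illustration, the expected translation could look like the sketch below. This is not OpenRouter's actual code; the `isGpt5Model` helper and the prefix-based model check are assumptions made here to show the intended rename of the parameter before forwarding to Azure:

```typescript
type CompletionParams = Record<string, unknown>;

// Assumption: any model id starting with "openai/gpt-5" requires the
// newer `max_completion_tokens` parameter. Hypothetical helper, not
// OpenRouter's real model-matching rule.
function isGpt5Model(modelId: string): boolean {
  return modelId.startsWith("openai/gpt-5");
}

// Rename `max_tokens` to `max_completion_tokens` for GPT-5* models,
// leaving all other parameters (and all other models) untouched.
function translateParams(
  modelId: string,
  params: CompletionParams,
): CompletionParams {
  if (!isGpt5Model(modelId) || !("max_tokens" in params)) return params;
  const { max_tokens, ...rest } = params;
  return { ...rest, max_completion_tokens: max_tokens };
}
```

A translation like this on the BYOK path would make the Azure route behave the same as the non-BYOK route described above.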
Workaround:
Routing to a non-Azure BYOK provider avoids the issue.