Commit 662e6fb

fix: max complete token for openai gen structured (#438)

1 parent 81657a1

File tree

1 file changed: +9 −1 lines

src/mcp_agent/workflows/llm/augmented_llm_openai.py

Lines changed: 9 additions & 1 deletion
```diff
@@ -486,8 +486,16 @@ async def generate_structured(
             "model": model,
             "messages": messages,
             "response_format": response_format,
-            "max_tokens": params.maxTokens,
         }
+
+        # Use max_completion_tokens for reasoning models, max_tokens for others
+        if self._reasoning(model):
+            # DEPRECATED: https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens
+            # "max_tokens": params.maxTokens,
+            payload["max_completion_tokens"] = params.maxTokens
+            payload["reasoning_effort"] = self._reasoning_effort
+        else:
+            payload["max_tokens"] = params.maxTokens
         user = params.user or getattr(self.context.config.openai, "user", None)
         if user:
             payload["user"] = user
```
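The fix can be illustrated as a minimal standalone sketch of the parameter-selection logic. Note this is an assumption-laden illustration, not the library's code: the real class delegates the model check to `self._reasoning(model)` and reads the effort from `self._reasoning_effort`; here a naive name-prefix check and plain function arguments stand in for both.

```python
# Minimal sketch of the patched behavior (assumptions noted inline).

def is_reasoning_model(model: str) -> bool:
    # Assumption for illustration: reasoning models are named o1*/o3*.
    # The real implementation uses self._reasoning(model).
    return model.startswith(("o1", "o3"))


def build_payload(model: str, max_tokens: int,
                  reasoning_effort: str = "medium") -> dict:
    payload = {"model": model}
    if is_reasoning_model(model):
        # Reasoning models deprecate max_tokens; send max_completion_tokens
        # (and a reasoning_effort) instead.
        payload["max_completion_tokens"] = max_tokens
        payload["reasoning_effort"] = reasoning_effort
    else:
        # Non-reasoning models keep the original max_tokens parameter.
        payload["max_tokens"] = max_tokens
    return payload
```

Before the patch, `max_tokens` was sent unconditionally, which reasoning models reject; branching on the model keeps both families working with the same call site.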
