
fix: map Responses API text.format to response_format in chat completions conversion#5621

Open
ink-the-squid wants to merge 1 commit into Helicone:main from ink-the-squid:fix/responses-api-structured-output

Conversation

@ink-the-squid
Contributor

Problem

When using the Responses API (/v1/responses) through the AI Gateway with BYOK Azure endpoints, structured output (text.format with json_schema) is silently dropped. The provider returns plain text instead of structured JSON, breaking client.responses.parse() and Pydantic validation.

Works: Azure direct, OAI proxy, raw OpenAI
Fails: AI Gateway BYOK → Azure

Root Cause

The toChatCompletions() function converts Responses API requests to Chat Completions format for providers that don't natively support the Responses API (like Azure). It was mapping body.response_format (Chat Completions field) but ignoring body.text.format (where structured output is actually specified in the Responses API).

In the Responses API, structured output is:

{"text": {"format": {"type": "json_schema", "schema": {...}}}}

In Chat Completions, it's:

{"response_format": {"type": "json_schema", "json_schema": {"schema": {...}}}}

The conversion was missing entirely.
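In TypeScript terms, the missing translation between the two shapes above might look like the following. This is a minimal sketch based on the JSON examples in this PR description; the interface and function names here are hypothetical, not Helicone's actual code:

```typescript
// Responses API structured-output shape: schema fields sit directly
// under text.format.
interface ResponsesTextFormat {
  type: "json_schema" | "text";
  name?: string;
  description?: string;
  schema?: Record<string, unknown>;
  strict?: boolean;
}

// Chat Completions shape: the same fields are nested one level deeper,
// under response_format.json_schema.
interface ChatCompletionsResponseFormat {
  type: "json_schema";
  json_schema: {
    name?: string;
    description?: string;
    schema?: Record<string, unknown>;
    strict?: boolean;
  };
}

// Hypothetical helper illustrating the re-nesting the conversion needs.
function mapTextFormat(
  format: ResponsesTextFormat
): ChatCompletionsResponseFormat | undefined {
  if (format.type !== "json_schema") return undefined;
  return {
    type: "json_schema",
    json_schema: {
      name: format.name,
      description: format.description,
      schema: format.schema,
      strict: format.strict,
    },
  };
}
```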

Fix

  • Added a format field to the ResponsesRequestBody.text type definition
  • Mapped body.text.format → response_format in toChatCompletions()
  • Handled the json_schema type, carrying over name, description, schema, and strict
  • Fell back to body.response_format for backward compatibility
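The precedence described in the last two bullets could be sketched as follows (a hypothetical standalone helper for illustration; the real logic lives inside toChatCompletions() and may be structured differently):

```typescript
// Sketch of the fallback order: prefer the Responses API field
// (body.text.format), otherwise pass through the Chat Completions
// field (body.response_format) for backward compatibility.
interface RequestBody {
  text?: { format?: { type: string; schema?: object } };
  response_format?: object;
}

function pickStructuredOutput(body: RequestBody): object | undefined {
  const fmt = body.text?.format;
  if (fmt?.type === "json_schema") {
    // New path: Responses API structured output, re-nested for
    // Chat Completions.
    return { type: "json_schema", json_schema: { schema: fmt.schema } };
  }
  // Old path: a Chat Completions-style field, kept working as before.
  return body.response_format;
}
```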

Files Changed

  • packages/llm-mapper/transform/types/responses.ts — added text.format type
  • packages/llm-mapper/transform/providers/responses/request/toChatCompletions.ts — mapping logic
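The type addition in responses.ts might look roughly like this. This is a hedged sketch: only text.format and response_format come from the PR description, while the other fields (model, input) and the exact optionality are assumptions:

```typescript
// Hypothetical shape of the Responses API request body after adding
// the text.format field (the actual ResponsesRequestBody definition
// in packages/llm-mapper may differ).
export interface ResponsesRequestBody {
  model?: string; // assumed field, shown for context
  input?: unknown; // assumed field, shown for context
  text?: {
    format?: {
      type: "json_schema" | "text";
      name?: string;
      description?: string;
      schema?: Record<string, unknown>;
      strict?: boolean;
    };
  };
  // Retained so callers that already send a Chat Completions-style
  // response_format keep working (backward compatibility).
  response_format?: unknown;
}
```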

…ions conversion

When the AI Gateway converts Responses API requests to Chat Completions
format for providers like Azure that don't natively support the Responses
API, the structured output schema (text.format) was being dropped.

The toChatCompletions() function only mapped body.response_format (which
doesn't exist in Responses API requests) and ignored body.text.format
(where structured output is actually specified in the Responses API).

This caused structured output requests through AI Gateway BYOK to return
plain text instead of JSON, breaking Pydantic validation on the client
side.

Changes:
- Add text.format type to ResponsesRequestBody (was missing)
- Map body.text.format -> response_format in toChatCompletions()
- Handle json_schema type with name, description, schema, strict fields
- Fall back to body.response_format for backward compatibility

Fixes: AI Gateway structured output dropping for Azure BYOK endpoints
@vercel

vercel bot commented Mar 3, 2026

@ink-the-squid is attempting to deploy a commit to the Helicone Team on Vercel.

A member of the Team first needs to authorize it.


@greptile-apps greptile-apps bot left a comment


Your free trial has ended. If you'd like to continue receiving code reviews, you can add a payment method here.

