fix: use Anthropic's native output_format for structured output #26
Closed
03-CiprianoG wants to merge 3 commits into Helicone:main from
Conversation
When using `Output.object({ schema })` or `generateObject()` with
Anthropic models via the Helicone gateway, the AI SDK passes
`responseFormat: { type: "json", schema }` to `doGenerate()`.
Previously, this was always converted to OpenAI's
`response_format: { type: "json_schema", json_schema: { schema } }`
format. The Helicone gateway passes this through to Anthropic, but
Anthropic ignores the `response_format` field entirely since it's
not part of their API spec. The model then returns prose/markdown
instead of JSON, causing `NoObjectGeneratedError`.
This fix detects Anthropic models (modelId starting with
`anthropic/`) and uses Anthropic's native `output_format` parameter
instead:
```json
{
  "output_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": { ... },
      "name": "response"
    }
  }
}
```
Non-Anthropic models continue using `response_format` as before.
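The branching described above can be sketched as follows. This is an illustration only: `buildStructuredOutput` and the `JsonResponseFormat` type are hypothetical stand-ins for the provider's internal `buildRequestBody()` logic, not its actual API.

```typescript
// Hypothetical sketch of the provider-selection branch described in this PR.
interface JsonResponseFormat {
  type: "json";
  schema: Record<string, unknown>;
}

function buildStructuredOutput(
  modelId: string,
  responseFormat: JsonResponseFormat
) {
  const jsonSchema = { schema: responseFormat.schema, name: "response" };
  if (modelId.startsWith("anthropic/")) {
    // Anthropic's native parameter; `strict` is OpenAI-specific, so it is omitted.
    return { output_format: { type: "json_schema", json_schema: jsonSchema } };
  }
  // All other providers keep the existing OpenAI-compatible behavior.
  return {
    response_format: {
      type: "json_schema",
      json_schema: { ...jsonSchema, strict: true },
    },
  };
}
```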
The Helicone gateway accepts model IDs in two formats:

- `anthropic/claude-sonnet-4-6` (provider/model)
- `claude-4.6-sonnet/anthropic` (model/provider)

The Anthropic detection now checks both `startsWith` and `endsWith` to handle either format.
Use `modelId.includes('anthropic')` to handle any model ID format (`anthropic/model`, `model/anthropic`, etc.).
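As a rough illustration, a single `includes` check covers both ID orderings; the helper names below are illustrative, not the provider's real code:

```typescript
// Illustrative detection covering both gateway model-ID formats:
//   "anthropic/claude-sonnet-4-6"  (provider/model)
//   "claude-4.6-sonnet/anthropic"  (model/provider)
const isAnthropicModel = (modelId: string): boolean =>
  modelId.includes("anthropic");

// Equivalent to the earlier startsWith/endsWith pair for these two formats:
const isAnthropicStrict = (modelId: string): boolean =>
  modelId.startsWith("anthropic/") || modelId.endsWith("/anthropic");
```

Note that `includes` is broader: it would also match `anthropic` appearing anywhere else in the ID, which is the trade-off behind the suggestion above.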
Author

Closing this PR — the root cause is in the AI Gateway, not the AI SDK provider. Filed a proper issue and PR on the gateway repo instead:

Author

AI Gateway (the real fix):
Fixes #25
Problem
When using `Output.object({ schema })` or `generateObject()` with Anthropic models via the Helicone gateway, the model returns prose instead of JSON because `response_format` (OpenAI's parameter) is sent but Anthropic ignores it. See #25 for full details, reproduction steps, and request logs.
Solution
Detect Anthropic models (`this.modelId.startsWith('anthropic/')`) in `buildRequestBody()` and use Anthropic's native `output_format` parameter instead of OpenAI's `response_format`.

Before (broken for Anthropic):

```json
{
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": { ... },
      "strict": true,
      "name": "response"
    }
  }
}
```

Anthropic ignores `response_format` → model returns prose → `NoObjectGeneratedError`.

After (works for all providers):
Anthropic models → use `output_format` (native Anthropic API):

```json
{
  "output_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": { ... },
      "name": "response"
    }
  }
}
```

All other models → continue using `response_format` (OpenAI-compatible), unchanged.

Changes
`src/helicone-language-model.ts`: Updated `buildRequestBody()` to branch on `this.modelId.startsWith('anthropic/')` when handling `responseFormat`. Anthropic models get `output_format`; others get `response_format` as before.

Notes
- `output_format` does not support `strict: true` (OpenAI-specific), so it's omitted for Anthropic models.
- `output_format` reaches Anthropic's API correctly.
- `@ai-sdk/anthropic` handles structured output natively (via the `outputFormat` structured output mode).
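The `strict` note above can be illustrated with a small conversion sketch from the OpenAI-style body to the Anthropic-style one. The field shapes are taken from the JSON snippets in this PR; the function name is hypothetical, not the gateway's actual implementation.

```typescript
// Illustrative conversion: OpenAI response_format -> Anthropic output_format,
// dropping the OpenAI-specific `strict` flag, which Anthropic does not accept.
type OpenAIResponseFormat = {
  type: "json_schema";
  json_schema: { schema: Record<string, unknown>; name: string; strict?: boolean };
};

function toAnthropicOutputFormat(rf: OpenAIResponseFormat) {
  // Destructure `strict` out so the remaining fields carry over unchanged.
  const { strict, ...json_schema } = rf.json_schema;
  return { output_format: { type: "json_schema" as const, json_schema } };
}
```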