[BUG] v1.12.0: JSON parse error with Claude models via OpenAI-compatible endpoints (LiteLLM/OpenRouter) #959

@bytefrostdev

Description

After upgrading to v1.12.0, Claude models (e.g. claude-haiku-4.5) fail with JSON parse errors when used via OpenAI-compatible endpoints like LiteLLM or OpenRouter. The same setup worked perfectly in v1.11.x.

Error

SyntaxError: Unexpected token '`', "```json
{
"... is not valid JSON
    at JSON.parse (<anonymous>)
    at async p.generateObject (...)

Root Cause

The migration from LangChain to Vercel AI SDK in v1.12.0 introduced this regression. The generateObject() function expects raw JSON, but Claude (and potentially other models) often wrap their JSON output in markdown code blocks when accessed via OpenAI-compatible proxies.

LangChain was tolerant of this. Vercel AI SDK's generateObject() is not.

Workaround

Switching to Gemini models (e.g. gemini-2.5-flash) works; they return raw JSON without markdown wrapping.

Suggested Fix

Strip markdown code block wrappers from LLM responses before parsing JSON.

Environment

  • Perplexica: v1.12.0 (slim image)
  • Provider: LiteLLM (OpenAI-compatible endpoint)
  • Chat model: claude-haiku-4.5
