Description
After upgrading to v1.12.0, Claude models (e.g. claude-haiku-4.5) fail with JSON parse errors when used via OpenAI-compatible endpoints like LiteLLM or OpenRouter. The same setup worked perfectly in v1.11.x.
Error
````
SyntaxError: Unexpected token '`', "```json
{
"... is not valid JSON
    at JSON.parse (<anonymous>)
    at async p.generateObject (...)
````
Root Cause
The migration from LangChain to Vercel AI SDK in v1.12.0 introduced this regression. The generateObject() function expects raw JSON, but Claude (and potentially other models) often wrap their JSON output in markdown code blocks when accessed via OpenAI-compatible proxies.
LangChain was tolerant of this. Vercel AI SDK's generateObject() is not.
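The failure is easy to reproduce outside the SDK: `JSON.parse` rejects the first backtick of the fence, which matches the stack trace above. A minimal sketch:

```typescript
// Minimal repro of the failure mode: a model response wrapped in a
// markdown code fence is not valid JSON, so JSON.parse throws.
const fenced = '```json\n{"answer": 42}\n```';

let failed = false;
try {
  JSON.parse(fenced);
} catch (err) {
  // SyntaxError: Unexpected token '`' ... is not valid JSON
  failed = true;
}
console.log(failed); // true
```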
Workaround
Switching to Gemini models (e.g. gemini-2.5-flash) works: they return raw JSON without markdown wrapping.
Suggested Fix
Strip markdown code-fence wrappers from the model response before parsing it as JSON, restoring the tolerance LangChain had.
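A sketch of what that fix could look like (`stripMarkdownFence` is a hypothetical helper, not existing Perplexica code; the regex only handles a single leading/trailing fence, optionally tagged `json`):

```typescript
// Hypothetical helper: remove a surrounding ```json ... ``` (or bare ```)
// fence before handing the text to JSON.parse.
function stripMarkdownFence(text: string): string {
  const match = text.trim().match(/^```(?:json)?\s*\n?([\s\S]*?)\n?```$/);
  return match ? match[1] : text.trim();
}

// Usage: fenced Claude-style output now parses cleanly.
const wrapped = '```json\n{"model": "claude-haiku-4.5"}\n```';
const parsed = JSON.parse(stripMarkdownFence(wrapped));
console.log(parsed.model); // claude-haiku-4.5
```

Unfenced responses pass through unchanged, so the helper is safe to apply to models (like Gemini) that already return raw JSON.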
Environment
- Perplexica: v1.12.0 (slim image)
- Provider: LiteLLM (OpenAI-compatible endpoint)
- Chat model: claude-haiku-4.5