fix: strip markdown code block wrappers before JSON parsing#1032

Open
MaxwellCalkin wants to merge 1 commit into ItzCrazyKns:master from MaxwellCalkin:fix/claude-json-code-blocks-959

Conversation


MaxwellCalkin commented Mar 8, 2026

Summary

Fixes #959

After the v1.12.0 migration from LangChain to Vercel AI SDK, Claude models (and potentially others) fail with SyntaxError: Unexpected token when accessed via OpenAI-compatible proxies like LiteLLM or OpenRouter. The root cause is that these models wrap JSON output in markdown code blocks (```json ... ```), and JSON.parse cannot handle the wrapper.

Changes

  • Added stripCodeBlockWrappers() helper function to both openaiLLM.ts and ollamaLLM.ts
  • Applied the function before repairJson/JSON.parse in generateObject() (both providers)
  • Applied the function before parse() in streamObject() final chunk handling (OpenAI provider)
  • The regex handles: ```json ... ```, ``` ... ```, and surrounding whitespace/newlines
  • Since Anthropic, Gemini, Groq, Lemonade, and LM Studio providers all extend OpenAILLM, the fix covers all OpenAI-compatible providers automatically
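
The helper and its call site can be sketched as follows. The regex bodies match the diff shown later in this thread; `parseModelJson` is a hypothetical stand-in for the parsing step inside `generateObject()`:

```typescript
// Remove a leading ```json (or bare ```) fence and a trailing ```
// fence so the remaining text is plain JSON.
function stripCodeBlockWrappers(text: string): string {
  return text
    .replace(/^```(?:json)?\s*\n?/i, '')
    .replace(/\n?```\s*$/i, '');
}

// Hypothetical call site: strip fences before handing the text to
// repairJson/JSON.parse, as the PR does in generateObject().
function parseModelJson(raw: string): unknown {
  return JSON.parse(stripCodeBlockWrappers(raw));
}
```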

Why not just rely on repairJson({ extractJson: true })?

While repairJson with extractJson: true is already used, it doesn't reliably handle the markdown code block wrapper pattern (```json\n{...}\n```), as evidenced by the bug report. Adding explicit stripping before repair ensures robustness.

Test plan

  • Verify Claude models (e.g. claude-haiku-4.5) work via LiteLLM/OpenRouter without JSON parse errors
  • Verify models that return raw JSON (e.g. Gemini, GPT) continue to work normally
  • Verify Ollama models continue to work normally
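
The string-level part of this plan can be sanity-checked directly against the helper (a sketch; the helper body matches the PR's diff):

```typescript
function stripCodeBlockWrappers(text: string): string {
  return text
    .replace(/^```(?:json)?\s*\n?/i, '')
    .replace(/\n?```\s*$/i, '');
}

// Fenced output, as Claude returns it via LiteLLM/OpenRouter proxies.
const claudeStyle = stripCodeBlockWrappers('```json\n{"ok": true}\n```');

// Raw JSON, as Gemini/GPT and Ollama models typically return it.
const rawStyle = stripCodeBlockWrappers('{"ok": true}');
```

`JSON.parse` succeeds on both results, which covers the first two bullets at the unit level; the end-to-end runs above are still needed for real verification.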

Note: This PR was authored by Claude (AI), operated by @MaxwellCalkin.


Summary by cubic

Fixes JSON parsing failures by stripping markdown code block fences from model responses before parsing. Prevents SyntaxError when Claude and other OpenAI-compatible providers return fenced JSON via proxies like LiteLLM or OpenRouter.

  • Bug Fixes
    • Added stripCodeBlockWrappers() in openaiLLM.ts and ollamaLLM.ts.
    • Applied before repairJson/JSON.parse in generateObject (both providers).
    • Applied before parse for final text chunk in streamObject (OpenAI provider).
    • Handles ```json ... ``` and generic ``` ... ``` fences with optional surrounding whitespace.
    • Covers providers extending OpenAILLM (e.g., Anthropic, Gemini, Groq, Lemonade, LM Studio).
    • Continues to use @toolsycc/json-repair with extractJson: true for robustness.

Written for commit 49a5a74. Summary will update on new commits.

…teObject

Claude and other models accessed via OpenAI-compatible proxies (LiteLLM,
OpenRouter) wrap JSON output in markdown code blocks (```json ... ```),
causing JSON.parse to fail with 'Unexpected token' errors.

Add stripCodeBlockWrappers() to both OpenAI and Ollama LLM providers to
remove these wrappers before passing response text to repairJson/JSON.parse.

Fixes ItzCrazyKns#959
cubic-dev-ai (bot) left a comment


1 issue found across 2 files



<file name="src/lib/models/providers/openai/openaiLLM.ts">

<violation number="1" location="src/lib/models/providers/openai/openaiLLM.ts:30">
P2: Code-block stripping misses leading whitespace/newline before opening fence, so markdown-wrapped JSON can still fail to parse.</violation>
</file>


function stripCodeBlockWrappers(text: string): string {
  return text
    .replace(/^```(?:json)?\s*\n?/i, '')
    .replace(/\n?```\s*$/i, '');
}

cubic-dev-ai (bot) commented Mar 8, 2026


P2: Code-block stripping misses leading whitespace/newline before opening fence, so markdown-wrapped JSON can still fail to parse.


<file context>
@@ -20,6 +20,17 @@ import {
+ */
+function stripCodeBlockWrappers(text: string): string {
+  return text
+    .replace(/^```(?:json)?\s*\n?/i, '')
+    .replace(/\n?```\s*$/i, '');
+}
</file context>
Suggested change:

-    .replace(/^```(?:json)?\s*\n?/i, '')
+    .replace(/^\s*```(?:json)?\s*\n?/i, '')
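
The case behind the suggestion can be illustrated directly (a sketch, not code from the PR):

```typescript
// A proxy response with a stray newline before the opening fence.
const fenced = '\n```json\n{"a": 1}\n```';

// Original pattern: `^` anchors at the very start of the string, so
// the leading newline keeps the opening fence from matching at all.
const withOriginal = fenced.replace(/^```(?:json)?\s*\n?/i, '');

// Suggested pattern tolerates whitespace before the opening fence.
const withFix = fenced
  .replace(/^\s*```(?:json)?\s*\n?/i, '')
  .replace(/\n?```\s*$/i, '');
```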



Development

Successfully merging this pull request may close these issues.

[BUG] v1.12.0: JSON parse error with Claude models via OpenAI-compatible endpoints (LiteLLM/OpenRouter)
