## Summary

Multiple bugs in response and error handling cause LM-Proxy to return 500 Internal Server Errors in cases that should produce proper error responses (e.g., 502 for upstream failures).
## Problem Description

LM-Proxy has several bugs that produce incorrect HTTP status codes and unhelpful error messages:
### Bug #1: Non-Streaming Response Reconstruction Failure
**What happens:**

When a non-streaming request is made, the proxy receives a valid JSON response from the upstream provider. However, instead of forwarding this response as-is, the code attempts to reconstruct it.
**The buggy code:**

```python
return JSONResponse(
    {
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": str(out)},  # BUG: converts response to string
                "finish_reason": "stop",
            }
        ]
    }
)
```

**Why this fails:**
- The `out` variable contains a microcore `LLMResponse` object
- Doing `str(out)` converts the entire response object to a string
- This creates malformed JSON, or crashes when microcore internally fails to parse
**Error message:**

```
AttributeError: 'str' object has no attribute 'choices'
```
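The failure mode can be reproduced without the proxy. A minimal sketch, assuming `LLMResponse` is a `str` subclass that carries structured attributes (`FakeLLMResponse` below is a hypothetical stand-in, not microcore's real class):

```python
class FakeLLMResponse(str):
    """Hypothetical stand-in for microcore's LLMResponse: a str subclass
    that also carries structured fields (assumption for illustration)."""
    def __new__(cls, content):
        obj = super().__new__(cls, content)
        obj.choices = [{"index": 0, "message": {"role": "assistant", "content": content}}]
        return obj

out = FakeLLMResponse("Hello!")
print(out.choices[0]["message"]["content"])  # Hello!

# The proxy's str(out) produces a plain str, dropping the structured fields:
flattened = str(out)
try:
    flattened.choices
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'choices'
```

Any code that later expects the structured object therefore crashes with exactly the `AttributeError` shown above.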
### Bug #2: Invalid Upstream Responses Cause 500 Instead of 502

**What happens:**

When an upstream provider returns a non-JSON response (e.g., an HTML error page) or a 503 error, the proxy crashes with a 500 Internal Server Error.
**Example scenarios:**

- Provider returns HTML instead of JSON (`Content-Type: text/html`)
- Provider returns `503 Service Unavailable`
- Provider returns rate limit errors
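The 500s come from letting the JSON parse error propagate. A stdlib-only sketch of the intended translation (the helper name is hypothetical; LM-Proxy's real code path may differ):

```python
import json

def translate_upstream_body(body: str, content_type: str):
    """Return (status_code, payload) for the downstream client.
    Illustrative only, not LM-Proxy's actual handler."""
    try:
        return 200, json.loads(body)
    except json.JSONDecodeError:
        # A non-JSON body (e.g., an HTML error page) is a provider fault,
        # so report it as a 502 Bad Gateway, not a 500:
        return 502, {"error": {
            "message": f"Upstream provider error: non-JSON response ({content_type})",
            "type": "upstream_error",
        }}

status, payload = translate_upstream_body("<html>503</html>", "text/html")
print(status)                          # 502
print(payload["error"]["type"])        # upstream_error
```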
**Before fix** (unhelpful 500):

```json
{"error": "Internal Server Error"}
```

**Expected** (clear 502):

```json
{"error": {"message": "Upstream provider error...", "type": "upstream_error"}}
```

### Bug #3: Poor Error Logging
**What happens:**

When errors occur, the logs don't provide enough information for debugging.
**Current behavior:**

```
DEBUG: Could not read JSON from response data...
INFO: 127.0.0.1 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
```
**Expected behavior:**

```
ERROR: Upstream provider returned non-JSON response. Content-Type was likely 'text/html'...
INFO: 127.0.0.1 - "POST /v1/chat/completions HTTP/1.1" 502 Bad Gateway
```
### Bug #4: Microcore `dict()` Serialization Edge Case

**What happens:**

Microcore's `LLMResponse.dict()` method sometimes returns malformed data.

**Error message:**

```
ValueError: dictionary update sequence element #0 has length 1; 2 is required
```
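Python's `dict()` raises exactly this error when fed an iterable whose elements are not key/value pairs, e.g., a plain string. A minimal reproduction (`BrokenResponse` is a hypothetical object mimicking the edge case, not microcore's class):

```python
class BrokenResponse:
    """Hypothetical object whose dict() misbehaves, mimicking the
    microcore edge case described above."""
    def dict(self):
        # Internally ends up doing something like dict(<string>):
        # iterating "x" yields the 1-character element "x",
        # which cannot be unpacked into a (key, value) pair.
        return dict("x")

try:
    BrokenResponse().dict()
except ValueError as e:
    print(e)  # dictionary update sequence element #0 has length 1; 2 is required
```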
## Expected Solution
For Bug #1 - Forward Response As-Is
# Forward the response as-is if it's a dict/model, otherwise reconstruct
if hasattr(out, "model_dump"):
return JSONResponse(out.model_dump())
elif hasattr(out, "dict"):
return JSONResponse(out.dict())
elif isinstance(out, dict):
return JSONResponse(out)
else:
# Fallback: reconstruct response from string
return JSONResponse({...})For Bug #2 - Graceful Error Handling
```python
try:
    out = await async_llm_func(request.messages, **llm_params)
except (AttributeError, ValueError) as e:
    return JSONResponse(
        {"error": {"message": f"Upstream provider error: {e}", "type": "upstream_error"}},
        status_code=502,
    )
```

### For Bug #3 - Better Error Logging
```python
logging.error(
    "Upstream provider returned non-JSON response. "
    "Content-Type was likely 'text/html' instead of 'application/json'. "
    "This is a provider issue, not a proxy issue."
)
```

### For Bug #4 - Safe Serialization
```python
try:
    dict_out = out.dict()
    if isinstance(dict_out, dict):
        return JSONResponse(dict_out)
except (ValueError, TypeError, AttributeError) as e:
    logging.debug("Failed to serialize out.dict(): %s", e)
```

## Requirements
- Non-streaming responses should be forwarded as-is from upstream
- Invalid upstream responses should return 502 (not 500)
- Error messages should be descriptive and actionable
- All edge cases must have tests
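Since all edge cases must be tested, a pytest-style sketch of what such tests could look like. The `serialize` helper below is hypothetical, merging the Bug #1 and Bug #4 snippets; LM-Proxy's real internals may differ:

```python
def serialize(out):
    """Hypothetical mirror of the fallback chain from 'Expected Solution'."""
    if hasattr(out, "model_dump"):
        return out.model_dump()
    if hasattr(out, "dict"):
        try:
            d = out.dict()
            if isinstance(d, dict):
                return d
        except (ValueError, TypeError):
            pass  # Bug #4: malformed dict() output falls through
    if isinstance(out, dict):
        return out
    # Fallback: reconstruct an OpenAI-style response from the string form
    return {"choices": [{"index": 0,
                         "message": {"role": "assistant", "content": str(out)},
                         "finish_reason": "stop"}]}

class PydanticLike:
    def model_dump(self):
        return {"choices": []}

class BrokenDict:
    def dict(self):
        return dict("x")  # raises ValueError (Bug #4 edge case)

def test_model_dump_wins():
    assert serialize(PydanticLike()) == {"choices": []}

def test_plain_dict_passthrough():
    assert serialize({"ok": 1}) == {"ok": 1}

def test_broken_dict_falls_back_to_string():
    assert serialize(BrokenDict())["choices"][0]["finish_reason"] == "stop"
```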
## Additional Context
These bugs were discovered when testing with providers that return non-standard responses. The fixes ensure LM-Proxy handles edge cases gracefully.