feat(openai): add chat/completions api_format config option #902
Open
aimbit-ni wants to merge 3 commits into prism-php:main from
Conversation
`StreamEndEvent.usage` can be null when providers don't include usage data in their final stream chunk, causing a `TypeError` downstream. Add a `?? new Usage(0, 0)` fallback to `emitStreamEndEvent()` in all providers missing it, matching the existing pattern in the OpenAI stream handler.
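The guard is a one-line null-coalescing fallback. A minimal sketch of the pattern described above — the surrounding handler shape, the `event()` dispatch, and the `Usage` constructor signature are assumptions, not the PR's literal diff:

```php
// Hypothetical excerpt of a provider's stream handler. Some backends
// omit usage data in the final stream chunk, so $usage can be null.
protected function emitStreamEndEvent(?Usage $usage): void
{
    // Fall back to zeroed token counts instead of passing null through,
    // which previously caused a TypeError in StreamEndEvent consumers.
    event(new StreamEndEvent(
        usage: $usage ?? new Usage(0, 0),
    ));
}
```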
Add an `api_format` config option to the OpenAI driver that allows switching from the default `/responses` endpoint to `/chat/completions`. This enables using Prism with OpenAI-compatible backends like vLLM, LiteLLM, and LocalAI that only implement the chat/completions API. Set `OPENAI_API_FORMAT=chat_completions` in your env to use it. Only the text, structured, and stream methods dispatch conditionally — other modalities (embeddings, images, moderation, TTS, STT) already use standard endpoints that work with compatible backends as-is.
…nfigured — Providers that reject unknown parameters (e.g. Perplexity via LiteLLM) return HTTP 400 when `"tools": []` is sent. Return `null` instead so `Arr::whereNotNull()` filters it out entirely.
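The fix trades an empty array for `null` so the key is dropped before the request body is serialized. A sketch of the idea, assuming a Laravel-style payload builder (the method and surrounding class are hypothetical; `Arr::whereNotNull()` is the real Laravel helper):

```php
use Illuminate\Support\Arr;

// Hypothetical tool mapper: return null (not []) when no tools are
// configured, so the "tools" key never reaches strict backends.
protected function mapTools(array $tools): ?array
{
    return $tools === [] ? null : $tools;
}

// Arr::whereNotNull() strips null values, so "tools" is omitted
// entirely instead of being sent as an empty array.
$payload = Arr::whereNotNull([
    'model' => 'gpt-4o',
    'tools' => $this->mapTools($tools),
]);
```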
Summary
- Adds an `api_format` config option to the OpenAI driver (default: `responses`, alternative: `chat_completions`)
- `chat_completions` targets the `/chat/completions` endpoint
- `text()`, `structured()`, and `stream()` dispatch conditionally — other modalities (embeddings, images, moderation, TTS, STT) already use standard endpoints that work as-is

Usage
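A hedged sketch of what the configuration might look like — the `providers.openai` key layout and env var names mirror common Prism/Laravel conventions and the PR description, but the exact config shape is an assumption:

```php
// config/prism.php — point the OpenAI driver at a chat/completions backend
'providers' => [
    'openai' => [
        'url' => env('OPENAI_URL', 'https://api.openai.com/v1'),
        'api_key' => env('OPENAI_API_KEY'),
        // 'responses' (default) or 'chat_completions'
        'api_format' => env('OPENAI_API_FORMAT', 'responses'),
    ],
],
```

With `OPENAI_API_FORMAT=chat_completions` set, text, structured, and stream requests are routed to `/chat/completions` instead of `/responses`.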
New files
Handlers/ChatCompletions/Text.php,Structured.php,Stream.php— chat/completions protocol handlersMaps/ChatCompletionsMessageMap.php— standard message format (texttype, flat assistant content)Maps/ChatCompletionsFinishReasonMap.php— mapsstop/tool_calls/lengthMaps/ChatCompletionsToolMap.php,ChatCompletionsToolChoiceMap.php— nestedfunctionkey formatMaps/ChatCompletionsImageMapper.php—image_urlformatRefs #900
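For context on why separate maps are needed: the two endpoints disagree on message shape. A sketch of the chat/completions form the new message map would emit — the field names follow OpenAI's public chat/completions schema, but the mapper's exact output is an assumption:

```php
// chat/completions message shape: flat string content for system and
// assistant turns; typed parts ('text', 'image_url') for multimodal
// user turns. The /responses endpoint uses a different part taxonomy.
$messages = [
    ['role' => 'system', 'content' => 'You are helpful.'],
    ['role' => 'user', 'content' => [
        ['type' => 'text', 'text' => 'Describe this image.'],
        ['type' => 'image_url', 'image_url' => ['url' => 'https://example.com/cat.png']],
    ]],
    ['role' => 'assistant', 'content' => 'It is a cat.'],
];
```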
Test plan
- `OPENAI_API_FORMAT=chat_completions` with a LiteLLM/vLLM proxy