fix: enhance llm-katan OpenAI API compatibility for issue #241 #354
Merged: Xunzhuo merged 7 commits into vllm-project:main from yossiovadia:fix/llm-katan-openai-compatibility-241 on Oct 7, 2025
+49 −5
Conversation
…t#241
- Add missing OpenAI API response fields (system_fingerprint, logprobs, detailed usage)
- Fix streaming response Content-Type from text/plain to text/event-stream
- Ensure both static and streaming responses include all compatibility fields
- Add token_usage alias for better SDK compatibility
- Apply fixes to both TransformersBackend and VLLMBackend

Resolves OpenWebUI hanging issue when connecting to llm-katan endpoints.

Signed-off-by: Yossi Ovadia <[email protected]>
Published llm-katan v0.1.9 to PyPI with OpenAI API compatibility fixes. Signed-off-by: Yossi Ovadia <[email protected]>
rootfs previously approved these changes on Oct 6, 2025
@yossiovadia can you run pre-commit?
Trigger CI re-run to verify if Black formatting issues are resolved. Signed-off-by: Yossi Ovadia <[email protected]>
👥 vLLM Semantic Team Notification: The following members have been identified for the changed files in this PR and have been automatically assigned.
Xunzhuo approved these changes on Oct 7, 2025
Summary
Fixes llm-katan OpenAI API compatibility issues causing OpenWebUI to hang when connecting to llm-katan endpoints.
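The hang traces back to streaming responses being served as text/plain, which OpenAI SDK clients do not treat as a server-sent-event stream. A minimal sketch of correctly formatted streaming chunks is below; it is illustrative only, not llm-katan's actual implementation, and the system_fingerprint value is a made-up placeholder:

```python
import json
import time
import uuid


def sse_chunks(model, deltas):
    """Yield OpenAI-style chat.completion.chunk events as SSE frames.

    The HTTP response carrying these frames must use
    Content-Type: text/event-stream (not text/plain), otherwise
    SDK clients such as OpenWebUI can hang waiting for events.
    """
    chunk_id = f"chatcmpl-{uuid.uuid4().hex[:12]}"
    created = int(time.time())
    for delta in deltas:
        payload = {
            "id": chunk_id,
            "object": "chat.completion.chunk",
            "created": created,
            "model": model,
            "system_fingerprint": "fp_placeholder",  # hypothetical value
            "choices": [
                {
                    "index": 0,
                    "delta": {"content": delta},
                    "logprobs": None,
                    "finish_reason": None,
                }
            ],
        }
        # Each SSE frame is "data: <json>" followed by a blank line.
        yield f"data: {json.dumps(payload)}\n\n"
    # OpenAI streams terminate with a literal [DONE] sentinel.
    yield "data: [DONE]\n\n"
```

A server would pass this generator to its streaming response helper with the event-stream media type set explicitly.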
Changes Made
- Add missing OpenAI API response fields: system_fingerprint, logprobs, and a detailed usage object
- Fix streaming response Content-Type from text/plain to text/event-stream
- Add token_usage alias for better SDK compatibility

Problem Solved
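For the non-streaming path, the fields listed above can be sketched in a response body like the following. This is a hedged illustration, not llm-katan's code: the system_fingerprint value is a placeholder, and token_usage is simply mirrored from usage as the PR's alias describes:

```python
import time
import uuid


def completion_response(model, text, prompt_tokens, completion_tokens):
    """Build a chat.completion body carrying the compatibility fields."""
    usage = {
        "prompt_tokens": prompt_tokens,
        "completion_tokens": completion_tokens,
        "total_tokens": prompt_tokens + completion_tokens,
    }
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "system_fingerprint": "fp_placeholder",  # hypothetical value
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "logprobs": None,  # present but null, as OpenAI returns it
                "finish_reason": "stop",
            }
        ],
        "usage": usage,
        "token_usage": usage,  # alias for SDKs that read this key
    }
```

Returning both usage and the token_usage alias keeps stricter OpenAI SDK clients and looser third-party clients working against the same endpoint.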
Testing
Fixes #241