fix: some minor fixes #223
Merged · +157 −60
Commits (11)
- e0ddfe2 fix: enforce minimum vllm version (jakelorocco)
- 70bac6a fix: remove tests that look for "β" (jakelorocco)
- b0bd8ee fix: remove default response_format from litellm and openai backends (jakelorocco)
- f2c067c fix: remove xdist from pytests (jakelorocco)
- e521377 fix: fix vllm tests (jakelorocco)
- 7d27532 fix: vllm async event loop (jakelorocco)
- 70d92f1 feat: add contexts to validation results (jakelorocco)
- fa4c4c6 fix: add warning for mps with huggingface generate from raw (jakelorocco)
- b8582e6 fix: remove .py from folder name (jakelorocco)
- 46bc909 fix: remove pytest-xdist specific args (jakelorocco)
- 692b395 fix: add exception with vllm backend when env var not set (jakelorocco)
Optional: `assert el == self._underlying_model.background_loop`? cf. https://docs.vllm.ai/en/v0.6.3/_modules/vllm/engine/async_llm_engine.html#AsyncLLMEngine.start_background_loop
They call that a `background_loop`, but it's not an event loop; it's actually a Future. Even the `_background_loop_unshielded` is a Task object. I think it's fine to manage the reference to the event loop on our side. We only ever have the one `AsyncLLMEngine` per `LocalVLLMBackend`, so there shouldn't be issues with us tracking it this way. Happy to change it if it causes issues later on.
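The distinction above can be illustrated with a small asyncio sketch. This is not the PR's actual code; the `task`/`shielded` names here are stand-ins mirroring vLLM's `_background_loop_unshielded` (a Task) and `background_loop` (a shielded Future) as described in the linked docs, showing why neither object is the event loop itself and why the loop reference has to be tracked separately.

```python
import asyncio

async def _background_work() -> str:
    # Stand-in for the engine's background request-processing coroutine.
    await asyncio.sleep(0)
    return "done"

async def inspect_background_objects() -> dict:
    loop = asyncio.get_running_loop()

    # Analogous to `_background_loop_unshielded`: a Task object.
    task = asyncio.ensure_future(_background_work())
    # Analogous to `background_loop`: a shielded Future, not an event loop.
    shielded = asyncio.shield(task)
    result = await shielded

    return {
        "task_is_task": isinstance(task, asyncio.Task),
        "shielded_is_future": isinstance(shielded, asyncio.Future),
        "shielded_is_event_loop": isinstance(shielded, asyncio.AbstractEventLoop),
        # Tracking the loop reference ourselves, one loop per backend instance,
        # is the approach the reply above describes.
        "task_ran_on_tracked_loop": task.get_loop() is loop,
        "result": result,
    }

facts = asyncio.run(inspect_background_objects())
```

Since `asyncio.shield` returns a plain Future wrapping the Task, comparing it against a stored event-loop reference (as the optional assertion suggested) would always fail, which is why keeping the loop reference on the backend side is the simpler check.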