
fix async continuous batching #429

Merged
aniketmaurya merged 4 commits into main from aniket/fix-vllm
Feb 18, 2025

Conversation

@aniketmaurya
Collaborator

@aniketmaurya aniketmaurya commented Feb 18, 2025

What does this PR do?

Libraries like vLLM sometimes return an empty list from the step call. This PR handles that case so async continuous batching doesn't break.
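A minimal sketch of the failure mode and the fix, assuming a vLLM-style engine whose `step()` may return an empty list when no requests finish in an iteration (the `FakeEngine` class and `drain` loop below are illustrative stand-ins, not the actual LitServe code):

```python
class FakeEngine:
    """Stand-in for an inference engine whose step() may yield nothing."""

    def __init__(self, batches):
        self._batches = iter(batches)

    def step(self):
        # Like vLLM, a step may complete zero requests and return [].
        return next(self._batches, [])


def drain(engine, max_steps=10):
    """Collect finished outputs, tolerating empty step() results."""
    outputs = []
    for _ in range(max_steps):
        results = engine.step()
        if not results:   # empty list: nothing finished this iteration
            continue      # keep looping instead of treating [] as output
        outputs.extend(results)
    return outputs
```

Without the `if not results` guard, a loop that indexes into or yields from the step result would mishandle the empty-list iterations.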

Before submitting
  • Was this discussed/agreed via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@codecov

codecov bot commented Feb 18, 2025

Codecov Report

Attention: Patch coverage is 0% with 3 lines in your changes missing coverage. Please review.

Project coverage is 90%. Comparing base (49cedbd) to head (332c89f).
Report is 2 commits behind head on main.

Additional details and impacted files
@@         Coverage Diff         @@
##           main   #429   +/-   ##
===================================
  Coverage    89%    90%           
===================================
  Files        31     31           
  Lines      2025   2026    +1     
===================================
+ Hits       1806   1817   +11     
+ Misses      219    209   -10     

@aniketmaurya aniketmaurya changed the title [wip] fix async continuous batching fix async continuous batching Feb 18, 2025
@ali-alshaar7
Contributor

Should we reorder this one too?

self.response_queue_ids[uid] = response_queue_id
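A hedged sketch of why ordering can matter here, assuming the usual producer/worker setup where a request is published to a queue and the worker later looks up its response queue by `uid` (the `submit` function, `request_queue`, and `response_queue_ids` dict below are hypothetical names for illustration, not the LitServe internals under review):

```python
import queue

# Maps request uid -> id of the queue where its result should be sent.
response_queue_ids = {}
# Requests visible to worker threads/processes.
request_queue = queue.Queue()


def submit(uid, response_queue_id, payload):
    # Record the routing info FIRST...
    response_queue_ids[uid] = response_queue_id
    # ...then publish the request. A worker that dequeues it immediately
    # is guaranteed to find the response queue id already registered.
    request_queue.put((uid, payload))
```

If the two statements were swapped, a fast worker could dequeue the request before `response_queue_ids[uid]` exists and fail the lookup, which is the kind of race the reorder question is getting at.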

@aniketmaurya aniketmaurya merged commit 62b1243 into main Feb 18, 2025
21 checks passed
@aniketmaurya aniketmaurya deleted the aniket/fix-vllm branch February 18, 2025 10:29