
[Chore] Import BaseRenderer From Package#1441

Closed
alex-jw-brooks wants to merge 1 commit into vllm-project:main from alex-jw-brooks:fix_rendering_import

Conversation

@alex-jw-brooks
Contributor

The BaseRenderer in vLLM has moved since 0.16.0, which will break the import in chat serving if you're developing on the latest versions of vLLM and vLLM Omni together.

This PR changes the import to take BaseRenderer from the package, since that path is compatible with both 0.16.0 (ref) and the current code (ref).
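A minimal sketch of the import change described above (module paths are taken from the PR text; the try/except guard is added here only so the snippet runs in environments without vllm installed, and is not part of the PR):

```python
# Old import, broken on vllm versions after 0.16.0 because the module moved:
#     from vllm.renderers.protocol import BaseRenderer

# New import, from the package's public export (works on 0.16.0 and main):
try:
    from vllm.renderers import BaseRenderer
except ModuleNotFoundError:
    # Guard for this sketch only: vllm may not be installed here.
    BaseRenderer = None

print(BaseRenderer is not None)
```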

Signed-off-by: Alex Brooks <albrooks@redhat.com>
@tzhouam
Collaborator

tzhouam commented Feb 24, 2026

This change is included in branch dev/vllm-align

@tzhouam tzhouam self-requested a review February 24, 2026 03:24
@hsliuustc0106
Collaborator

@vllm-omni-reviewer

@github-actions

🤖 VLLM-Omni PR Review

Code Review

1. Overview

  • Summary: This PR updates the import path for BaseRenderer in serving_chat.py. It shifts from importing directly from the internal protocol module (vllm.renderers.protocol) to importing from the public package interface (vllm.renderers).
  • Assessment: Positive. This is a necessary maintenance fix to ensure compatibility with vLLM versions 0.16.0 and above. It adheres to better dependency management practices by relying on the public export path rather than internal module structures.

2. Code Quality

  • Code Style: The change is clean and follows standard Python import conventions.
  • Potential Bugs: The logic is sound. The primary risk is version compatibility, but the author has verified that this path exists in 0.16.0 and main.
  • Performance: No performance impact.

3. Architecture & Design

  • Integration: This change aligns VLLM-Omni with the refactored structure of the upstream vLLM project.
  • Best Practices: Importing from the package root (from vllm.renderers import BaseRenderer) is generally more robust than importing from sub-modules such as protocol. Library maintainers often treat sub-modules as internal implementation details subject to change, whereas the __init__.py exports represent the public API. This change reduces the cost of tracking upstream refactors.
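The point about package-root imports can be demonstrated with a small self-contained example (the package name renderers_demo and its layout are hypothetical, built on the fly for illustration):

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package whose __init__.py re-exports a class from a
# private submodule, mirroring how vllm.renderers exposes BaseRenderer.
with tempfile.TemporaryDirectory() as tmp:
    pkg_dir = os.path.join(tmp, "renderers_demo")
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, "_impl.py"), "w") as f:
        f.write("class BaseRenderer:\n    pass\n")
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
        # The re-export is the public API; "_impl" can be renamed freely
        # without breaking consumers who import from the package root.
        f.write("from renderers_demo._impl import BaseRenderer\n")

    sys.path.insert(0, tmp)
    try:
        pkg = importlib.import_module("renderers_demo")
        print(pkg.BaseRenderer.__name__)  # BaseRenderer
    finally:
        sys.path.remove(tmp)
```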

4. Security & Safety

  • Security: No security implications identified.
  • Resource Management: N/A.

5. Testing & Documentation

  • Test Coverage: While the diff is minimal, verification is crucial.
    • Recommendation: Ensure CI/CD pipelines are running against vLLM 0.16.0 (or a version constraint that satisfies this new import path) to prevent runtime ImportErrors.
  • Documentation: No documentation updates are required for this internal code change, though the PR description clearly explains the motivation.
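One lightweight way to implement that recommendation is an import smoke check early in CI. The helper below is a sketch (the function name is made up); note that importlib.util.find_spec imports parent packages while resolving, so a missing top-level package raises ModuleNotFoundError rather than returning None:

```python
import importlib.util

def import_path_available(dotted: str) -> bool:
    """Return True if the dotted module path resolves.

    find_spec imports parent packages while resolving, so a missing
    top-level package raises ModuleNotFoundError; treat that as absent.
    """
    try:
        return importlib.util.find_spec(dotted) is not None
    except ModuleNotFoundError:
        return False

# A CI step could check this before launching the server, failing fast
# with a clear message instead of a runtime ImportError:
if not import_path_available("vllm.renderers"):
    print("warning: vllm>=0.16.0 with vllm.renderers is required")
```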

6. Specific Suggestions

File: vllm_omni/entrypoints/openai/serving_chat.py:16

  • Dependency Constraints:
    Since this change is driven by vLLM 0.16.0 structure changes, please verify that the project's dependency constraints (e.g., in requirements.txt, setup.py, or pyproject.toml) reflect this.

    • If vllm.renderers was introduced/populated in 0.16.0, ensure the minimum version requirement is set correctly (e.g., vllm>=0.16.0).
    • If backward compatibility with older vLLM versions (pre-0.16.0) is still required, this change might break those versions. However, the PR description implies the focus is on supporting the latest versions.
  • Verification:
    Confirm that BaseRenderer is explicitly listed in vllm/renderers/__init__.py for the targeted versions. The links provided in the PR description confirm this for 0.16.0 and main, which is sufficient for a merge assuming the project tracks current vLLM releases.
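If the project pins vllm in pyproject.toml, the constraint might look like the following (a hypothetical fragment; the actual file and version bound should follow the project's dependency policy):

```toml
[project]
dependencies = [
    # vllm.renderers' public BaseRenderer export requires 0.16.0 or newer
    "vllm>=0.16.0",
]
```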

7. Approval Status

LGTM (Looks Good To Me)

The change is a straightforward fix for upstream API movement and adopts the more stable import pattern. Assuming the dependency version constraints are managed correctly (or if this project tracks the latest vLLM main/releases), this is safe to merge.


This review was generated automatically by the VLLM-Omni PR Reviewer Bot
using glm-5.

@alex-jw-brooks
Contributor Author

Thanks @tzhouam - would you prefer to add this separately in case people are actively developing on the server and running into the import error, or to close this to merge in your branch?

@tzhouam
Collaborator

tzhouam commented Feb 25, 2026

Does the vllm v0.16.0 tag include this modification? If not, maybe we need to keep the main branch unchanged for stability.
From my side, if you truly need this change, you can develop based on the dev/vllm-align branch, which will catch up with the newest vllm commits in 1-2 days.

@alex-jw-brooks
Contributor Author

Hey @tzhouam - yes, this is compatible with 0.16.0 and the current code (see the links in the PR body), so merging it should not cause any issues! Not urgent though, just thought it would be helpful since others are likely running into it and making the fix locally as well.

Contributor

@lishunyang12 lishunyang12 left a comment


LGTM

@alex-jw-brooks
Contributor Author

Closing, #1566 was merged separately
