
Conversation

@shaohuzhang1
Contributor

fix: Rearrange model dependency

@f2c-ci-robot

f2c-ci-robot bot commented Nov 6, 2025

Adding the "do-not-merge/release-note-label-needed" label because no release-note block was detected. Please follow our release note process to remove it.


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@f2c-ci-robot

f2c-ci-robot bot commented Nov 6, 2025

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@shaohuzhang1 shaohuzhang1 merged commit 1b464c4 into v2 Nov 6, 2025
3 of 5 checks passed
@shaohuzhang1 shaohuzhang1 deleted the pr@v2@fix_reranker branch November 6, 2025 06:24
from transformers import AutoModelForSequenceClassification, AutoTokenizer
self.model = model_name
self.cache_dir = cache_dir
self.model_kwargs = model_kwargs
Contributor Author


The provided code has no immediate syntax, logic, or import errors. The initialization of AutoModelForSequenceClassification and AutoTokenizer is correctly placed after the necessary libraries have been imported.

Optimization suggestion: ensure that paths used in file operations (such as reading data) are resolved correctly relative to where the script is run, otherwise they can raise FileNotFoundError.
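One way to sidestep that pitfall is to anchor relative paths to the script's own location instead of the process's working directory. A minimal sketch (the helper name `resolve_data_path` and the `data/config.json` path are hypothetical, not part of this PR):

```python
from pathlib import Path

def resolve_data_path(relative: str) -> Path:
    """Resolve a data path relative to this file rather than the current
    working directory, which changes depending on how the script is launched."""
    base_dir = Path(__file__).resolve().parent
    return base_dir / relative
```

With this, `resolve_data_path("data/config.json")` yields the same absolute path no matter which directory the script is invoked from.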

Additionally, consider handling exceptions when loading models, so that failures are not silent:

from typing import Optional

class LocalReranker(MaxKBBaseModel, BaseDocumentCompressor):
    __name__ = "LocalReranker"

    def __init__(self, model_name: str, cache_dir: Optional[str] = None, **model_kwargs):
        super().__init__()

        # Import lazily so a missing optional dependency fails with a clear error.
        try:
            from transformers import AutoModelForSequenceClassification, AutoTokenizer
        except ImportError as e:
            raise ImportError(f"Failed to import the required transformers library: {e}") from e

        self.model_name = model_name
        self.cache_dir = cache_dir if cache_dir else "/path/to/default/cache/dir"
        self.model_kwargs = model_kwargs

        # Initialize or load the model here

This modification adds basic error checking for missing dependencies and specifies a default cache path for models. Adjust the path to match your environment.
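The same guarded-import pattern can be factored into a small helper and exercised without transformers installed. A sketch, assuming a clear ImportError is the desired failure mode (the helper name `require` is hypothetical, not from the PR):

```python
import importlib

def require(module_name: str):
    """Import an optional dependency lazily, raising a descriptive error
    instead of failing silently when the module is missing."""
    try:
        return importlib.import_module(module_name)
    except ImportError as e:
        raise ImportError(
            f"Optional dependency '{module_name}' is required; "
            f"install it first (original error: {e})"
        ) from e
```

A reranker's `__init__` could then call `require("transformers")` and surface an actionable message when the library is absent.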

liuruibin pushed a commit that referenced this pull request Nov 10, 2025
