
fix: restrict transformers version to be less than 4.57.7#258

Open
Azure99 wants to merge 1 commit into runpod-workers:main from Azure99:patch-1

Conversation


@Azure99 Azure99 commented Feb 3, 2026

When building/running this worker, transformers is currently not constrained (or effectively >=4.57.0 without an upper bound), so pip may install transformers 5.x. The image pins vLLM to an older version (e.g. vllm==0.11.0 on main), and the combination is incompatible.

At runtime, vLLM fails during tokenizer initialization with:
AttributeError: TokenizersBackend has no attribute all_special_tokens_extended

This prevents the worker from starting at all.
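The effect of adding the upper bound can be sketched with a toy version check (a simplified illustration only, assuming plain X.Y.Z version strings with no pre-release or post-release segments; real pip resolution follows PEP 440):

```python
# Sketch of why an unbounded ">=4.57.0" requirement admits transformers 5.x,
# while ">=4.57.0,<4.57.7" does not. Simplified: assumes dotted integer
# versions only (no pre-releases), unlike pip's full PEP 440 semantics.

def parse(version):
    """Parse a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version, lower="4.57.0", upper=None):
    """Check a version against a lower bound and an optional upper bound."""
    v = parse(version)
    if v < parse(lower):
        return False
    return upper is None or v < parse(upper)

# Unbounded (the current state): pip may resolve to 5.x
print(satisfies("5.0.0"))                   # True
# With this PR's upper bound, 5.x and 4.57.7+ are excluded
print(satisfies("5.0.0", upper="4.57.7"))   # False
print(satisfies("4.57.3", upper="4.57.7"))  # True
```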

Copilot AI review requested due to automatic review settings February 3, 2026 15:44
Author

Azure99 commented Feb 3, 2026

@TimPietrusky could you please take a quick look? New serverless vLLM worker deployments are failing to start due to transformers resolving to 5.x and breaking compatibility with pinned vLLM.


Copilot AI left a comment


Pull request overview

This PR constrains the transformers dependency to avoid installing incompatible versions alongside the image’s pinned vllm==0.11.0, preventing worker startup failures during tokenizer initialization.

Changes:

  • Add an upper bound to the transformers requirement (<4.57.7) in the builder dependency list.


  pydantic-settings
  hf-transfer
- transformers>=4.57.0
+ transformers>=4.57.0,<4.57.7

Copilot AI Feb 3, 2026


The PR description frames the problem as pip potentially installing transformers 5.x (incompatible with pinned vllm==0.11.0), but the change pins transformers to <4.57.7, which is much narrower than needed to avoid 5.x and will block newer 4.x patch/minor releases. If the intent is only to prevent 5.x, consider using an upper bound like <5 instead; if 4.57.7+ is specifically known to break with vLLM 0.11.0, please update the PR description (or add a short comment in requirements) to document why that exact cutoff is required.

  pydantic-settings
  hf-transfer
- transformers>=4.57.0
+ transformers>=4.57.0,<4.57.7

Copilot AI Feb 3, 2026


The dependency transformers>=4.57.0,<4.57.7 in a build/runtime requirements file allows pip to automatically pull new, unreviewed 4.57.x releases within this range, which is a classic software supply-chain attack vector. If an upstream transformers release in this range is compromised, its code will execute in the builder/worker environment with access to any available secrets and the ability to affect builds or runtime behavior. To reduce this risk, pin transformers to a specific, vetted version (and ideally include hashes or vendor it) so that upgrades only occur through explicit review.
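One way to act on the exact-pin suggestion above is a hash-locked requirements file, e.g. via pip-tools. A hedged sketch (the version chosen and the digest shown are placeholders for illustration, not vetted values):

```
# requirements.in -- pin an exact, reviewed version (hypothetical choice):
transformers==4.57.2

# Generate a hash-locked requirements.txt with pip-tools:
#   pip-compile --generate-hashes requirements.in
#
# which produces entries of the form (digest shown as a placeholder):
#   transformers==4.57.2 \
#       --hash=sha256:<placeholder>
```

With `--require-hashes` installs, pip then refuses any artifact whose digest does not match, so upgrades only happen through an explicit regeneration and review of the lock file.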

@ecalifornica

Bump @TimPietrusky



3 participants