
Update dependency transformers to v4.57.6 #158

Open

konflux-internal-p02[bot] wants to merge 1 commit into rhoai-3.3 from konflux/mintmaker/rhoai-3.3/transformers-4.x

Conversation

konflux-internal-p02[bot] commented on Dec 24, 2025

This PR contains the following updates:

| Package      | Change                 | Age         | Confidence         |
|--------------|------------------------|-------------|--------------------|
| transformers | ==4.57.1 -> ==4.57.6   | (age badge) | (confidence badge) |
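
As a quick sanity check once this PR is merged and the environment is rebuilt, the pinned version can be verified at runtime. This is a minimal sketch, assuming it runs inside the updated environment; it is not part of the PR itself.

```python
# Confirm the environment picked up the pinned release proposed by this update.
import transformers

assert transformers.__version__ == "4.57.6", transformers.__version__
print("transformers", transformers.__version__)
```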

Release Notes

huggingface/transformers (transformers)

v4.57.6: Patch release v4.57.6

Compare Source

What's Changed

Another fix for Qwen VL models: a bug prevented the associated model type from being loaded correctly. This works together with #41808 from the previous patch release.

  • Fixed incorrect model_type for qwen2vl and qwen2.5vl when the config is saved and loaded again, by @i3hz in #41758
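
For context, the round-trip below is the kind of operation the fix targets: saving a Qwen2-VL config and reloading it should preserve model_type. This is a minimal sketch; the model id and temporary path are assumptions for illustration, not taken from the PR.

```python
# Sketch of the config save/reload round-trip that v4.57.6 fixes for Qwen VL models.
import tempfile

from transformers import AutoConfig

config = AutoConfig.from_pretrained("Qwen/Qwen2-VL-7B-Instruct")  # assumed model id
print(config.model_type)  # expected: "qwen2_vl"

with tempfile.TemporaryDirectory() as tmp:
    config.save_pretrained(tmp)
    reloaded = AutoConfig.from_pretrained(tmp)
    # Before the fix, the reloaded model_type could come back wrong; after it, both match.
    print(reloaded.model_type)
```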

Full Changelog: huggingface/transformers@v4.57.5...v4.57.6

v4.57.5: Patch release v4.57.5

Compare Source

What's Changed

We should not have called the previous release the last patch 😉 These should be the last remaining fixes that got lost between patches and the transition to v5.

Full Changelog: huggingface/transformers@v4.57.4...v4.57.5

v4.57.4: Patch release v4.57.4

Compare Source

What's Changed

Last patch release for v4: a few small fixes for remote generation methods (e.g. group beam search), vLLM, and an offline tokenizer fix (when the tokenizer has already been cached).
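
The patch concerns generation methods that are now distributed remotely (such as group beam search) rather than the standard generate call itself. The sketch below only shows the ordinary beam-search entry point those remote strategies extend; the model id is an assumption for illustration.

```python
# Sketch of the generate() entry point; the 4.57.4 fixes touch remote generation
# methods (e.g. group beam search) that plug into this same API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # assumed small model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The dependency update", return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```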


Full Changelog: huggingface/transformers@v4.57.3...v4.57.4

v4.57.3: Patch release v4.57.3

Compare Source

There was a hidden bug when loading models with local_files_only=True, as well as a typo related to the recent patch.

The main fix is: b605555.

We are really sorry that this slipped through; our CIs just did not catch it.

As it affects a lot of users, we are going to yank the previous release.
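
The affected code path is offline loading from the local cache. The sketch below shows that path under the assumption that the checkpoint has already been downloaded; the model id is an assumption for illustration.

```python
# Sketch of the loading path the v4.57.3 bug affected: loading an already-cached
# checkpoint with local_files_only=True, so no network access is attempted.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # assumed; any previously downloaded checkpoint works

tokenizer = AutoTokenizer.from_pretrained(model_id, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(model_id, local_files_only=True)
```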

v4.57.2: Patch Release v4.57.2

Compare Source

This patch most notably fixes an issue with some Mistral tokenizers. It contains the following commits:

  • Add AutoTokenizer mapping for mistral3 and ministral (#42198)
  • Auto convert tekken.json (#42299)
  • fix tekken pattern matching (#42363)
  • Check model inputs - hidden states (#40994)
  • Remove invalid @staticmethod from module-level get_device_and_memory_breakdown (#41747)
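
The tokenizer fixes above affect the AutoTokenizer path for Mistral-family checkpoints (the mistral3/ministral mapping and tekken.json conversion). The sketch below shows that path; the model id is an assumption for illustration and is a gated repository, so it requires accepted access on the Hub.

```python
# Sketch of the AutoTokenizer path the v4.57.2 Mistral tokenizer fixes touch.
from transformers import AutoTokenizer

model_id = "mistralai/Ministral-8B-Instruct-2410"  # assumed example checkpoint (gated)
tokenizer = AutoTokenizer.from_pretrained(model_id)
print(tokenizer("Bonjour le monde").input_ids)
```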

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever the PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

To execute skipped test pipelines, write the comment /ok-to-test.


Documentation

Find out how to configure dependency updates in MintMaker documentation or see all available configuration options in Renovate documentation.

konflux-internal-p02[bot] force-pushed the konflux/mintmaker/rhoai-3.3/transformers-4.x branch from 4760fd6 to cb7613e on January 13, 2026, 16:34
konflux-internal-p02[bot] changed the title from "Update dependency transformers to v4.57.3" to "Update dependency transformers to v4.57.4" on Jan 13, 2026
konflux-internal-p02[bot] force-pushed the konflux/mintmaker/rhoai-3.3/transformers-4.x branch from cb7613e to d7b749e on January 13, 2026, 20:39
konflux-internal-p02[bot] changed the title from "Update dependency transformers to v4.57.4" to "Update dependency transformers to v4.57.5" on Jan 13, 2026
Signed-off-by: konflux-internal-p02 <170854209+konflux-internal-p02[bot]@users.noreply.github.com>
konflux-internal-p02[bot] force-pushed the konflux/mintmaker/rhoai-3.3/transformers-4.x branch from d7b749e to 2fb4c81 on January 16, 2026, 16:33
konflux-internal-p02[bot] changed the title from "Update dependency transformers to v4.57.5" to "Update dependency transformers to v4.57.6" on Jan 16, 2026