[New Model] support new model ovis2.6 #34426
myselvess wants to merge 2 commits into vllm-project:main from
Conversation
Signed-off-by: myselvess <[email protected]>
Documentation preview: https://vllm--34426.org.readthedocs.build/en/34426/
Code Review
This pull request introduces support for the new ovis2.6 model by adapting and reusing the existing ovis2.5 implementation. It also includes two bug fixes: one in Ovis2_5MultiModalProcessor to correctly handle prompt_token_ids, and another in Siglip2VisionTransformer where an unnecessary post_layernorm is removed. The changes are well-structured. My review includes one suggestion to replace a magic number with a constant-derived expression for better code clarity and maintainability.
Signed-off-by: myselvess <[email protected]>
Isotr0py left a comment
Hmmm, I'm a bit worried that these changes to support Ovis2.6 will also break existing Ovis2.5 support...
| "Ovis2_6ForCausalLM": ("ovis2_5", "Ovis2_5"), | ||
| "Ovis2_6_MoeForCausalLM": ("ovis2_5", "Ovis2_5"), |
Can you also update tests/models/registry.py? If Ovis-2.6-2B is unavailable yet, we can set is_available_online=False for it.
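For reference, a hypothetical sketch of what such a test-registry entry could look like; the _HfExamplesInfo usage and the checkpoint ID are assumptions, while is_available_online=False is the flag mentioned above for checkpoints that are not yet published:

```python
# Hypothetical sketch of a tests/models/registry.py entry. The checkpoint ID
# and the _HfExamplesInfo usage are assumptions; is_available_online=False
# keeps CI from trying to download a checkpoint that is not yet on the Hub.
"Ovis2_6ForCausalLM": _HfExamplesInfo(
    "AIDC-AI/Ovis2.6-2B",       # placeholder checkpoint ID
    is_available_online=False,  # checkpoint not yet released
),
```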
vte_vocab_size - len(INDICATOR_IDS) + abs(x + 300) - 1
vte_vocab_size - len(INDICATOR_IDS) + (x - INDICATOR_IDS[0])
for x in visual_indicators
if x < -300
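To make the concern below concrete, here is a small self-contained comparison of the old and new index expressions; the INDICATOR_IDS values and vte_vocab_size are purely assumed for illustration and are not the actual Ovis constants:

```python
# Illustration only: INDICATOR_IDS and vte_vocab_size here are assumed values,
# not the real Ovis constants. The loop simply evaluates the old and the new
# index expressions side by side for each indicator token ID.
INDICATOR_IDS = [-304, -303, -302, -301]  # assumed ascending ordering
vte_vocab_size = 65536                    # assumed embedding-table size

for x in INDICATOR_IDS:
    if x < -300:
        old_idx = vte_vocab_size - len(INDICATOR_IDS) + abs(x + 300) - 1
        new_idx = vte_vocab_size - len(INDICATOR_IDS) + (x - INDICATOR_IDS[0])
        print(f"indicator {x}: old -> {old_idx}, new -> {new_idx}")
```

With these assumed constants both expressions land in the last len(INDICATOR_IDS) rows of the visual-token-embedding table, but they assign those rows in opposite order; whether either ordering is correct for a given checkpoint depends on the actual indicator IDs and tokenizer, which is what the question below is probing.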
Will Ovis2.5's tokenizer also be updated? I'm a bit worried that this change will break Ovis2.5 support, especially since Ovis2.5 has various backbones with different tokenizers.
Purpose
ps: Ovis2.6-2B is coming soon
Test Plan
Test Result
Essential Elements of an Effective PR Description Checklist
supported_models.md and examples for a new model.