Vbert #339
Conversation
Force-pushed from e77142f to 5c11cd3.
Mostly comments about the form; overall LGTM!
-    ColQwen2_5Omni,
-    ColQwen2_5OmniProcessor,
+    # ColQwen2_5Omni,
+    # ColQwen2_5OmniProcessor,
Add a comment to the README if ColQwen 2.5 Omni is no longer supported.
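Since downstream scripts may still try to import the Omni classes shown in the diff above, a defensive import along these lines can keep them working until the README is updated; this is a generic sketch of the pattern, not code from the PR:

```python
# Optional import: ColQwen 2.5 Omni may not be exported by this version of colpali_engine.
try:
    from colpali_engine.models import ColQwen2_5Omni, ColQwen2_5OmniProcessor
except ImportError:
    ColQwen2_5Omni = None
    ColQwen2_5OmniProcessor = None

if ColQwen2_5Omni is None:
    print("ColQwen 2.5 Omni is not available in this colpali_engine build.")
```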
Could rename variables for more clarity, use constants, and add documentation.
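As an illustration of the kind of cleanup this asks for (the function, names, and constant below are hypothetical, not code from the PR):

```python
# Before: terse names, a magic number, no documentation.
def proc(x, n=32):
    return x[:n]

# After: a named constant, descriptive names, and a docstring.
DEFAULT_MAX_SEQUENCE_LENGTH = 32  # hypothetical default, for illustration only

def truncate_sequence(tokens, max_length=DEFAULT_MAX_SEQUENCE_LENGTH):
    """Return at most `max_length` tokens from `tokens`."""
    return tokens[:max_length]
```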
Save as test_bi_losses.
* modeling
* update modeling
* update token id default
* init files
* remove vllama + update torch lower bound for cpu
* back to normal transformer bound
* clean
* Update colpali_engine/models/__init__.py

Co-authored-by: QuentinJGMace <[email protected]>
Still to do:
Modify all the negatives losses, not just the ones used for vbert.
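For context on this item, the sketch below shows the general shape of a bi-encoder loss that mixes in-batch and explicit negatives; it is a generic InfoNCE-style example under assumed tensor shapes, not the loss implemented in colpali_engine:

```python
import torch
import torch.nn.functional as F

def bi_encoder_loss_with_negatives(query_emb, pos_emb, neg_emb, temperature=0.02):
    """Generic contrastive loss combining in-batch and explicit hard negatives.

    Assumed shapes (illustrative only):
        query_emb: (batch, dim)     L2-normalized query embeddings
        pos_emb:   (batch, dim)     matching document embeddings
        neg_emb:   (batch, k, dim)  k mined hard negatives per query
    """
    # Scores against every in-batch document: (batch, batch).
    in_batch_scores = query_emb @ pos_emb.T
    # Scores against each query's own hard negatives: (batch, k).
    hard_neg_scores = torch.einsum("bd,bkd->bk", query_emb, neg_emb)
    # Column i < batch is an in-batch document; the remaining columns are hard negatives.
    logits = torch.cat([in_batch_scores, hard_neg_scores], dim=1) / temperature
    # The positive for query i is document i (the diagonal of in_batch_scores).
    targets = torch.arange(query_emb.size(0), device=query_emb.device)
    return F.cross_entropy(logits, targets)
```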