Update model-release-checklist.md #1671
```diff
@@ -7,9 +7,12 @@ The Hugging Face Hub is the go-to platform for sharing machine learning models.
-### Uploading weights
+### Uploading Weights
+
+When uploading models to the hub, it's recommended to follow a set of best practices:
+
+- **Use separate repositories for different model weights.** For example, you can store quantization variants for the same model in a single repository, but use separate repositories for different model weights.
```
|
**Member:** This goes a bit against what we preach, no? Each quant, precision, and weight type should go in a new repo? (the only exception is

**Contributor (Author):** Mhh, no, we want a single repo with all the quants; it really makes more sense imo. cc @julien-c for final decision.

**Member:** Hmm, I see most GGUF repos put different quants in the same repo, yes (and we have a nice UI feature for this). I'm not sure we ever actually advocated for different quants of the same model to be in ≠ repos, no? We did push for ≠ repos for GGUF vs. other formats (PyTorch, etc.).

**Member:** Not sure; I'd consider GGUFs an exception here, and more specifically GGUFs produced by model quantisers. For everything else we recommend having one quant per repo (look at the MLX community, https://huggingface.co/mlx-community, for example) or MLC (this is also the reason MLX-my-repo and gguf-my-repo all create one quant per repo only). Even for model releases we try to have one quant per repo:

There are a lot more examples of this across model releases, too. Something we actively want to discourage is mixed repos, with various quants and PyTorch models together; the lines can get quite blurry there.

**Member:** Yep. That being said, I think the wording in this PR is fine.

**Member:** Okay! Don't want to be a blocker here; we can always reword if people get confused.
```diff
+- **Prefer [`safetensors`](https://huggingface.co/docs/safetensors/en/index) over `pickle` for weight serialization.** `safetensors` offers improved safety and performance compared to Python's `pickle`.
-- push weights to separate model repositories. Example: prefer uploading individual quantizations/precisions in a standalone repo like [this](https://huggingface.co/jameslahm/yolov10n) over all types/versions in one like [this](https://huggingface.co/kadirnar/Yolov10/tree/main).
```
**Contributor (Author):** Not sure this is true (or I misunderstood); tried to simplify.
```diff
-- leverage [safetensors](https://huggingface.co/docs/safetensors/en/index) for weights serialization as opposed to pickle.

 ### Writing a Comprehensive Model Card
```
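The safety claim in the `safetensors` bullet is worth making concrete. The sketch below uses only the standard library (no Hugging Face code; the `Malicious` class is purely illustrative) to show why loading `pickle`-serialized weights from an untrusted source is dangerous: pickle invokes a callable chosen by the serialized object during deserialization, so merely loading a file can execute arbitrary code.

```python
import pickle

# pickle lets any object customize its serialization via __reduce__,
# which returns (callable, args). On load, pickle CALLS that callable.
# A booby-trapped "weights" file can therefore run arbitrary code the
# moment it is unpickled.
class Malicious:  # illustrative stand-in for a malicious checkpoint
    def __reduce__(self):
        # Harmless here: we make pickle call print() during load.
        # An attacker could return (os.system, ("...",)) instead.
        return (print, ("arbitrary code ran during unpickling!",))

payload = pickle.dumps(Malicious())
pickle.loads(payload)  # the message prints: code ran before any type check
```

By contrast, the `safetensors` format is just a length-prefixed JSON header followed by raw tensor bytes, so loading it parses data rather than executing attacker-controlled code.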