
Commit dbf3ce0

Authored by julien-c, SBrandeis, Vaibhavs10 and Pierrci
storage-limits (#1515)
* storage-limits
* mini tweak
* Additional links + details (#1516)
* add some details about billing of payg storage
* Apply suggestions from code review
* up - grant private repo
* Update docs/hub/storage-limits.md
* Update docs/hub/storage-limits.md
* Update docs/hub/billing.md
* Update docs/hub/storage-limits.md
* up - clarify best-effort.

Co-authored-by: Julien Chaumond <[email protected]>
Co-authored-by: Pierric Cistac <[email protected]>
Co-authored-by: vb <[email protected]>
Co-authored-by: Simon Brandeis <[email protected]>
Co-authored-by: Vaibhav Srivastav <[email protected]>
1 parent 93f21b6 commit dbf3ce0

File tree

8 files changed: +59, -14 lines

docs/hub/_redirects.yml

Lines changed: 1 addition & 0 deletions
@@ -17,3 +17,4 @@ searching-the-hub: /docs/huggingface_hub/searching-the-hub
 api-webhook: webhooks
 adapter-transformers: adapters
 security-two-fa: security-2fa
+repositories-recommendations: storage-limits

docs/hub/_toctree.yml

Lines changed: 2 additions & 2 deletions
@@ -25,8 +25,8 @@
   title: "How-to: Create automatic metadata quality reports"
 - local: notebooks
   title: Notebooks
-- local: repositories-recommendations
-  title: Repository size recommendations
+- local: storage-limits
+  title: Storage Limits
 - local: repositories-next-steps
   title: Next Steps
 - local: repositories-licenses

docs/hub/billing.md

Lines changed: 9 additions & 2 deletions
@@ -1,6 +1,6 @@
 # Billing

-At Hugging Face, we build a collaboration platform for the ML community (i.e., the Hub) and monetize by providing simple access to compute for AI.
+At Hugging Face, we build a collaboration platform for the ML community (i.e., the Hub) and monetize by providing advanced features and simple access to compute for AI.

 Any feedback or support request related to billing is welcome at [email protected]

@@ -61,13 +61,14 @@ You can view invoices and receipts for the last 3 months in your billing dashboa

 ## Enterprise Hub subscriptions

-We offer advanced security and compliance features for organizations through our Enterprise Hub subscription, including [Single Sign-On](./enterprise-sso.md), [Advanced Access Control](./enterprise-hub-resource-groups.md) for repositories, control over your data location, and more.
+We offer advanced security and compliance features for organizations through our Enterprise Hub subscription, including [Single Sign-On](./enterprise-sso.md), [Advanced Access Control](./enterprise-hub-resource-groups.md) for repositories, control over your data location, higher [storage capacity](./storage-limits.md) for private repositories, and more.

 The Enterprise Hub is billed like a typical subscription. It renews automatically, but you can choose to cancel it at any time in the organization's billing settings.

 You can pay for the Enterprise Hub subscription with a credit card or your AWS account.

 Upon renewal, the number of seats in your Enterprise Hub subscription will be updated to match the number of members of your organization.
+Private repository storage above the [included storage](./storage-limits.md) will be billed along with your subscription renewal.


 <div class="flex justify-center">
@@ -80,6 +81,7 @@ Upon renewal, the number of seats in your Enterprise Hub subscription will be up
 The PRO subscription unlocks additional features for users, including:

 - Higher free tier for the Serverless Inference API and when consuming ZeroGPU Spaces
+- Higher [storage capacity](./storage-limits.md) for private repositories
 - Ability to create ZeroGPU Spaces and use Dev Mode
 - Ability to write Social Posts and Community Blogs
 - Leverage the Dataset Viewer on private datasets
@@ -89,5 +91,10 @@ View the full list of benefits at https://huggingface.co/subscribe/pro
 Similarly to the Enterprise Hub subscription, PRO subscriptions are billed like a typical subscription. The subscription renews automatically for you. You can choose to cancel the subscription at anytime in your billing settings: https://huggingface.co/settings/billing

 You can only pay for the PRO subscription with a credit card. The subscription is billed separately from any pay-as-you-go compute usage.
+Private repository storage above the [included storage](./storage-limits.md) will be billed along with your subscription renewal.

 Note: PRO benefits are also included in the Enterprise Hub subscription.
+
+## Pay-as-you-go private storage
+
+Above the included 1TB (or 1TB per seat) of private storage in PRO and Enterprise Hub, private storage is invoiced at **$25/TB/month**, in 1TB increments.
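As an illustration of how the 1TB-increment billing described above works out, here is a minimal sketch. The helper name and structure are hypothetical, not an official Hugging Face calculator:

```python
import math

INCLUDED_TB = 1      # PRO: 1TB; Enterprise Hub: 1TB per seat
PRICE_PER_TB = 25    # $25/TB/month, billed in whole 1TB increments


def monthly_private_storage_cost(used_tb: float, seats: int = 1) -> int:
    """Estimate the monthly pay-as-you-go charge (in USD) for private storage.

    `used_tb` is total private storage used, in TB. `seats` is 1 for a PRO
    user, or the seat count for an Enterprise Hub organization.
    """
    included = INCLUDED_TB * seats
    overage = max(0.0, used_tb - included)
    # Overage is rounded up to the next whole TB before pricing.
    return math.ceil(overage) * PRICE_PER_TB
```

For example, a PRO user with 3.2TB of private storage would be billed for 3TB of overage ($75/month), while a 40-seat Enterprise Hub organization using 45TB would be billed for 5TB ($125/month).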

docs/hub/datasets-adding.md

Lines changed: 1 addition & 1 deletion
@@ -111,4 +111,4 @@ The Hugging Face Hub supports large scale datasets, usually uploaded in Parquet

 You can upload large scale datasets at high speed using the `huggingface_hub` library.

-See [how to upload a folder by chunks](/docs/huggingface_hub/guides/upload#upload-a-folder-by-chunks), the [tips and tricks for large uploads](/docs/huggingface_hub/guides/upload#tips-and-tricks-for-large-uploads) and the [repository limitations and recommendations](./repositories-recommendations).
+See [how to upload a folder by chunks](/docs/huggingface_hub/guides/upload#upload-a-folder-by-chunks), the [tips and tricks for large uploads](/docs/huggingface_hub/guides/upload#tips-and-tricks-for-large-uploads) and the [repository storage limits and recommendations](./storage-limits).
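One recurring tip for large uploads is to merge many small JSON files into a single JSONL file before pushing, which keeps the repo's file count low. A minimal sketch of that preprocessing step (the helper name and paths are hypothetical):

```python
import json
from pathlib import Path


def merge_json_to_jsonl(src_dir: str, out_file: str) -> int:
    """Merge every `.json` file under `src_dir` into one `.jsonl` file.

    Returns the number of records written. Fewer, larger files keep a
    repo well within the Hub's file-count recommendations.
    """
    paths = sorted(Path(src_dir).glob("*.json"))
    with open(out_file, "w", encoding="utf-8") as out:
        for p in paths:
            record = json.loads(p.read_text(encoding="utf-8"))
            out.write(json.dumps(record, ensure_ascii=False) + "\n")
    return len(paths)
```

The merged file can then be uploaded in one go instead of thousands of individual commits of tiny files.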

docs/hub/enterprise-hub.md

Lines changed: 2 additions & 0 deletions
@@ -23,3 +23,5 @@ In this section we will document the following Enterprise Hub features:
 - [Tokens Management](./enterprise-hub-tokens-management)
 - [Analytics](./enterprise-hub-analytics)
 - [Network Security](./enterprise-hub-network-security)
+
+Finally, Enterprise Hub includes 1TB of [private repository storage](./storage-limits) per seat in the subscription, i.e. if your organization has 40 members, then you have 40TB included storage for your private models and datasets.

docs/hub/other.md

Lines changed: 1 addition & 0 deletions
@@ -6,6 +6,7 @@
 - [Managing Organizations](./organizations-managing)
 - [Organization Cards](./organizations-cards)
 - [Access control in organizations](./organizations-security)
+- [Enterprise Hub](./enterprise-hub)
 - [Moderation](./moderation)
 - [Billing](./billing)
 - [Digital Object Identifier (DOI)](./doi)

docs/hub/repositories.md

Lines changed: 1 addition & 1 deletion
@@ -13,6 +13,6 @@ In these pages, you will go over the basics of getting started with Git and inte
 - [Webhooks](./webhooks)
 - [Notifications](./notifications)
 - [Collections](./collections)
-- [Repository size recommendations](./repositories-recommendations)
+- [Repository storage limits](./storage-limits)
 - [Next Steps](./repositories-next-steps)
 - [Licenses](./repositories-licenses)

docs/hub/repositories-recommendations.md renamed to docs/hub/storage-limits.md

Lines changed: 42 additions & 8 deletions
@@ -1,9 +1,33 @@
-# Repository limitations and recommendations
+# Storage limits

-There are some limitations to be aware of when dealing with a large amount of data in your repo. Given the time it takes to stream the data,
-getting an upload/push to fail at the end of the process or encountering a degraded experience, be it on hf.co or when working locally, can be very annoying.
+At Hugging Face our intent is to provide the AI community with **free storage space for public repositories**. We do bill for storage space for **private repositories**, above a free tier (see table below).

-## Recommendations
+We [optimize our infrastructure](https://huggingface.co/blog/xethub-joins-hf) continuously to [scale our storage](https://x.com/julien_c/status/1821540661973160339) for the coming years of growth in Machine Learning.
+
+We do have mitigations in place to prevent abuse of free public storage, and in general we ask users and organizations to make sure any uploaded large model or dataset is **as useful to the community as possible** (as represented by numbers of likes or downloads, for instance).
+
+## Storage plans
+
+| Type of account  | Public storage  | Private storage              |
+| ---------------- | --------------- | ---------------------------- |
+| Free user or org | Best-effort* 🙏 | 100GB                        |
+| PRO              | Unlimited ✅    | 1TB + pay-as-you-go          |
+| Enterprise Hub   | Unlimited ✅    | 1TB per seat + pay-as-you-go |
+
+💡 Enterprise Hub includes 1TB of private storage per seat in the subscription: for example, if your organization has 40 members, then you have 40TB of included private storage.
+
+*We aim to continue providing the AI community with free storage space for public repositories; please don't abuse it by uploading dozens of TBs of generated anime. If possible, we still ask that you consider upgrading to PRO and/or Enterprise Hub whenever possible.
+
+### Pay-as-you-go price
+
+Above the included 1TB (or 1TB per seat) of private storage in PRO and Enterprise Hub, private storage is invoiced at **$25/TB/month**, in 1TB increments. See our [billing doc](./billing) for more details.
+
+## Repository limitations and recommendations
+
+In parallel to storage limits at the account (user or organization) level, there are some limitations to be aware of when dealing with a large amount of data in a specific repo. Given the time it takes to stream the data,
+getting an upload/push to fail at the end of the process or encountering a degraded experience, be it on hf.co or when working locally, can be very annoying. In the following section, we describe our recommendations on how to best structure your large repos.
+
+### Recommendations

 We gathered a list of tips and recommendations for structuring your repo. If you are looking for more practical tips, check out [this guide](https://huggingface.co/docs/huggingface_hub/main/en/guides/upload#tips-and-tricks-for-large-uploads) on how to upload large amount of data using the Python library.

@@ -21,7 +45,7 @@ _* Not relevant when using `git` CLI directly_

 Please read the next section to understand better those limits and how to deal with them.

-## Explanations
+### Explanations

 What are we talking about when we say "large uploads", and what are their associated limitations? Large uploads can be
 very diverse, from repositories with a few huge files (e.g. model weights) to repositories with thousands of small files
@@ -31,9 +55,9 @@ Under the hood, the Hub uses Git to version the data, which has structural impli
 If your repo is crossing some of the numbers mentioned in the previous section, **we strongly encourage you to check out [`git-sizer`](https://github.com/github/git-sizer)**,
 which has very detailed documentation about the different factors that will impact your experience. Here is a TL;DR of factors to consider:

-- **Repository size**: The total size of the data you're planning to upload. We generally support repositories up to 300GB. If you would like to upload more than 300 GBs (or even TBs) of data, you will need to ask us to grant more storage. To do that, please send an email with details of your project to [email protected].
+- **Repository size**: The total size of the data you're planning to upload. We generally support repositories up to 300GB. If you would like to upload more than 300 GBs (or even TBs) of data, you will need to ask us to grant more storage. To do that, please send an email with details of your project to [email protected] (for datasets) or [email protected] (for models).
 - **Number of files**:
-    - For optimal experience, we recommend keeping the total number of files under 100k. Try merging the data into fewer files if you have more.
+    - For optimal experience, we recommend keeping the total number of files under 100k, and ideally much less. Try merging the data into fewer files if you have more.
     For example, json files can be merged into a single jsonl file, or large datasets can be exported as Parquet files or in [WebDataset](https://github.com/webdataset/webdataset) format.
     - The maximum number of files per folder cannot exceed 10k files per folder. A simple solution is to
     create a repository structure that uses subdirectories. For example, a repo with 1k folders from `000/` to `999/`, each containing at most 1000 files, is already enough.
@@ -57,7 +81,7 @@ happen (in rare cases) that even if the timeout is raised client-side, the proce
 completed server-side. This can be checked manually by browsing the repo on the Hub. To prevent this timeout, we recommend
 adding around 50-100 files per commit.

-## Sharing large datasets on the Hub
+### Sharing large datasets on the Hub

 One key way Hugging Face supports the machine learning ecosystem is by hosting datasets on the Hub, including very large ones. However, if your dataset is bigger than 300GB, you will need to ask us to grant more storage.

@@ -78,3 +102,13 @@ For hosting large datasets on the Hub, we require the following for your dataset
 - Avoid the use of custom loading scripts when using datasets. In our experience, datasets that require custom code to use often end up with limited reuse.

 Please get in touch with us if any of these requirements are difficult for you to meet because of the type of data or domain you are working in.
+
+### Sharing large volumes of models on the Hub
+
+Similarly to datasets, if you host models bigger than 300GB or if you plan on uploading a large number of smaller sized models (for instance, hundreds of automated quants) totalling more than 1TB, you will need to ask us to grant more storage.
+
+To ensure we can effectively support the open-source ecosystem, please send an email with details of your project to [email protected].
+
+### Grants for private repositories
+
+If you need more model/dataset storage than your allocated private storage for academic/research purposes, please reach out to us at [email protected] or [email protected] along with a proposal of how you will use the storage grant.
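The `000/`–`999/` subdirectory recommendation in the diff above can be sketched as a small helper that maps a flat file index to a sharded path, so no folder exceeds the 10k-files-per-folder limit. Names, file extension, and the 1000-files-per-folder choice are illustrative assumptions:

```python
from pathlib import Path


def shard_path(index: int, files_per_folder: int = 1000) -> Path:
    """Map a flat file index to a `000/`-`999/` style subdirectory layout.

    With 1000 files per folder, indices 0..999999 land in folders
    `000/` through `999/`, each holding at most 1000 files.
    """
    folder, slot = divmod(index, files_per_folder)
    return Path(f"{folder:03d}") / f"{slot:05d}.parquet"
```

For example, `shard_path(1234)` yields `001/00234.parquet`, keeping every folder comfortably under the per-folder limit.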
