Commit fe1d965: Link to new guides (#385)

* Link to new guides
* Fix guide title

1 parent 61904f4

6 files changed: +129 additions, -234 deletions

docs/hub/_sections.yml

Lines changed: 10 additions & 1 deletion
```diff
@@ -20,7 +20,16 @@
   title: Spaces documentation (beta)
 
 - local: adding-a-library
-  title: Integrating a library to the Hub
+  title: Integrate a library with the Hub
+
+- local: how-to-downstream
+  title: How to download files from the Hub
+
+- local: how-to-upstream
+  title: How to create repositories and upload files to the Hub
+
+- local: how-to-inference
+  title: How to set up Inference API with your library
 
 - local: adding-a-task
   title: Adding a new task to the Hub
```

docs/hub/adding-a-library.md

Lines changed: 103 additions & 233 deletions
Large diffs are not rendered by default.

docs/hub/how-to-downstream.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -1,3 +1,7 @@
+---
+title: How to download files from the Hub
+---
+
 # How to integrate downstream utilities in your library
 
 Utilities that allow your library to download files from the Hub are referred to as *downstream* utilities. This guide introduces additional downstream utilities you can integrate with your library, or use separately on their own. You will learn how to:
```
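The downstream utilities described in that guide are ultimately built on the Hub's public `resolve` endpoint for raw files. As a minimal sketch of that URL scheme (the repo id and filename below are placeholder examples; in practice `huggingface_hub` exposes helpers that construct and download these URLs for you):

```python
# Minimal sketch of the Hub's "resolve" URL scheme that downstream
# download utilities rely on. The repo id, filename, and revision are
# placeholder examples, not values taken from this commit.

def hub_file_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Build the direct download URL for a file in a Hub repository."""
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

print(hub_file_url("bert-base-uncased", "config.json"))
# https://huggingface.co/bert-base-uncased/resolve/main/config.json
```

Building the URL separately from fetching it is what lets a library add caching or offline checks before any network traffic happens.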

docs/hub/how-to-inference.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -1,3 +1,7 @@
+---
+title: How to set up Inference API with your library
+---
+
 # How to integrate the Inference API in your library
 
 The Inference API provides fast inference for your hosted models. The Inference API can be accessed via usual HTTP requests with your favorite programming languages, but the `huggingface_hub` library has a client wrapper to access the Inference API programmatically. This guide will show you how to make calls to the Inference API from your library. For more detailed information, refer to the [Inference API documentation](https://api-inference.huggingface.co/docs/python/html/index.html).
```
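The "usual HTTP requests" mentioned in that guide boil down to a POST of JSON inputs to a per-model endpoint with a bearer token. A hedged standard-library sketch that prepares (but does not send) such a request; the model id and token are placeholders, and `huggingface_hub`'s client wrapper handles this for you in practice:

```python
import json
import urllib.request

# Sketch of the raw HTTP call shape the Inference API expects: POST the
# inputs as JSON to the model endpoint with an Authorization header.
# "distilbert-base-uncased" and "hf_xxx" below are placeholders.

def build_inference_request(model_id: str, inputs: str, token: str) -> urllib.request.Request:
    """Prepare (but do not send) a POST request to the Inference API."""
    return urllib.request.Request(
        f"https://api-inference.huggingface.co/models/{model_id}",
        data=json.dumps({"inputs": inputs}).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

req = build_inference_request("distilbert-base-uncased", "The goal of life is [MASK].", "hf_xxx")
print(req.full_url)
# https://api-inference.huggingface.co/models/distilbert-base-uncased
```

Sending the prepared request (e.g. with `urllib.request.urlopen`) requires a valid token, which is why the sketch stops at request construction.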

docs/hub/how-to-upstream.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -1,3 +1,7 @@
+---
+title: How to create repositories and upload files to the Hub
+---
+
 # How to integrate upstream utilities in your library
 
 *Upstream* utilities allow you to publish files to the Hub from your library. This guide will show you how to:
```
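Publishing starts with creating a repository, which the Hub exposes as an authenticated JSON endpoint (`https://huggingface.co/api/repos/create`, wrapped by `huggingface_hub`'s `HfApi.create_repo`). A hedged sketch that only builds the request body, since actually sending it needs an auth token; the repo name and the exact set of accepted fields here are assumptions for illustration:

```python
import json

# Sketch of a JSON body for the Hub's repo-creation endpoint. The repo
# name "my-test-model" is a placeholder, and the field names ("name",
# "private") are assumptions for illustration; in practice
# huggingface_hub's HfApi.create_repo wraps this call for you.

def build_create_repo_payload(name: str, private: bool = False) -> bytes:
    """Serialize the JSON body for a create-repository request."""
    return json.dumps({"name": name, "private": private}).encode("utf-8")

payload = build_create_repo_payload("my-test-model")
print(payload.decode("utf-8"))
```

Once the repository exists, the upstream guide's remaining steps cover getting files into it.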

docs/hub/tutorial-add-library.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -1,3 +1,7 @@
+---
+title: Integrate a library with the Hub
+---
+
 # Integrate your library with the Hub
 
 The Hugging Face Hub aims to facilitate the sharing of machine learning models, checkpoints, and artifacts. This endeavor includes integrating the Hub into many of the amazing third-party libraries in the community. Some of the ones already integrated include [spaCy](https://spacy.io/usage/projects#huggingface_hub), [AllenNLP](https://allennlp.org/), and [timm](https://rwightman.github.io/pytorch-image-models/), among many others. Integration means users can download and upload files to the Hub directly from your library. We hope you will integrate your library and join us in democratizing artificial intelligence for everyone!
```
