
Commit 6ea0570

SBrandeis committed
Changes from code review
Co-authored-by: Wauplin <[email protected]>
Co-authored-by: hanouticelina <[email protected]>
1 parent d2e0978 commit 6ea0570

File tree

1 file changed: +9 -11 lines changed

inference-providers-featherless.md

Lines changed: 9 additions & 11 deletions
@@ -1,5 +1,5 @@
 ---
-title: "Featherless on Hugging Face Inference Providers 🔥"
+title: "Featherless AI on Hugging Face Inference Providers 🔥"
 thumbnail: /blog/assets/inference-providers-featherless/thumbnail.png
 authors:
 - user: wxgeorge
@@ -20,18 +20,18 @@ authors:
 <!-- ![banner image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers/featherless-banner.png) -->
 <!-- TODO: add a banner -->
 
-# Featherless on Hugging Face Inference Providers 🔥
+# Featherless AI on Hugging Face Inference Providers 🔥
 
-We're thrilled to share that **Featherless** is now a supported Inference Provider on the Hugging Face Hub!
-Featherless joins our growing ecosystem, enhancing the breadth and capabilities of serverless inference directly on the Hub’s model pages. Inference Providers are also seamlessly integrated into our client SDKs (for both JS and Python), making it super easy to use a wide variety of models with your preferred providers.
+We're thrilled to share that **Featherless AI** is now a supported Inference Provider on the Hugging Face Hub!
+Featherless AI joins our growing ecosystem, enhancing the breadth and capabilities of serverless inference directly on the Hub’s model pages. Inference Providers are also seamlessly integrated into our client SDKs (for both JS and Python), making it super easy to use a wide variety of models with your preferred providers.
 
-[Featherless](https://featherless.ai) supports a wide variety of text and conversational models, including the latest open-source models from DeepSeek, Meta, Google, Qwen, and much more.
+[Featherless AI](https://featherless.ai) supports a wide variety of text and conversational models, including the latest open-source models from DeepSeek, Meta, Google, Qwen, and much more.
 
-Find the full list of supported models on the [models page](https://huggingface.co/models?inference_provider=featherless-ai&sort=trending).
+Featherless AI is a serverless AI inference provider with unique model loading and GPU orchestration capabilities that make an exceptionally large catalog of models available to users. Providers typically offer either low-cost access to a limited set of models, or an unlimited range of models with users managing the servers and the associated operating costs. Featherless provides the best of both worlds: unmatched model range and variety, with serverless pricing. Find the full list of supported models on the [models page](https://huggingface.co/models?inference_provider=featherless-ai&sort=trending).
 
 We're quite excited to see what you'll build with this new provider!
 
-Read more about Inference Providers in our [documentation](https://huggingface.co/docs/inference-providers).
+Read more about how to use Featherless AI as an Inference Provider in its dedicated [documentation page](https://huggingface.co/docs/inference-providers/providers/featherless-ai).
 
 ## How it works
 
@@ -62,9 +62,9 @@ Read more about Inference Providers in our [documentation](https://huggingface.c
 
 #### from Python, using huggingface_hub
 
-The following example shows how to use DeepSeek-R1 using Hyperbolic as the inference provider. You can use a [Hugging Face token](https://huggingface.co/settings/tokens) for automatic routing through Hugging Face, or your own Hyperbolic API key if you have one.
+The following example shows how to use DeepSeek-R1 using Featherless AI as the inference provider. You can use a [Hugging Face token](https://huggingface.co/settings/tokens) for automatic routing through Hugging Face, or your own Featherless AI API key if you have one.
 
-Install `huggingface_hub` from source (see [instructions](https://huggingface.co/docs/huggingface_hub/installation#install-from-source)). Official support will be released soon in version v0.29.0.
+Install `huggingface_hub` from source (see [instructions](https://huggingface.co/docs/huggingface_hub/installation#install-from-source)). Official support will be released soon in version v0.33.0.
 
 ```python
 from huggingface_hub import InferenceClient
@@ -84,7 +84,6 @@ messages = [
 completion = client.chat.completions.create(
     model="deepseek-ai/DeepSeek-R1-0528",
     messages=messages,
-    max_tokens=500
 )
 
 print(completion.choices[0].message)
@@ -106,7 +105,6 @@ const chatCompletion = await client.chatCompletion({
     }
   ],
   provider: "featherless-ai",
-  max_tokens: 500
 });
 
 console.log(chatCompletion.choices[0].message);
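
For context, here is a minimal, self-contained version of the Python snippet this diff touches. The lines between the hunks are not shown above, so the client construction and the example prompt below are assumptions rather than the blog post's exact code; the `provider="featherless-ai"` value, the `deepseek-ai/DeepSeek-R1-0528` model ID, and the removal of `max_tokens` come from the hunks themselves.

```python
# Minimal sketch assembled from the diff hunks above; lines not shown in the
# diff (client construction, prompt text) are assumptions, not the post's exact code.
from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="featherless-ai",  # route the request to Featherless AI
    api_key="hf_xxx",           # a Hugging Face token (auto-routing) or a Featherless AI API key
)

messages = [
    {"role": "user", "content": "What is the capital of France?"}  # illustrative prompt
]

# max_tokens was dropped in this commit, so the provider's default limit applies.
completion = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1-0528",
    messages=messages,
)

print(completion.choices[0].message)
```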
