
Commit 2f56129

use auto in first api call
1 parent 4316d65

1 file changed (+5, -5):

docs/inference-providers/guides/first-api-call.md

Lines changed: 5 additions & 5 deletions
```diff
@@ -14,15 +14,15 @@ This guide assumes you have a Hugging Face account. If you don't have one, you c
 
 ## Step 1: Find a Model on the Hub
 
-Visit the [Hugging Face Hub](https://huggingface.co/models) and look for models with the "Inference Providers" filter, you can select the provider that you want. We'll go with `fal`.
+Visit the [Hugging Face Hub](https://huggingface.co/models?pipeline_tag=text-to-image&inference_provider=fal-ai,hf-inference,nebius,nscale,replicate,together&sort=trending) and look for models with the "Inference Providers" filter, you can select the provider that you want. We'll go with `fal`.
 
 ![search image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers-guides/search.png)
 
 For this example, we'll use [FLUX.1-schnell](https://huggingface.co/black-forest-labs/FLUX.1-schnell), a powerful text-to-image model. Next, navigate to the model page and scroll down to find the inference widget on the right side.
 
 ## Step 2: Try the Interactive Widget
 
-Before writing any code, try the widget directly on the model page:
+Before writing any code, try the widget directly on the [model page](https://huggingface.co/black-forest-labs/FLUX.1-dev?inference_provider=fal-ai):
 
 ![widget image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers-guides/widget.png)
 
@@ -42,7 +42,7 @@ You'll need a Hugging Face account (free at [huggingface.co](https://huggingface
 
 ## Step 3: From Clicks to Code
 
-Now let's replicate this with Python. Click the **"View Code Snippets"** button in the widget to see the generated code snippets.
+Now let's replicate this with Python. Click the **"View Code Snippets"** button in the widget to see the [generated code snippets](https://huggingface.co/black-forest-labs/FLUX.1-dev?inference_api=true&language=python&inference_provider=auto).
 
 ![code snippets image](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/inference-providers-guides/code-snippets.png)
 
@@ -73,7 +73,7 @@ import os
 from huggingface_hub import InferenceClient
 
 client = InferenceClient(
-    provider="fal-ai",
+    provider="auto",
     api_key=os.environ["HF_TOKEN"],
 )
 
@@ -102,7 +102,7 @@ import { InferenceClient } from "@huggingface/inference";
 const client = new InferenceClient(process.env.HF_TOKEN);
 
 const image = await client.textToImage({
-    provider: "fal-ai",
+    provider: "auto",
     model: "black-forest-labs/FLUX.1-schnell",
     inputs: "Astronaut riding a horse",
     parameters: { num_inference_steps: 5 },
```
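
For reference, a minimal runnable sketch of what the updated Python snippet amounts to with `provider="auto"`. Only the `InferenceClient` constructor is shown in this diff; the `text_to_image` call and the save step below are assumptions based on the model and prompt used elsewhere in the guide:

```python
import os

from huggingface_hub import InferenceClient

# With provider="auto", the client picks an available provider for the model
# instead of pinning the request to fal-ai.
client = InferenceClient(
    provider="auto",
    api_key=os.environ["HF_TOKEN"],
)

# Assumed call, mirroring the guide's model and prompt;
# text_to_image returns a PIL.Image object.
image = client.text_to_image(
    "Astronaut riding a horse",
    model="black-forest-labs/FLUX.1-schnell",
)
image.save("astronaut.png")  # hypothetical output path
```

Pinning a specific provider (for example `provider="fal-ai"`) still works; `"auto"` simply defers the choice to the routing logic.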

0 commit comments
