docs/inference-providers/guides/building-first-app.md
11 additions & 5 deletions
@@ -231,12 +231,18 @@ Using the `auto` provider will automatically select the best provider for the mo
 <hfoptions id="summarization">
 <hfoption id="python">

-Next, we'll use a powerful language model like `deepseek-ai/DeepSeek-R1-0528` from Qwen via Together AI for summarization:
+Next, we'll use a powerful language model like `deepseek-ai/DeepSeek-R1-0528` from DeepSeek via an Inference Provider.
+
+<Tip>
+
+We'll use the `auto` provider to automatically select the best provider for the model. You can define your own priority list of providers in the [Inference Providers](https://huggingface.co/settings/inference-providers) page.
+
+</Tip>

 ```python
 def generate_summary(transcript):
-    """Generate summary using Together AI"""
-    client = InferenceClient(provider="together")
+    """Generate summary using an Inference Provider"""
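The hunk is truncated mid-function, so the rest of the updated `generate_summary` is not shown here. As a rough sketch only, the completed function after this change might look like the following, assuming the `huggingface_hub` chat-completion API; the prompt template and the `build_summary_prompt` helper are hypothetical, not taken from the guide:

```python
def build_summary_prompt(transcript):
    """Build the user message sent to the model (hypothetical template)."""
    return f"Summarize the following meeting transcript:\n\n{transcript}"


def generate_summary(transcript):
    """Generate summary using an Inference Provider"""
    # Imported lazily so the prompt helper above is usable without the package.
    from huggingface_hub import InferenceClient

    # provider="auto" routes the request to the best available provider for
    # this model; the priority order is configurable in account settings.
    client = InferenceClient(provider="auto")
    completion = client.chat.completions.create(
        model="deepseek-ai/DeepSeek-R1-0528",
        messages=[{"role": "user", "content": build_summary_prompt(transcript)}],
    )
    return completion.choices[0].message.content
```

Swapping the hard-coded `provider="together"` for `provider="auto"` is the substance of this diff: the call site no longer needs to change when a different provider serves the model.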