* An AI project connected to your Azure AI Services resource. You can follow the steps at [Configure Azure AI model inference service in my project](../../how-to/configure-project-connection.md) in Azure AI Foundry.
* An AI project resource.
* The feature **Deploy models to Azure AI model inference service** turned on.
:::image type="content" source="../media/quickstart-ai-project/ai-project-inference-endpoint.gif" alt-text="An animation showing how to turn on the Deploy models to Azure AI model inference service feature in Azure AI Foundry portal." lightbox="../media/quickstart-ai-project/ai-project-inference-endpoint.gif":::
## Add a connection
You can see the model deployments available in the connected resource by following these steps:
5. The details page shows information about the specific deployment. If you want to test the model, you can use the option **Open in playground**.
6. The Azure AI Foundry playground is displayed, where you can interact with the given model. You can also reach the deployments and the model from code, as sketched after these steps.
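
If you prefer to check the deployments from code rather than in the portal, the following is a minimal sketch using the Azure management SDK for Python. It assumes the `azure-identity` and `azure-mgmt-cognitiveservices` packages are installed and that you have at least Reader access to the resource; `<subscription-id>`, `<resource-group>`, and `<resource-name>` are placeholders for your own values, and the exact fields returned can vary by SDK version.

```python
# A minimal sketch: list the model deployments on the connected
# Azure AI Services resource. Placeholders below must be replaced
# with your own subscription, resource group, and resource name.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient

client = CognitiveServicesManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# Enumerate the deployments and print the model behind each one.
for deployment in client.deployments.list("<resource-group>", "<resource-name>"):
    model = deployment.properties.model
    print(f"{deployment.name}: {model.format} {model.name} ({model.version})")
```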
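
The playground is convenient for quick experiments. To interact with the same deployment from code, you can call the Azure AI model inference endpoint of the resource. The following is a minimal sketch with the `azure-ai-inference` package; the endpoint URL, API key, and `<deployment-name>` are placeholders, and key-based authentication is assumed to be enabled on the resource.

```python
# A minimal sketch: send a chat completion request to a deployment
# on the Azure AI model inference endpoint. The endpoint, key, and
# deployment name are placeholders for your own values.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<api-key>"),
)

# Route the request to a specific deployment by name.
response = client.complete(
    model="<deployment-name>",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Explain what the Azure AI model inference service does."),
    ],
)

print(response.choices[0].message.content)
```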