
Commit e1eaa53

Merge pull request #2517 from santiagxf/santiagxf-patch-1
Update reference-model-inference-api.md
2 parents: 43e0779 + e6602ac

File tree

1 file changed: 9 additions, 1 deletion

articles/ai-studio/reference/reference-model-inference-api.md

Lines changed: 9 additions & 1 deletion
@@ -36,7 +36,7 @@ Having a uniform way to consume foundational models allow developers to realize
 
 ## Availability
 
-The Azure AI Model Inference API is available in the following models:
+The Azure AI Model Inference API is available in the following models/systems:
 
 Models deployed to [serverless API endpoints](../how-to/deploy-models-serverless.md):
 
@@ -58,6 +58,11 @@ Models deployed to [managed inference](../concepts/deployments-overview.md):
 > * [Phi-3](../how-to/deploy-models-phi-3.md), and [Phi-4](../how-to/deploy-models-phi-4.md) family of models
 > * [Mistral](../how-to/deploy-models-mistral-open.md) and [Mixtral](../how-to/deploy-models-mistral-open.md?tabs=mistral-8x7B-instruct) family of models
 
+Models deployed to [Azure AI model inference in Azure AI Services](../../ai-foundry/model-inference/overview.md):
+
+> [!div class="checklist"]
+> * See [supported models](../../ai-foundry/model-inference/concepts/models.md).
+
 The API is compatible with Azure OpenAI model deployments.
 
 > [!NOTE]
@@ -81,6 +86,9 @@ The API indicates how developers can consume predictions for the following modalities:
 
 You can use streamlined inference clients in the language of your choice to consume predictions from models running the Azure AI model inference API.
 
+> [!IMPORTANT]
+> When working with the Azure AI model inference endpoint (preview), the base URL to connect to is of the form `https://<resource-name>.services.ai.azure.com/models`. Use this URL with the `endpoint` parameter. If you use the REST APIs, this is the base URL to which you append the modality you want to consume. Read about [how to use the Azure AI model inference endpoint](../../ai-foundry/model-inference/how-to/inference.md).
+
 # [Python](#tab/python)
 
 Install the package `azure-ai-inference` using your package manager, like pip:
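The hunk's last context line introduces the install step; the diff cuts off before the snippet, but given the package name stated in that line, the command the article shows at this point is presumably:

```bash
pip install azure-ai-inference
```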

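For reference, here is how the preview endpoint described in the `[!IMPORTANT]` note added by this commit is consumed from the `azure-ai-inference` package the article installs next. This is a minimal sketch, not text from the commit: the resource name, API key, and model name are placeholders.

```python
# Minimal sketch of calling the Azure AI model inference endpoint (preview).
# <resource-name>, <api-key>, and <model-deployment-name> are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Base URL of the form described in the note; no modality path is appended
# here, because the client adds the right route for each operation.
client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<api-key>"),
)

# The endpoint fronts multiple models, so `model` selects which one answers.
response = client.complete(
    model="<model-deployment-name>",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="How many languages are in the world?"),
    ],
)

print(response.choices[0].message.content)
```

When calling the REST API directly instead, the note implies the modality path is appended to the same base URL (for chat, something like `https://<resource-name>.services.ai.azure.com/models/chat/completions`), plus the usual `api-version` query parameter.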