articles/ai-foundry/model-inference/how-to/quickstart-ai-project.md (3 additions, 3 deletions)
@@ -125,10 +125,10 @@ You can use any of the supported SDKs to get predictions out from the endpoint.
 
 * OpenAI SDK
 * Azure OpenAI SDK
-* Azure AI Inference SDK
-* Azure AI Foundry SDK
+* Azure AI Inference package
+* Azure AI Projects package
 
-See the [supported languages and SDKs](../supported-languages.md) section for more details and examples. The following example shows how to use the Azure AI model inference SDK with the newly deployed model:
+See the [supported languages and SDKs](../supported-languages.md) section for more details and examples. The following example shows how to use the Azure AI Inference package with the newly deployed model: