articles/ai-studio/how-to/develop/semantic-kernel.md
+1 −1 (1 addition, 1 deletion)
@@ -19,7 +19,7 @@ In this article, you learn how to use [Semantic Kernel](/semantic-kernel/overvie
 
 - An [Azure subscription](https://azure.microsoft.com).
 - An Azure AI project as explained at [Create a project in Azure AI Foundry portal](../create-projects.md).
-- A model supporting the [Azure AI model inference API](../../reference/reference-model-inference-api?tabs=python) deployed. In this example, we use a `Mistral-Large` deployment, but use any model of your preference. For using embeddings capabilities in LlamaIndex, you need an embedding model like `cohere-embed-v3-multilingual`.
+- A model supporting the [Azure AI model inference API](../../reference/reference-model-inference-api.md?tabs=python) deployed. In this example, we use a `Mistral-Large` deployment, but use any model of your preference. For using embeddings capabilities in LlamaIndex, you need an embedding model like `cohere-embed-v3-multilingual`.
 
 - You can follow the instructions at [Deploy models as serverless APIs](../deploy-models-serverless.md).
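For context, the prerequisites above feed into code like the following: a minimal sketch of connecting Semantic Kernel's Python SDK to a deployment that exposes the Azure AI model inference API. It assumes the `semantic-kernel` package with the Azure AI Inference connector is installed; the `Mistral-Large` model id, the environment variable names, and the exact constructor parameters are illustrative assumptions, not values taken from this diff.

```python
import asyncio
import os

from semantic_kernel.connectors.ai.azure_ai_inference import (
    AzureAIInferenceChatCompletion,
    AzureAIInferenceChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # Point the connector at the serverless deployment from the prerequisites.
    # Endpoint and key come from the deployment's details page in the Azure AI
    # Foundry portal; the variable names here are placeholder assumptions.
    chat_service = AzureAIInferenceChatCompletion(
        ai_model_id="Mistral-Large",
        endpoint=os.environ["AZURE_AI_INFERENCE_ENDPOINT"],
        api_key=os.environ["AZURE_AI_INFERENCE_API_KEY"],
    )

    # Build a minimal chat history and request a single completion.
    history = ChatHistory()
    history.add_user_message("What is the capital of France?")

    response = await chat_service.get_chat_message_content(
        chat_history=history,
        settings=AzureAIInferenceChatPromptExecutionSettings(),
    )
    print(response)


if __name__ == "__main__":
    asyncio.run(main())
```

Any model deployed behind the same inference API (for example, an embedding model such as `cohere-embed-v3-multilingual` for embedding scenarios) can be substituted by changing the deployment name and connector type; the chat example above is only one instance.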