- An Azure AI project as explained at [Create a project for Azure AI Foundry](../create-projects.md).

- A model that supports the [Azure AI Model Inference API](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md?tabs=python) deployed. This article uses a `Mistral-Large` deployment, but you can use any model of your preference. To use embeddings capabilities in Semantic Kernel, you need an embedding model such as `cohere-embed-v3-multilingual`.

  You can follow the instructions at [Deploy models as serverless API deployments](../deploy-models-serverless.md).

- Python **3.10** or later installed, including pip.

- Semantic Kernel installed. You can use the following command:

```bash
pip install semantic-kernel
```

- This article uses the Model Inference API, so install the relevant Azure dependencies. You can use the following command:

```bash
pip install semantic-kernel[azure]
```

## Configure the environment

To use language models deployed in Azure AI Foundry portal, you need the endpoint and credentials to connect to your project. Follow these steps to get the information you need from the model:

> [!NOTE]
> The client automatically reads the environment variables `AZURE_AI_INFERENCE_ENDPOINT` and `AZURE_AI_INFERENCE_API_KEY` to connect to the model. You could instead pass the endpoint and key directly to the client by using the `endpoint` and `api_key` parameters on the constructor.

Alternatively, if your endpoint supports Microsoft Entra ID, you can use the following code to create the client:

> [!NOTE]
> If you use Microsoft Entra ID, make sure that the endpoint was deployed with that authentication method and that you have the required permissions to invoke it.

`articles/ai-foundry/openai/includes/fine-tuning-studio.md`

Your job might be queued behind other jobs on the system.

When each training epoch completes, a checkpoint is generated. A checkpoint is a fully functional version of a model that can both be deployed and used as the target model for subsequent fine-tuning jobs. Checkpoints can be particularly useful because they can provide a snapshot of your model prior to overfitting. When a fine-tuning job completes, you have the three most recent versions of the model available to deploy.

> [!NOTE]
> When copying a checkpoint from a source account, the same checkpoint name is retained in the destination account. Ensure you use this exact name for fine-tuning, deployment, or any other operation in the destination account. This checkpoint will not appear in the UI or in the `list checkpoints` API.
## Pause and resume
You can track progress in both fine-tuning views of the AI Foundry portal. You'll see your job go through the same statuses as normal fine-tuning jobs (queued, running, succeeded).

`articles/search/search-what-is-azure-search.md`



On the indexing side, if your content is on Azure, you can use indexers and skillsets for automated and AI-enriched indexing. Or, create a logic app workflow for equivalent automation over an even broader set of supported data sources.

On the retrieval side, your app can be an agent or tool, a bot, or any client that sends requests to a search index or knowledge agent.