Commit a99a371

Freshness.
1 parent d503ad6 commit a99a371

1 file changed: +5 −5 lines changed

articles/ai-foundry/how-to/develop/semantic-kernel.md

Lines changed: 5 additions & 5 deletions
@@ -5,7 +5,7 @@ description: Learn how to Develop applications with Semantic Kernel and Azure AI
 author: lgayhardt
 ms.author: lagayhar
 ms.reviewer: taochen
-ms.date: 02/27/2025
+ms.date: 10/20/2025
 ms.topic: how-to
 ms.service: azure-ai-foundry
 ---
@@ -29,15 +29,15 @@ In this article, you learn how to use [Semantic Kernel](/semantic-kernel/overvie
 pip install semantic-kernel
 ```

-- This article uses the Model Inference API, so install the relevant Azure dependencies. You can use the following command:
+- This article uses the Model Inference API, so install the relevant Azure dependencies. You can use the following command:

 ```bash
 pip install semantic-kernel[azure]
 ```

 ## Configure the environment

-To use language models deployed in Azure AI Foundry portal, you need the endpoint and credentials to connect to your project. Follow these steps to get the information you need from the model you want to use:
+To use language models deployed in Azure AI Foundry portal, you need the endpoint and credentials to connect to your project. Follow these steps to get the information you need from the model:

 [!INCLUDE [tip-left-pane](../../includes/tip-left-pane.md)]

@@ -80,7 +80,7 @@ chat_completion_service = AzureAIInferenceChatCompletion(ai_model_id="<deploymen
 ```

 > [!NOTE]
-> IF you use Microsoft Entra ID, make sure that the endpoint was deployed with that authentication method and that you have the required permissions to invoke it.
+> If you use Microsoft Entra ID, make sure that the endpoint was deployed with that authentication method and that you have the required permissions to invoke it.

 ### Azure OpenAI models

@@ -104,7 +104,7 @@ chat_completion_service = AzureAIInferenceChatCompletion(

 ## Inference parameters

-You can configure how inference is performed by using the `AzureAIInferenceChatPromptExecutionSettings` class:
+You can configure how to perform inference by using the `AzureAIInferenceChatPromptExecutionSettings` class:

 ```python
 from semantic_kernel.connectors.ai.azure_ai_inference import AzureAIInferenceChatPromptExecutionSettings
