
Commit 8752b31

Merge pull request #51221 from GraemeMalcolm/main
Updated to use openAI SDK
2 parents a7f39e5 + 1aa0d6a commit 8752b31

File tree

3 files changed: +49 additions, −103 deletions


learn-pr/wwl-data-ai/ai-foundry-sdk/06-knowledge-check.yml

Lines changed: 7 additions & 7 deletions

```diff
@@ -36,16 +36,16 @@ quiz:
     - content: "The Azure subscription ID"
       isCorrect: false
       explanation: "Incorrect. You don't need the Azure subscription ID to instantiate a AIProjectClient object"
-  - content: "Which library should you use to chat with a model that is deployed to the Azure AI model inference service?"
+  - content: "Which SDK should you use to chat with a model that is deployed in an Azure AI Foundry resource?"
     choices:
     - content: "Azure OpenAI"
-      isCorrect: false
-      explanation: "Incorrect. You can't use the Azure OpenAI library to chat with an Azure AI model inference model."
+      isCorrect: true
+      explanation: "Correct. You should use an OpenAI chat client to chat with an Azure AI model inference model."
     - content: "Azure Machine Learning"
       isCorrect: false
-      explanation: "Incorrect. You can't use the Azure Machine Learning library to chat with an Azure AI model inference model."
-    - content: "Azure AI Inference"
-      isCorrect: true
-      explanation: "Correct. Use the Azure AI Inference library to chat with an Azure AI model inference model."
+      explanation: "Incorrect. You can't use the Azure Machine Learning SDK to chat with an Azure AI model inference model."
+    - content: "Azure AI Language"
+      isCorrect: false
+      explanation: "Incorrect. The Azure AI Language SDK is used to consume Azure AI Language resources, not generative AI models."
 
 
```

learn-pr/wwl-data-ai/ai-foundry-sdk/includes/02-azure-ai-foundry-sdk.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -13,7 +13,7 @@ The core package for working with projects in the Azure AI Foundry SDK is the **
 
 To use the Azure AI Projects library in Python, you can use the **pip** package installation utility to install the **azure-ai-projects** package from PyPi:
 
-```python
+```
 pip install azure-ai-projects
 ```
 
@@ -23,7 +23,7 @@ pip install azure-ai-projects
 
 To use the Azure AI Projects library in C#, add the **Azure.AI.Projects** package to your C# project:
 
-```csharp
+```
 dotnet add package Azure.AI.Projects --prerelease
 ```
 
@@ -87,7 +87,7 @@ var projectClient = new AIProjectClient(
 > [!NOTE]
 > The code uses the default Azure credentials to authenticate when accessing the project. To enable this authentication, in addition to the **Azure.AI.Projects** package, you need to install the **Azure.Identity** package:
 >
-> `dotnet add package Azure.Identity`
+> `dotnet add package Azure.Identity --prerelease`
 
 ::: zone-end
````
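The setup this commit's docs describe (install **azure-ai-projects**, authenticate with **DefaultAzureCredential**, create an **AIProjectClient**, then get an OpenAI client) can be sketched end-to-end in Python. This is a hedged illustration, not part of the commit: `build_chat_messages` and `ask_model` are hypothetical helper names, the endpoint and deployment name are placeholders, and the Azure portion requires the **azure-ai-projects**, **azure-identity**, and **openai** packages plus an Azure sign-in.

```python
import os


def build_chat_messages(system_prompt: str, user_prompt: str) -> list:
    """Build an OpenAI-style messages list for a chat completion request."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def ask_model(project_endpoint: str, deployment_name: str, user_prompt: str) -> str:
    """Connect to an Azure AI Foundry project and get one chat completion.

    Hypothetical helper; requires azure-ai-projects, azure-identity, and
    openai to be installed, and a signed-in Azure identity.
    """
    from azure.identity import DefaultAzureCredential
    from azure.ai.projects import AIProjectClient

    # Connect to the project using the default Azure credentials
    project_client = AIProjectClient(
        credential=DefaultAzureCredential(),
        endpoint=project_endpoint,
    )

    # Get an authenticated OpenAI chat client for the project's resource
    openai_client = project_client.inference.get_azure_openai_client(
        api_version="2024-10-21"
    )

    response = openai_client.chat.completions.create(
        model=deployment_name,
        messages=build_chat_messages("You are a helpful AI assistant.", user_prompt),
    )
    return response.choices[0].message.content


# Only attempt a live call when a project endpoint is configured
if __name__ == "__main__" and os.environ.get("PROJECT_ENDPOINT"):
    print(ask_model(os.environ["PROJECT_ENDPOINT"], "my-deployment", "Hello?"))
```

Switching models is then just a matter of passing a different deployment name, which is the benefit the updated docs call out.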

Lines changed: 39 additions & 93 deletions

````diff
@@ -1,92 +1,41 @@
-A common scenario in an AI application is to connect to a generative AI model and use *prompts* to engage in a chat-based dialog with it. You can use the Azure AI Foundry SDK to chat with models that you have deployed in your Azure AI Foundry project.
+A common scenario in an AI application is to connect to a generative AI model and use *prompts* to engage in a chat-based dialog with it.
 
-The specific libraries and code used to build a chat client depends on how the target model has been deployed in the Azure AI Foundry project. You can deploy models to the following model hosting solutions:
+While you can use the Azure OpenAI SDK, to connect "directly" to a model using key-based or Microsoft Entra ID authentication; when your model is deployed in an Azure AI Foundry project, you can also use the Azure AI Foundry SDK to retrieve a project client, from which you can then get an authenticated OpenAI chat client for any models deployed in the project's Azure AI Foundry resource. This approach makes it easy to write code that consumes models deployed in your project, switching between them easily by changing the model deployment name parameter.
 
-- **Azure AI Foundry Models**: A single endpoint for multiple models of different types, including OpenAI models and others from the Azure AI Foundry model catalog. Models are consumed through an **Azure AI Foundry** resource connection in the project (either the default **Azure AI Foundry** resource for the project or another resource connection that has been added to the project).
-- **Azure OpenAI**: A single endpoint for OpenAI models hosted in Azure. Models are consumed through an **Azure OpenAI** resource connection in the project.
-- **Serverless API**: A model-as-a-service solution in which each deployed model is accessed through a unique endpoint and hosted in the Azure AI Foundry project.
-- **Managed compute**: A model-as-a-service solution in which each deployed model is accessed through a unique endpoint hosted in custom compute.
-
-> [!NOTE]
-> To deploy models to an Azure AI model inference endpoint, you must enable the **Deploy models to Azure AI model inference service** option in Azure AI Foundry.
-
-In this module, we'll focus on models deployed to the **Azure AI Foundry Models** endpoint.
-
-## Building a client app for Azure AI Foundry Models
-
-When you have deployed models to the Azure AI model inference service, you can use the Azure AI Foundry SDK to write code that creates a **ChatCompletionsClient** object, which you can then use to chat with a deployed model. One of the benefits of using this model deployment type is that you can easily switch between deployed models by changing one parameter in your code (the model deployment name), making it a great way to test against multiple models while developing an app.
+> [!TIP]
+> You can use the OpenAI chat client provided by an Azure AI Foundry project to chat with any model deployed in the associated Azure AI Foundry resource - even non-OpenAI models, such as Microsoft Phi models.
 
 ::: zone pivot="python"
 
-The following Python code sample uses a **ChatCompletionsClient** object to chat with a model deployment named **phi-4-model**.
-
-```python
-from azure.identity import DefaultAzureCredential
-from azure.ai.inference.models import SystemMessage, UserMessage
-
-try:
-
-    ## Get a chat client
-    inference_endpoint = f"https://{urlparse(project_endpoint).netloc}/models"
-
-    credential = DefaultAzureCredential()
-
-    chat_client = ChatCompletionsClient(
-        endpoint=inference_endpoint,
-        credential=credential,
-        credential_scopes=["https://ai.azure.com/.default"])
-
-    # Get a chat completion based on a user-provided prompt
-    user_prompt = input("Enter a question:")
-
-    response = chat_client.complete(
-        model="phi-4-model",
-        messages=[
-            SystemMessage("You are a helpful AI assistant that answers questions."),
-            UserMessage(user_prompt)
-        ],
-    )
-    print(response.choices[0].message.content)
-
-except Exception as ex:
-    print(ex)
-```
-
-> [!NOTE]
-> The **ChatCompletionsClient** class uses **Azure AI Inference** library. In addition to the **azure-identity** package discussed previously, the sample code shown here assumes that the **azure-ai-inference** package has been installed:
->
-> `pip install azure-ai-inference`
-
-### Using the Azure OpenAI SDK
-
-In the Azure AI Foundry SDK for Python, the **AIProjectClient** class provides a **get_azure_openai_client()** method that you can use to create an Azure OpenAI client object. You can then use the classes and methods defined in the Azure OpenAI SDK to consume an OpenAI model deployed to Azure Foundry Models.
-
-The following Python code sample uses the Azure AI Foundry and Azure OpenAI SDKs to chat with a model deployment named **gpt-4o-model**.
+The following Python code sample uses the **get_azure_openai_client()** method in the Azure AI project's **inference** operations object to get an OpenAI client with which to chat with a model that has been deployed in the project's Azure AI Foundry resource.
 
 ```python
 from azure.identity import DefaultAzureCredential
 from azure.ai.projects import AIProjectClient
 from openai import AzureOpenAI
 
-
 try:
-    # Initialize the project client
-    project_connection_string = "https://......"
+
+    # connect to the project
+    project_endpoint = "https://......"
     project_client = AIProjectClient(
-        credential=DefaultAzureCredential(),
-        endpoint=project_endpoint)
-
-    ## Get an Azure OpenAI chat client
-    openai_client = project_client.inference.get_azure_openai_client(api_version="2024-10-21")
+        credential=DefaultAzureCredential(),
+        endpoint=project_endpoint,
+    )
+
+    # Get a chat client
+    chat_client = project_client.inference.get_azure_openai_client(api_version="2024-10-21")
 
     # Get a chat completion based on a user-provided prompt
     user_prompt = input("Enter a question:")
-    response = openai_client.chat.completions.create(
-        model="gpt-4o-model",
-        messages=[
-            {"role": "system", "content": "You are a helpful AI assistant that answers questions."},
-            {"role": "user", "content": user_prompt},
+
+    response = chat_client.complete(
+        model=model_deployment_name,
+        [
+            {"role": "system", "content": "You are a helpful AI assistant."},
+            {"role": "user", "content": user_prompt}
         ]
+        ],
     )
     print(response.choices[0].message.content)
 
````
````diff
@@ -103,14 +52,13 @@ except Exception as ex:
 
 ::: zone pivot="csharp"
 
-The following C# code sample uses a **ChatCompletionsClient** object to chat with a model deployment named **phi-4-model**.
+The following C# code sample uses the **GetAzureOpenAIChatClient()** method of the Azure AI project object to get an OpenAI client with which to chat with a model that has been deployed in the project's Azure AI Foundry resource.
 
 ```csharp
-using System;
-using Azure;
-using Azure.AI.Projects;
 using Azure.Identity;
-using Azure.AI.Inference;
+using Azure.AI.Projects;
+using Azure.AI.OpenAI;
+using OpenAI.Chat;;
 
 namespace my_foundry_client
 {
````
````diff
@@ -127,24 +75,23 @@ namespace my_foundry_client
                 new DefaultAzureCredential());
 
             // Get a chat client
-            ChatCompletionsClient chatClient = projectClient.GetChatCompletionsClient();
+            ChatClient chatClient = projectClient.GetAzureOpenAIChatClient(
+                deploymentName: model_deployment,
+                connectionName: null,
+                apiVersion: "2024-10-21");
 
             // Get a chat completion based on a user-provided prompt
             Console.WriteLine("Enter a question:");
             var user_prompt = Console.ReadLine();
+
+            ChatCompletion completion = openaiClient.CompleteChat(
+                [
+                    new SystemChatMessage("You are a helpful AI assistant."),
+                    new UserChatMessage(user_prompt)
+                ]);
 
-            var requestOptions = new ChatCompletionsOptions()
-            {
-                Model = "phi-4-model",
-                Messages =
-                {
-                    new ChatRequestSystemMessage("You are a helpful AI assistant that answers questions."),
-                    new ChatRequestUserMessage(user_prompt),
-                }
-            };
-
-            Response<ChatCompletions> response = chatClient.Complete(requestOptions);
-            Console.WriteLine(response.Value.Content);
+            Console.WriteLine(completion.Content[0].Text);
+
         }
         catch (Exception ex)
         {
````
````diff
@@ -156,9 +103,8 @@ namespace my_foundry_client
 ```
 
 > [!NOTE]
-> The **ChatCompletionsClient** class uses **Azure AI Inference** library. In addition to the **Azure.AI.Projects** and **Azure.Identity** packages discussed previously, the sample code shown here assumes that the **Azure.AI.Inference** package has been installed:
+> In addition to the **azure-ai-projects** and **azure-identity** packages discussed previously, the sample code shown here assumes that the **Azure.AI.OpenAI** package has been installed:
 >
-> `dotnet add package Azure.AI.Inference`
+> `dotnet add package Azure.AI.OpenAI --prerelease`
 
 ::: zone-end
-
````
0 commit comments
