articles/ai-foundry/model-inference/includes/use-chat-completions/csharp.md (19 additions, 7 deletions)

````diff
@@ -47,18 +47,23 @@ First, create the client to consume the model. The following code uses an endpoint
 ChatCompletionsClient client = new ChatCompletionsClient(
     new Uri(Environment.GetEnvironmentVariable("AZURE_INFERENCE_ENDPOINT")),
     new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL")),
-    "mistral-large-2407"
 );
 ```

 If you have configured the resource with **Microsoft Entra ID** support, you can use the following code snippet to create a client.

 ```csharp
+TokenCredential credential = new DefaultAzureCredential(includeInteractiveCredentials: true);
+AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
+BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(credential, new string[] { "https://cognitiveservices.azure.com/.default" });
````
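The hunk above is collapsed before the token policy is attached to the client, so the three added lines never show how they are used. Below is a minimal sketch of how these pieces are typically wired together, assuming the standard `ClientOptions.AddPolicy` pattern from Azure.Core and a `ChatCompletionsClient(Uri, TokenCredential, AzureAIInferenceClientOptions)` constructor overload; it is illustrative only, not necessarily the exact code in the expanded diff.

```csharp
// Illustrative sketch only; assumes the Azure.AI.Inference, Azure.Core, and Azure.Identity packages.
using System;
using Azure.AI.Inference;
using Azure.Core;
using Azure.Core.Pipeline;
using Azure.Identity;

// includeInteractiveCredentials: true enables browser sign-in for local demos; drop it in production.
TokenCredential credential = new DefaultAzureCredential(includeInteractiveCredentials: true);

AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(
    credential, new string[] { "https://cognitiveservices.azure.com/.default" });

// Attach the policy so every request to the inference endpoint carries an Entra ID bearer token.
clientOptions.AddPolicy(tokenPolicy, HttpPipelinePosition.PerRetry);

ChatCompletionsClient client = new ChatCompletionsClient(
    new Uri(Environment.GetEnvironmentVariable("AZURE_INFERENCE_ENDPOINT")),
    credential,
    clientOptions
);
```

Registering the policy at `HttpPipelinePosition.PerRetry` runs it on every attempt, so a retried request picks up a refreshed token if the previous one has expired.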
articles/ai-foundry/model-inference/includes/use-chat-reasoning/csharp.md (13 additions, 5 deletions)

````diff
@@ -42,8 +42,7 @@ First, create the client to consume the model. The following code uses an endpoint
 ```csharp
 ChatCompletionsClient client = new ChatCompletionsClient(
     new Uri("https://<resource>.services.ai.azure.com/models"),
-    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL")),
-    "DeepSeek-R1"
+    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL"))
 );
 ```
@@ -53,10 +52,16 @@ ChatCompletionsClient client = new ChatCompletionsClient(
 If you have configured the resource with **Microsoft Entra ID** support, you can use the following code snippet to create a client.

 ```csharp
+TokenCredential credential = new DefaultAzureCredential(includeInteractiveCredentials: true);
+AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
+BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(credential, new string[] { "https://cognitiveservices.azure.com/.default" });
````
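Because the `"DeepSeek-R1"` argument is removed from the constructor, the model has to be chosen per request instead. Below is a minimal sketch of what such a call can look like, assuming the `ChatCompletionsOptions.Model` property and the synchronous `Complete` method of the Azure.AI.Inference client; illustrative only.

```csharp
// Illustrative sketch only: the model is selected per request via ChatCompletionsOptions.Model.
using System;
using Azure;
using Azure.AI.Inference;

ChatCompletionsClient client = new ChatCompletionsClient(
    new Uri("https://<resource>.services.ai.azure.com/models"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL"))
);

ChatCompletionsOptions requestOptions = new ChatCompletionsOptions()
{
    Messages =
    {
        new ChatRequestUserMessage("How many languages are in the world?")
    },
    // Model name taken from the constructor argument removed in the diff above.
    Model = "DeepSeek-R1"
};

Response<ChatCompletions> response = client.Complete(requestOptions);
Console.WriteLine(response.Value.Content);
```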
articles/ai-foundry/model-inference/includes/use-embeddings/csharp.md (25 additions, 9 deletions)

````diff
@@ -49,19 +49,23 @@ First, create the client to consume the model. The following code uses an endpoint
 ```csharp
 EmbeddingsClient client = new EmbeddingsClient(
     new Uri(Environment.GetEnvironmentVariable("AZURE_INFERENCE_ENDPOINT")),
-    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL")),
-    "text-embedding-3-small"
+    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL"))
 );
 ```

-If you have configured the resource to with **Microsoft Entra ID** support, you can use the following code snippet to create a client.
-
+If you configured the resource with **Microsoft Entra ID** support, you can use the following code snippet to create a client. Note that `includeInteractiveCredentials` is set to `true` here only for demonstration purposes, so that authentication can happen through the web browser. In production workloads, remove this parameter.

 ```csharp
+TokenCredential credential = new DefaultAzureCredential(includeInteractiveCredentials: true);
+AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
+BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(credential, new string[] { "https://cognitiveservices.azure.com/.default" });
````
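The same pattern applies to embeddings: with `"text-embedding-3-small"` removed from the constructor, the model is named on the request. Below is a minimal sketch, assuming `EmbeddingsClient.Embed(EmbeddingsOptions)`, a settable `Model` property on `EmbeddingsOptions`, and embeddings returned as JSON `BinaryData`; illustrative only.

```csharp
// Illustrative sketch only: an embeddings request that names the model per call.
using System;
using System.Collections.Generic;
using Azure;
using Azure.AI.Inference;

EmbeddingsClient client = new EmbeddingsClient(
    new Uri(Environment.GetEnvironmentVariable("AZURE_INFERENCE_ENDPOINT")),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_INFERENCE_CREDENTIAL"))
);

EmbeddingsOptions requestOptions = new EmbeddingsOptions(
    new List<string> { "The ultimate answer to the question of life" })
{
    // Model name taken from the constructor argument removed in the diff above.
    Model = "text-embedding-3-small"
};

Response<EmbeddingsResult> response = client.Embed(requestOptions);
foreach (EmbeddingItem item in response.Value.Data)
{
    // Each embedding arrives as JSON; deserialize it into a float list before use.
    List<float> vector = item.Embedding.ToObjectFromJson<List<float>>();
    Console.WriteLine($"Embedding {item.Index}: {vector.Count} dimensions");
}
```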