
Commit 9fd5715

Merge pull request #2522 from dargilco/dargilco/fix-typo-default-azure-credentials
Fix typo `AzureDefaultCredential`
2 parents: 3ae0ad3 + 24a69a4

4 files changed, +17 -17 lines
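The change is the same in all four files: the Azure Identity credential class is named `DefaultAzureCredential`; `AzureDefaultCredential` does not exist in the `azure.identity`, `@azure/identity`, or `Azure.Identity` packages. A minimal Python sketch to sanity-check the corrected name; the token scope is an illustrative assumption and is not part of this commit (assumes `azure-identity` is installed and a credential source such as `az login` is available):

```python
# Sanity check for the corrected class name (assumes `pip install azure-identity`).
from azure.identity import DefaultAzureCredential  # correct name; AzureDefaultCredential does not exist

credential = DefaultAzureCredential()

# Optional: confirm a token can be acquired. The scope below is an illustrative
# assumption (the common Azure AI / Cognitive Services scope), not taken from this commit.
token = credential.get_token("https://cognitiveservices.azure.com/.default")
print("token acquired, expires at", token.expires_on)
```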

articles/ai-foundry/model-inference/includes/code-create-chat-client-entra.md

Lines changed: 5 additions & 5 deletions
````diff
@@ -23,11 +23,11 @@ Then, you can use the package to consume the model. The following example shows
 ```python
 import os
 from azure.ai.inference import ChatCompletionsClient
-from azure.identity import AzureDefaultCredential
+from azure.identity import DefaultAzureCredential
 
 model = ChatCompletionsClient(
     endpoint="https://<resource>.services.ai.azure.com/models",
-    credential=AzureDefaultCredential(),
+    credential=DefaultAzureCredential(),
     model="mistral-large-2407",
 )
 ```
@@ -45,11 +45,11 @@ Then, you can use the package to consume the model. The following example shows
 ```javascript
 import ModelClient from "@azure-rest/ai-inference";
 import { isUnexpected } from "@azure-rest/ai-inference";
-import { AzureDefaultCredential } from "@azure/identity";
+import { DefaultAzureCredential } from "@azure/identity";
 
 const client = new ModelClient(
     "https://<resource>.services.ai.azure.com/models",
-    new AzureDefaultCredential(),
+    new DefaultAzureCredential(),
     "mistral-large-2407"
 );
 ```
@@ -81,7 +81,7 @@ Then, you can use the package to consume the model. The following example shows
 ```csharp
 ChatCompletionsClient client = new ChatCompletionsClient(
     new Uri("https://<resource>.services.ai.azure.com/models"),
-    new AzureDefaultCredential(includeInteractiveCredentials: true),
+    new DefaultAzureCredential(includeInteractiveCredentials: true),
     "mistral-large-2407"
 );
 ```
````
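For reference, here is a runnable sketch of the corrected Python snippet from this include, extended with a sample chat completion call so the `DefaultAzureCredential` is actually exercised. The endpoint placeholder and the messages are illustrative assumptions; only the client construction mirrors the diff above (assumes the `azure-ai-inference` and `azure-identity` packages are installed):

```python
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.identity import DefaultAzureCredential

# Client construction as in the corrected snippet above.
model = ChatCompletionsClient(
    endpoint="https://<resource>.services.ai.azure.com/models",  # replace <resource> with your resource name
    credential=DefaultAzureCredential(),
    model="mistral-large-2407",
)

# Illustrative request; the messages are an assumption, not part of the documented snippet.
response = model.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Say hello."),
    ]
)
print(response.choices[0].message.content)
```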

articles/ai-foundry/model-inference/includes/code-create-embeddings-client.md

Lines changed: 4 additions & 4 deletions
````diff
@@ -36,11 +36,11 @@ If you are using an endpoint with support for Entra ID, you can create your clie
 ```python
 import os
 from azure.ai.inference import EmbeddingsClient
-from azure.identity import AzureDefaultCredential
+from azure.identity import DefaultAzureCredential
 
 client = EmbeddingsClient(
     endpoint="https://<resource>.services.ai.azure.com/models",
-    credential=AzureDefaultCredential(),
+    credential=DefaultAzureCredential(),
 )
 ```
 
@@ -72,11 +72,11 @@ For endpoint with support for Microsoft Entra ID, you can create your client as
 ```javascript
 import ModelClient from "@azure-rest/ai-inference";
 import { isUnexpected } from "@azure-rest/ai-inference";
-import { AzureDefaultCredential } from "@azure/identity";
+import { DefaultAzureCredential } from "@azure/identity";
 
 const client = new ModelClient(
     "https://<resource>.services.ai.azure.com/models",
-    new AzureDefaultCredential()
+    new DefaultAzureCredential()
 );
 ```
 
````
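Likewise, a short sketch of the corrected embeddings client in use. The `embed` call and the sample input strings are illustrative additions rather than part of the changed article (same package assumptions as above):

```python
from azure.ai.inference import EmbeddingsClient
from azure.identity import DefaultAzureCredential

# Client construction as in the corrected snippet above.
client = EmbeddingsClient(
    endpoint="https://<resource>.services.ai.azure.com/models",  # replace <resource> with your resource name
    credential=DefaultAzureCredential(),
)

# Illustrative request; the input strings are assumptions for the example.
result = client.embed(input=["first phrase", "second phrase"])
for item in result.data:
    print(f"embedding of length {len(item.embedding)}")
```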

articles/ai-studio/reference/reference-model-inference-api.md

Lines changed: 4 additions & 4 deletions
````diff
@@ -115,11 +115,11 @@ If you are using an endpoint with support for Entra ID, you can create your clie
 ```python
 import os
 from azure.ai.inference import ChatCompletionsClient
-from azure.identity import AzureDefaultCredential
+from azure.identity import DefaultAzureCredential
 
 model = ChatCompletionsClient(
     endpoint=os.environ["AZUREAI_ENDPOINT_URL"],
-    credential=AzureDefaultCredential(),
+    credential=DefaultAzureCredential(),
 )
 ```
 
@@ -151,11 +151,11 @@ For endpoint with support for Microsoft Entra ID, you can create your client as
 ```javascript
 import ModelClient from "@azure-rest/ai-inference";
 import { isUnexpected } from "@azure-rest/ai-inference";
-import { AzureDefaultCredential } from "@azure/identity";
+import { DefaultAzureCredential } from "@azure/identity";
 
 const client = new ModelClient(
     process.env.AZUREAI_ENDPOINT_URL,
-    new AzureDefaultCredential()
+    new DefaultAzureCredential()
 );
 ```
 
````
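The snippets in this reference article read the endpoint from the `AZUREAI_ENDPOINT_URL` environment variable instead of hard-coding it. A small sketch of that pattern with the corrected credential class; the explicit missing-variable check is an illustrative addition, not part of the documented snippet:

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.identity import DefaultAzureCredential

# Fail early with a clear message if the variable used in the article is not set.
endpoint = os.environ.get("AZUREAI_ENDPOINT_URL")
if not endpoint:
    raise RuntimeError("Set AZUREAI_ENDPOINT_URL to your Azure AI model inference endpoint.")

model = ChatCompletionsClient(
    endpoint=endpoint,
    credential=DefaultAzureCredential(),
)
```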

articles/machine-learning/reference-model-inference-api.md

Lines changed: 4 additions & 4 deletions
````diff
@@ -108,11 +108,11 @@ If you are using an endpoint with support for Entra ID, you can create your clie
 ```python
 import os
 from azure.ai.inference import ChatCompletionsClient
-from azure.identity import AzureDefaultCredential
+from azure.identity import DefaultAzureCredential
 
 client = ChatCompletionsClient(
     endpoint=os.environ["AZUREAI_ENDPOINT_URL"],
-    credential=AzureDefaultCredential(),
+    credential=DefaultAzureCredential(),
 )
 ```
 
@@ -144,11 +144,11 @@ For endpoint with support for Microsoft Entra ID, you can create your client as
 ```javascript
 import ModelClient from "@azure-rest/ai-inference";
 import { isUnexpected } from "@azure-rest/ai-inference";
-import { AzureDefaultCredential } from "@azure/identity";
+import { DefaultAzureCredential } from "@azure/identity";
 
 const client = new ModelClient(
     process.env.AZUREAI_ENDPOINT_URL,
-    new AzureDefaultCredential()
+    new DefaultAzureCredential()
 );
 ```
 
````
