
Commit 953dc44

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into fixPV2R
2 parents bfd9613 + af61424

188 files changed: 7,921 additions and 1,865 deletions


.openpublishing.redirection.azure-monitor.json

Lines changed: 10 additions & 0 deletions

@@ -6878,6 +6878,16 @@
       "source_path_from_root": "/articles/azure-monitor/containers/prometheus-authorization-proxy.md",
       "redirect_url": "/previous-versions/azure/azure-monitor/containers/prometheus-authorization-proxy",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/azure-monitor/containers/container-insights-private-link.md",
+      "redirect_url": "/previous-versions/azure/azure-monitor/containers/kubernetes-monitoring-private-link",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/azure-monitor/essentials/private-link-data-ingestion.md",
+      "redirect_url": "/previous-versions/azure/azure-monitor/containers/kubernetes-monitoring-private-link",
+      "redirect_document_id": false
     }
   ]
 }

.openpublishing.redirection.virtual-desktop.json

Lines changed: 5 additions & 0 deletions

@@ -439,6 +439,11 @@
       "source_path_from_root": "/articles/virtual-desktop/disaster-recovery.md",
       "redirect_url": "/azure/virtual-desktop/disaster-recovery-concepts",
       "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/virtual-desktop/configure-device-redirections.md",
+      "redirect_url": "/azure/virtual-desktop/redirection-remote-desktop-protocol",
+      "redirect_document_id": true
     }
   ]
 }

articles/active-directory-b2c/partner-keyless.md

Lines changed: 5 additions & 5 deletions

@@ -1,13 +1,13 @@
 ---
 title: Tutorial to configure Keyless with Azure Active Directory B2C
 titleSuffix: Azure AD B2C
-description: Tutorial to configure Sift Keyless with Azure Active Directory B2C for passwordless authentication
+description: Tutorial to configure Keyless with Azure Active Directory B2C for passwordless authentication
 author: gargi-sinha
 manager: martinco
 ms.reviewer: kengaderdus
 ms.service: active-directory
 ms.topic: how-to
-ms.date: 06/21/2024
+ms.date: 08/09/2024
 
 ms.author: gasinh
 ms.subservice: B2C
@@ -18,11 +18,11 @@ ms.subservice: B2C
 
 # Tutorial: Configure Keyless with Azure Active Directory B2C
 
-Learn to configure Azure Active Directory B2C (Azure AD B2C) with the Sift Keyless passwordless solution. With Azure AD B2C as an identity provider (IdP), integrate Keyless with customer applications to provide passwordless authentication. The Keyless Zero-Knowledge Biometric (ZKB) is passwordless multifactor authentication that helps eliminate fraud, phishing, and credential reuse, while enhancing the customer experience and protecting privacy.
+Learn to configure Azure Active Directory B2C (Azure AD B2C) with the Keyless passwordless solution. With Azure AD B2C as an identity provider (IdP), integrate Keyless with customer applications to provide passwordless authentication. The Keyless Zero-Knowledge Biometric (ZKB) is passwordless multifactor authentication that helps eliminate fraud, phishing, and credential reuse, while enhancing the customer experience and protecting privacy.
 
 Go to keyless.io to learn about:
 
-* [Sift Keyless](https://keyless.io/)
+* [Keyless](https://keyless.io/)
 * [How Keyless uses zero-knowledge proofs to protect your biometric data](https://keyless.io/blog/post/how-keyless-uses-zero-knowledge-proofs-to-protect-your-biometric-data)
 
 ## Prerequisites
@@ -42,7 +42,7 @@ The Keyless integration includes the following components:
 
 * **Azure AD B2C** – authorization server that verifies user credentials. Also known as the IdP.
 * **Web and mobile applications** – mobile or web applications to protect with Keyless and Azure AD B2C
-* **The Keyless Authenticator mobile app** – Sift mobile app for authentication to the Azure AD B2C enabled applications
+* **The Keyless Authenticator mobile app** – mobile app for authentication to the Azure AD B2C enabled applications
 
 The following architecture diagram illustrates an implementation.

articles/ai-studio/how-to/deploy-models-cohere-command.md

Lines changed: 3 additions & 5 deletions

@@ -135,7 +135,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -209,14 +209,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -1364,7 +1362,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
     {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {
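
Both Python fixes in this file recur in the model articles below: `model_info.model_provider_name` replaces the incorrect `model_info.model_provider` attribute, and the artificial `time.sleep(0.05)` delay is removed from the streaming helper. As context only, a minimal sketch of how the corrected snippets might fit together with the `azure-ai-inference` `ChatCompletionsClient` is shown here; the endpoint and key environment variable names are placeholder assumptions, not values taken from this commit.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Placeholder endpoint and key environment variables (assumption).
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

# The attribute fixed across these articles: model_provider_name, not model_provider.
model_info = client.get_model_info()
print("Model name:", model_info.model_name)
print("Model type:", model_info.model_type)
print("Model provider name:", model_info.model_provider_name)


def print_stream(result):
    """Prints the chat completion with streaming (no artificial delay)."""
    for update in result:
        if update.choices:
            print(update.choices[0].delta.content, end="")


# Stream a completion and print the tokens as they arrive.
result = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="How many languages are in the world?"),
    ],
    stream=True,
)
print_stream(result)
```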

articles/ai-studio/how-to/deploy-models-jais.md

Lines changed: 11 additions & 5 deletions

@@ -27,6 +27,8 @@ JAIS 30b Chat is an autoregressive bi-lingual LLM for **Arabic** & **English**.
 
 ::: zone pivot="programming-language-python"
 
+## Jais chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -103,7 +105,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -177,14 +179,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -278,6 +278,8 @@ except HttpResponseError as ex:
 
 ::: zone pivot="programming-language-javascript"
 
+## Jais chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -550,6 +552,8 @@ catch (error) {
 
 ::: zone pivot="programming-language-csharp"
 
+## Jais chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -821,7 +825,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
     {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {
@@ -838,6 +842,8 @@ catch (RequestFailedException ex)
 
 ::: zone pivot="programming-language-rest"
 
+## Jais chat models
+
 
 
 You can learn more about the models in their respective model card:

articles/ai-studio/how-to/deploy-models-jamba.md

Lines changed: 11 additions & 5 deletions

@@ -26,6 +26,8 @@ The Jamba-Instruct model is AI21's production-grade Mamba-based large language m
 
 ::: zone pivot="programming-language-python"
 
+## Jamba-Instruct chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -102,7 +104,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -176,14 +178,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -277,6 +277,8 @@ except HttpResponseError as ex:
 
 ::: zone pivot="programming-language-javascript"
 
+## Jamba-Instruct chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -549,6 +551,8 @@ catch (error) {
 
 ::: zone pivot="programming-language-csharp"
 
+## Jamba-Instruct chat models
+
 
 
 You can learn more about the models in their respective model card:
@@ -820,7 +824,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
    {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {
@@ -837,6 +841,8 @@ catch (RequestFailedException ex)
 
 ::: zone pivot="programming-language-rest"
 
+## Jamba-Instruct chat models
+
 
 
 You can learn more about the models in their respective model card:

articles/ai-studio/how-to/deploy-models-llama.md

Lines changed: 3 additions & 5 deletions

@@ -159,7 +159,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -233,14 +233,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -1038,7 +1036,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
     {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {

articles/ai-studio/how-to/deploy-models-mistral-nemo.md

Lines changed: 3 additions & 5 deletions

@@ -113,7 +113,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -187,14 +187,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -1385,7 +1383,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
     {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {

articles/ai-studio/how-to/deploy-models-mistral-open.md

Lines changed: 2 additions & 4 deletions

@@ -158,7 +158,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -235,14 +235,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
     """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:

articles/ai-studio/how-to/deploy-models-mistral.md

Lines changed: 3 additions & 5 deletions

@@ -143,7 +143,7 @@ The response is as follows:
 ```python
 print("Model name:", model_info.model_name)
 print("Model type:", model_info.model_type)
-print("Model provider name:", model_info.model_provider)
+print("Model provider name:", model_info.model_provider_name)
 ```
 
 ```console
@@ -217,14 +217,12 @@ To visualize the output, define a helper function to print the stream.
 ```python
 def print_stream(result):
     """
-    Prints the chat completion with streaming. Some delay is added to simulate
-    a real-time conversation.
+    Prints the chat completion with streaming.
    """
     import time
     for update in result:
         if update.choices:
             print(update.choices[0].delta.content, end="")
-            time.sleep(0.05)
 ```
 
 You can visualize how streaming generates content:
@@ -1475,7 +1473,7 @@ catch (RequestFailedException ex)
 {
     if (ex.ErrorCode == "content_filter")
     {
-        Console.WriteLine($"Your query has trigger Azure Content Safeaty: {ex.Message}");
+        Console.WriteLine($"Your query has trigger Azure Content Safety: {ex.Message}");
     }
     else
     {
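
The C# hunks above only correct the "Content Safeaty" typo in the content-filter catch block. For reference, the Python sections of these same articles handle that condition through `HttpResponseError`; a hedged sketch of that pattern, under the assumption that a content-filter block surfaces as HTTP 400 with a JSON `error` object and using the same placeholder environment variables as the earlier sketch, might look like this:

```python
import json
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError

# Placeholder endpoint and key environment variables (assumption), as in the earlier sketch.
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_CREDENTIAL"]),
)

try:
    response = client.complete(
        messages=[
            SystemMessage(content="You are an AI assistant that helps people find information."),
            UserMessage(content="<a prompt that the content filter would block>"),
        ]
    )
    print(response.choices[0].message.content)
except HttpResponseError as ex:
    # Assumption: content-filter blocks return HTTP 400 with an "error" object in the body.
    if ex.status_code == 400:
        body = json.loads(ex.response.text())
        if isinstance(body, dict) and "error" in body:
            print(f"Your request triggered {body['error']['code']}: {body['error']['message']}")
        else:
            raise
    else:
        raise
```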
