
Commit 44ff733

Author: Ryan Wike (committed)
Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into quickstartupdate
2 parents: 1857705 + b095e6c

File tree

1,446 files changed: +63750 additions, -9433 deletions


.openpublishing.publish.config.json

Lines changed: 3 additions & 1 deletion
@@ -253,7 +253,7 @@
       "url": "https://github.com/Azure-Samples/function-app-arm-templates",
       "branch": "main",
       "branch_mapping": {}
-    },
+    },
     {
       "path_to_root": "functions-azure-product",
       "url": "https://github.com/Azure/Azure-Functions",
@@ -1237,6 +1237,7 @@
   "redirection_files": [
     ".openpublishing.redirection.active-directory.json",
     ".openpublishing.redirection.ansible.json",
+    ".openpublishing.redirection.api-center.json",
     ".openpublishing.redirection.api-management.json",
     ".openpublishing.redirection.app-service.json",
     ".openpublishing.redirection.asc-for-iot.json",
@@ -1354,6 +1355,7 @@
     "articles/object-anchors/.openpublishing.redirection.object-anchors.json",
     "articles/operator-insights/.openpublishing.redirection.operator-insights.json",
     "articles/operator-nexus/.openpublishing.redirection.operator-nexus.json",
+    "articles/operator-service-manager/.openpublishing.redirection.operator-service-manager.json",
     "articles/peering-service/.openpublishing.redirection.peering-service.json",
     "articles/postgresql/.openpublishing.redirection.postgresql.json",
     "articles/route-server/.openpublishing.redirection.route-server.json",

.openpublishing.redirection.api-center.json

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+{
+  "redirections": [
+    {
+      "source_path_from_root": "/articles/api-center/use-vscode-extension-copilot.md",
+      "redirect_url": "/azure/api-center/use-vscode-extension",
+      "redirect_document_id": false
+    }
+  ]
+}
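
Every redirection file in this commit follows the same schema: a top-level `redirections` array whose entries map a retired article path (`source_path` or `source_path_from_root`) to its new `redirect_url`. As a quick sanity check before publishing, a minimal sketch using `jq` (assuming `jq` is installed; the file name below is illustrative) lists each mapping:

```bash
# Sketch: print "source -> redirect" for every entry in a redirection file.
# Assumes jq is installed; the file name below is illustrative.
jq -r '.redirections[]
  | (.source_path // .source_path_from_root) + " -> " + .redirect_url' \
  .openpublishing.redirection.api-center.json
```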

.openpublishing.redirection.azure-monitor.json

Lines changed: 5 additions & 0 deletions
@@ -6649,6 +6649,11 @@
       "redirect_url": "/azure/azure-functions/functions-monitoring",
       "redirect_document_id": false
     },
+    {
+      "source_path_from_root": "/articles/azure-monitor/app/resources-roles-access-control.md",
+      "redirect_url": "/azure/azure-monitor//roles-permissions-security",
+      "redirect_document_id": false
+    },
     {
       "source_path_from_root": "/articles/azure-monitor/app/java-standalone-arguments.md",
       "redirect_url": "/azure/azure-monitor/app/java-get-started-supplemental",

articles/operator-service-manager/.openpublishing.redirection.operator-service-manager.json

Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
+{
+  "redirections": [
+    {
+      "source_path_from_root": "/articles/operator-service-manager/how-to-use-azure-operator-service-manager-cli-extension.md",
+      "redirect_url": "/azure/operator-service-manager/concepts-about-azure-operator-service-manager-cli",
+      "redirect_document_id": false
+    }
+  ]
+}

.openpublishing.redirection.sentinel.json

Lines changed: 80 additions & 0 deletions
@@ -1,5 +1,85 @@
 {
   "redirections": [
+    {
+      "source_path": "articles/sentinel/automate-responses-with-playbooks.md#azure-logic-apps-basic-concepts",
+      "redirect_url": "/azure/sentinel/playbooks/logic-apps-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/automate-responses-with-playbooks.md#how-to-run-a-playbook",
+      "redirect_url": "/azure/sentinel/playbooks/run-manage-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/automate-responses-with-playbooks.md#recommended-playbooks",
+      "redirect_url": "/azure/sentinel/playbooks/playbooks-recommendations",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/authenticate-playbooks-to-sentinel.md",
+      "redirect_url": "/azure/sentinel/automation/authenticate-playbooks-to-sentinel",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/automate-responses-with-playbooks.md",
+      "redirect_url": "/azure/sentinel/automation/automate-responses-with-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/automation.md",
+      "redirect_url": "/azure/sentinel/automation/automation",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/create-playbooks.md",
+      "redirect_url": "/azure/sentinel/automation/create-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/create-tasks-playbook.md",
+      "redirect_url": "/azure/sentinel/automation/create-tasks-playbook",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/define-playbook-access-restrictions.md",
+      "redirect_url": "/azure/sentinel/automation/define-playbook-access-restrictions",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/logic-apps-playbooks.md",
+      "redirect_url": "/azure/sentinel/automation/logic-apps-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/migrate-playbooks-to-automation-rules.md",
+      "redirect_url": "/azure/sentinel/automation/migrate-playbooks-to-automation-rules",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/playbook-recommendations.md",
+      "redirect_url": "/azure/sentinel/automation/playbook-recommendations",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/playbook-triggers-actions.md",
+      "redirect_url": "/azure/sentinel/automation/playbook-triggers-actions",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/run-playbooks.md",
+      "redirect_url": "/azure/sentinel/automation/run-playbooks",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/tutorial-respond-threats-playbook.md",
+      "redirect_url": "/azure/sentinel/automation/tutorial-respond-threats-playbook",
+      "redirect_document_id": true
+    },
+    {
+      "source_path": "articles/sentinel/use-playbook-templates.md",
+      "redirect_url": "/azure/sentinel/automation/use-playbook-templates",
+      "redirect_document_id": true
+    },
     {
       "source_path": "articles/sentinel/sap/deploy-data-connector-agent-container-other-methods.md",
       "redirect_url": "/azure/sentinel/sap/deploy-data-connector-agent-container",

articles/ai-services/cognitive-services-virtual-networks.md

Lines changed: 2 additions & 4 deletions
@@ -370,11 +370,9 @@ Currently, only IPv4 addresses are supported. Each Azure AI services resource su
 
 To grant access from your on-premises networks to your Azure AI services resource with an IP network rule, identify the internet-facing IP addresses used by your network. Contact your network administrator for help.
 
-If you use Azure ExpressRoute on-premises for public peering or Microsoft peering, you need to identify the NAT IP addresses. For more information, see [What is Azure ExpressRoute](../expressroute/expressroute-introduction.md).
+If you use Azure ExpressRoute on-premises for Microsoft peering, you need to identify the NAT IP addresses. For more information, see [What is Azure ExpressRoute](../expressroute/expressroute-introduction.md).
 
-For public peering, each ExpressRoute circuit by default uses two NAT IP addresses. Each is applied to Azure service traffic when the traffic enters the Microsoft Azure network backbone. For Microsoft peering, the NAT IP addresses that are used are either customer provided or supplied by the service provider. To allow access to your service resources, you must allow these public IP addresses in the resource IP firewall setting.
-
-To find your public peering ExpressRoute circuit IP addresses, [open a support ticket with ExpressRoute](https://portal.azure.com/#blade/Microsoft_Azure_Support/HelpAndSupportBlade/overview) use the Azure portal. For more information, see [NAT requirements for Azure public peering](../expressroute/expressroute-nat.md#nat-requirements-for-azure-public-peering).
+For Microsoft peering, the NAT IP addresses that are used are either customer provided or supplied by the service provider. To allow access to your service resources, you must allow these public IP addresses in the resource IP firewall setting.
 
 ### Managing IP network rules
 
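The surviving paragraph ends with the operative step: allow the NAT public IP addresses in the resource's IP firewall. A minimal sketch of that step with the Azure CLI (the resource group, account name, and IP address are placeholders):

```bash
# Sketch: allow a NAT public IP address on an Azure AI services resource.
# Resource group, account name, and the IP address are placeholders.
az cognitiveservices account network-rule add \
  --resource-group my-resource-group \
  --name my-ai-services-account \
  --ip-address 203.0.113.10
```
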
articles/ai-services/computer-vision/Tutorials/liveness.md

Lines changed: 19 additions & 13 deletions
@@ -28,17 +28,18 @@ The liveness detection solution successfully defends against various spoof types
 - Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesFace" title="Create a Face resource" target="_blank">create a Face resource</a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.
 - You need the key and endpoint from the resource you create to connect your application to the Face service. You'll paste your key and endpoint into the code later in the quickstart.
 - You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
-- Access to the Azure AI Vision Face Client SDK for mobile (IOS and Android). To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.
+- Access to the Azure AI Vision Face Client SDK for mobile (iOS and Android) and web. To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.
 
 ## Perform liveness detection
 
-The liveness solution integration involves two different components: a mobile application and an app server/orchestrator.
+The liveness solution integration involves two different components: a frontend mobile/web application and an app server/orchestrator.
 
 ### Integrate liveness into mobile application
 
-Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports both Java/Kotlin for Android and Swift for iOS mobile applications:
+Once you have access to the SDK, follow the instructions in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications, and JavaScript for web applications:
 - For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
 - For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
+- For JavaScript Web, follow the instructions in the [Web sample](https://aka.ms/liveness-sample-web)
 
 Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
 
@@ -48,9 +49,9 @@ The high-level steps involved in liveness orchestration are illustrated below:
 
 :::image type="content" source="../media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="../media/liveness/liveness-diagram.jpg":::
 
-1. The mobile application starts the liveness check and notifies the app server.
+1. The frontend application starts the liveness check and notifies the app server.
 
-1. The app server creates a new liveness session with Azure AI Face Service. The service creates a liveness-session and responds back with a session-authorization-token.
+1. The app server creates a new liveness session with Azure AI Face Service. The service creates a liveness session and responds with a session-authorization-token. For more information about each request parameter involved in creating a liveness session, see the [Liveness Create Session Operation](https://aka.ms/face-api-reference-createlivenesssession) reference.
 
 ```json
 Request:
@@ -70,9 +71,9 @@ The high-level steps involved in liveness orchestration are illustrated below:
 }
 ```
 
-1. The app server provides the session-authorization-token back to the mobile application.
+1. The app server provides the session-authorization-token back to the frontend application.
 
-1. The mobile application provides the session-authorization-token during the Azure AI Vision SDK’s initialization.
+1. The frontend application provides the session-authorization-token during the Azure AI Vision SDK’s initialization.
 
 ```kotlin
 mServiceOptions?.setTokenCredential(com.azure.android.core.credential.TokenCredential { _, callback ->
@@ -84,11 +85,15 @@ The high-level steps involved in liveness orchestration are illustrated below:
 serviceOptions?.authorizationToken = "<INSERT_TOKEN_HERE>"
 ```
 
+```javascript
+azureAIVisionFaceAnalyzer.token = "<INSERT_TOKEN_HERE>"
+```
+
 1. The SDK then starts the camera, guides the user to position correctly and then prepares the payload to call the liveness detection service endpoint.
 
 1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the mobile application that the liveness check has been completed.
 
-1. The mobile application relays the liveness check completion to the app server.
+1. The frontend application relays the liveness check completion to the app server.
 
 1. The app server can now query for the liveness detection result from the Azure AI Vision Face service.
 
@@ -122,7 +127,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
       "width": 409,
       "height": 395
     },
-    "fileName": "video.webp",
+    "fileName": "content.bin",
     "timeOffsetWithinFile": 0,
     "imageType": "Color"
   },
@@ -175,7 +180,7 @@ Use the following tips to ensure that your input images give the most accurate r
 
 The high-level steps involved in liveness with verification orchestration are illustrated below:
 1. Provide the verification reference image by either of the following two methods:
-   - The app server provides the reference image when creating the liveness session.
+   - The app server provides the reference image when creating the liveness session. For more information about each request parameter involved in creating a liveness session with verification, see the [Liveness With Verify Create Session Operation](https://aka.ms/face-api-reference-createlivenesswithverifysession) reference.
 
 ```json
 Request:
@@ -204,7 +209,7 @@ The high-level steps involved in liveness with verification orchestration are il
 
 ```
 
-   - The mobile application provides the reference image when initializing the SDK.
+   - The mobile application provides the reference image when initializing the SDK. This scenario isn't supported in the web solution.
 
 ```kotlin
 val singleFaceImageSource = VisionSource.fromFile("/path/to/image.jpg")
@@ -227,7 +232,7 @@ The high-level steps involved in liveness with verification orchestration are il
 --header 'Content-Type: multipart/form-data' \
 --header 'apim-recognition-model-preview-1904: true' \
 --header 'Authorization: Bearer.<session-authorization-token> \
---form 'Content=@"video.webp"' \
+--form 'Content=@"content.bin"' \
 --form 'Metadata="<insert-metadata>"
 
 Response:
@@ -255,7 +260,7 @@ The high-level steps involved in liveness with verification orchestration are il
       "width": 409,
       "height": 395
     },
-    "fileName": "video.webp",
+    "fileName": "content.bin",
     "timeOffsetWithinFile": 0,
     "imageType": "Color"
   },
@@ -291,6 +296,7 @@ See the Azure AI Vision SDK reference to learn about other options in the livene
 
 - [Kotlin (Android)](https://aka.ms/liveness-sample-java)
 - [Swift (iOS)](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
+- [JavaScript (Web)](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-web-readme)
 
 See the Session REST API reference to learn more about the features available to orchestrate the liveness solution.
 
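For orientation, a rough sketch of step 2 above — the app server creating a liveness session — assuming the preview route and body fields described in the linked Liveness Create Session reference; the endpoint, key, API version, and device correlation ID are placeholders:

```bash
# Sketch: the app server creates a liveness session and receives a session ID
# and an authorization token to hand to the frontend application.
# FACE_ENDPOINT, FACE_KEY, the API version, and the device ID are assumptions.
curl --request POST "${FACE_ENDPOINT}/face/v1.1-preview.1/detectLiveness/singleModal/sessions" \
  --header "Ocp-Apim-Subscription-Key: ${FACE_KEY}" \
  --header "Content-Type: application/json" \
  --data '{
    "livenessOperationMode": "Passive",
    "deviceCorrelationId": "3f55b2b7-7d2b-4d35-8c62-6e8d5f2f1f2a"
  }'
```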