Commit a78c3bb: resolve conflicts
Merge of 2 parents: 16f7c8d + ba49598

932 files changed: +14022 -7180 lines changed


.openpublishing.publish.config.json

Lines changed: 2 additions & 1 deletion

@@ -253,7 +253,7 @@
 "url": "https://github.com/Azure-Samples/function-app-arm-templates",
 "branch": "main",
 "branch_mapping": {}
-},
+},
 {
 "path_to_root": "functions-azure-product",
 "url": "https://github.com/Azure/Azure-Functions",
@@ -1355,6 +1355,7 @@
 "articles/object-anchors/.openpublishing.redirection.object-anchors.json",
 "articles/operator-insights/.openpublishing.redirection.operator-insights.json",
 "articles/operator-nexus/.openpublishing.redirection.operator-nexus.json",
+"articles/operator-service-manager/.openpublishing.redirection.operator-service-manager.json",
 "articles/peering-service/.openpublishing.redirection.peering-service.json",
 "articles/postgresql/.openpublishing.redirection.postgresql.json",
 "articles/route-server/.openpublishing.redirection.route-server.json",

.openpublishing.redirection.azure-monitor.json

Lines changed: 6 additions & 1 deletion

@@ -6649,9 +6649,14 @@
 "redirect_url": "/azure/azure-functions/functions-monitoring",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/azure-monitor/app/app-insights-azure-ad-api.md",
+"redirect_url": "/azure/azure-monitor/app/azure-ad-authentication",
+"redirect_document_id": false
+},
 {
 "source_path_from_root": "/articles/azure-monitor/app/resources-roles-access-control.md",
-"redirect_url": "/azure/azure-monitor//roles-permissions-security",
+"redirect_url": "/azure/azure-monitor/roles-permissions-security",
 "redirect_document_id": false
 },
 {
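The one-character fix in the hunk above removes a doubled slash from a `redirect_url`. A small lint sketch for catching that class of mistake in site-relative redirect targets (a hypothetical helper, not part of the Open Publishing toolchain):

```python
import re

def has_doubled_slash(url: str) -> bool:
    """Flag '//' in the path part of a redirect URL."""
    # Strip a scheme such as 'https://' first so its '//' is not a false positive.
    path = re.sub(r"^[a-z][a-z0-9+.-]*://", "", url)
    return "//" in path

print(has_doubled_slash("/azure/azure-monitor//roles-permissions-security"))  # True (pre-fix URL)
print(has_doubled_slash("/azure/azure-monitor/roles-permissions-security"))   # False (fixed URL)
```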

.openpublishing.redirection.json

Lines changed: 23 additions & 0 deletions

@@ -1870,6 +1870,14 @@
 "redirect_url": "/azure/jenkins/tutorial-jenkins-deploy-web-app-azure-app-service",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/app-spaces/quickstart-deploy-web-app.md",
+"redirect_url": "/azure/app-spaces/quickstart-deploy-sample-app"
+},
+{
+"source_path_from_root": "/articles/app-spaces/deploy-app-spaces-template.md",
+"redirect_url": "/azure/app-spaces/quickstart-deploy-your-app"
+},
 {
 "source_path_from_root": "/articles/application-insights/app-insights-analytics-reference.md",
 "redirect_url": "/azure/kusto/query/",
@@ -3625,6 +3633,11 @@
 "redirect_url": "https://www.twilio.com/docs/usage/tutorials/serverless-webhooks-azure-functions-and-csharp",
 "redirect_document_id": false
 },
+{
+"source_path_from_root": "/articles/azure-functions/create-first-function-vs-code-web.md",
+"redirect_url": "/azure/azure-functions",
+"redirect_document_id": false
+},
 {
 "source_path_from_root": "/articles/twilio-dotnet-how-to-use-for-voice-sms.md",
 "redirect_url": "https://www.twilio.com/docs/usage/tutorials/serverless-webhooks-azure-functions-and-csharp",
@@ -4049,6 +4062,16 @@
 "source_path_from_root":"/articles/cosmos-db/high-availability.md",
 "redirect_url":"/azure/reliability/reliability-cosmos-db-nosql.md",
 "redirect_document_id":false
+},
+{
+"source_path_from_root":"/articles/migrate/how-to-assess.md",
+"redirect_url":"/azure/migrate/whats-new#update-april-2024",
+"redirect_document_id":false
+},
+{
+"source_path_from_root":"/articles/migrate/prepare-isv-movere.md",
+"redirect_url":"/azure/migrate/whats-new#update-april-2024",
+"redirect_document_id":false
 }
 ]
 }
articles/operator-service-manager/.openpublishing.redirection.operator-service-manager.json (new file, registered in .openpublishing.publish.config.json above)

Lines changed: 9 additions & 0 deletions

@@ -0,0 +1,9 @@
+{
+"redirections": [
+{
+"source_path_from_root": "/articles/operator-service-manager/how-to-use-azure-operator-service-manager-cli-extension.md",
+"redirect_url": "/azure/operator-service-manager/concepts-about-azure-operator-service-manager-cli",
+"redirect_document_id": false
+}
+]
+}
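The redirection files touched by this commit share one small schema: a top-level `redirections` array of entries with `source_path_from_root`, `redirect_url`, and an optional boolean `redirect_document_id`. A minimal validator sketch for that schema (a hypothetical helper, not part of the Open Publishing toolchain):

```python
import json

def validate_redirections(text: str) -> list[str]:
    """Return a list of problems found in a redirection JSON document."""
    problems = []
    doc = json.loads(text)
    for i, entry in enumerate(doc.get("redirections", [])):
        src = entry.get("source_path_from_root", "")
        url = entry.get("redirect_url", "")
        if not src.startswith("/"):
            problems.append(f"entry {i}: source_path_from_root must start with '/'")
        if not src.endswith(".md"):
            problems.append(f"entry {i}: source_path_from_root should point at a .md file")
        if not url:
            problems.append(f"entry {i}: redirect_url is required")
        if not isinstance(entry.get("redirect_document_id", False), bool):
            problems.append(f"entry {i}: redirect_document_id must be a boolean")
    return problems

# The new operator-service-manager file above passes the checks.
sample = """{
  "redirections": [
    {
      "source_path_from_root": "/articles/operator-service-manager/how-to-use-azure-operator-service-manager-cli-extension.md",
      "redirect_url": "/azure/operator-service-manager/concepts-about-azure-operator-service-manager-cli",
      "redirect_document_id": false
    }
  ]
}"""
print(validate_redirections(sample))  # → []
```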

articles/ai-services/computer-vision/Tutorials/liveness.md

Lines changed: 19 additions & 13 deletions

@@ -28,17 +28,18 @@ The liveness detection solution successfully defends against various spoof types
 - Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesFace" title="Create a Face resource" target="_blank">create a Face resource</a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.
 - You need the key and endpoint from the resource you create to connect your application to the Face service. You'll paste your key and endpoint into the code later in the quickstart.
 - You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
-- Access to the Azure AI Vision Face Client SDK for mobile (IOS and Android). To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.
+- Access to the Azure AI Vision Face Client SDK for mobile (IOS and Android) and web. To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.

 ## Perform liveness detection

-The liveness solution integration involves two different components: a mobile application and an app server/orchestrator.
+The liveness solution integration involves two different components: a frontend mobile/web application and an app server/orchestrator.

 ### Integrate liveness into mobile application

-Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports both Java/Kotlin for Android and Swift for iOS mobile applications:
+Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications and JavaScript for web applications:
 - For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
 - For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
+- For JavaScript Web, follow the instructions in the [Web sample](https://aka.ms/liveness-sample-web)

 Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.

@@ -48,9 +49,9 @@ The high-level steps involved in liveness orchestration are illustrated below:

 :::image type="content" source="../media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="../media/liveness/liveness-diagram.jpg":::

-1. The mobile application starts the liveness check and notifies the app server.
+1. The frontend application starts the liveness check and notifies the app server.

-1. The app server creates a new liveness session with Azure AI Face Service. The service creates a liveness-session and responds back with a session-authorization-token.
+1. The app server creates a new liveness session with Azure AI Face Service. The service creates a liveness-session and responds back with a session-authorization-token. More information regarding each request parameter involved in creating a liveness session is referenced in [Liveness Create Session Operation](https://aka.ms/face-api-reference-createlivenesssession).

 ```json
 Request:
@@ -70,9 +71,9 @@ The high-level steps involved in liveness orchestration are illustrated below:
 }
 ```

-1. The app server provides the session-authorization-token back to the mobile application.
+1. The app server provides the session-authorization-token back to the frontend application.

-1. The mobile application provides the session-authorization-token during the Azure AI Vision SDK’s initialization.
+1. The frontend application provides the session-authorization-token during the Azure AI Vision SDK’s initialization.

 ```kotlin
 mServiceOptions?.setTokenCredential(com.azure.android.core.credential.TokenCredential { _, callback ->
@@ -84,11 +85,15 @@ The high-level steps involved in liveness orchestration are illustrated below:
 serviceOptions?.authorizationToken = "<INSERT_TOKEN_HERE>"
 ```

+```javascript
+azureAIVisionFaceAnalyzer.token = "<INSERT_TOKEN_HERE>"
+```
+
 1. The SDK then starts the camera, guides the user to position correctly and then prepares the payload to call the liveness detection service endpoint.

 1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the mobile application that the liveness check has been completed.

-1. The mobile application relays the liveness check completion to the app server.
+1. The frontend application relays the liveness check completion to the app server.

 1. The app server can now query for the liveness detection result from the Azure AI Vision Face service.

@@ -122,7 +127,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
 "width": 409,
 "height": 395
 },
-"fileName": "video.webp",
+"fileName": "content.bin",
 "timeOffsetWithinFile": 0,
 "imageType": "Color"
 },
@@ -175,7 +180,7 @@ Use the following tips to ensure that your input images give the most accurate r

 The high-level steps involved in liveness with verification orchestration are illustrated below:
 1. Provide the verification reference image by either of the following two methods:
-   - The app server provides the reference image when creating the liveness session.
+   - The app server provides the reference image when creating the liveness session. More information regarding each request parameter involved in creating a liveness session with verification is referenced in [Liveness With Verify Create Session Operation](https://aka.ms/face-api-reference-createlivenesswithverifysession).

 ```json
 Request:
@@ -204,7 +209,7 @@ The high-level steps involved in liveness with verification orchestration are il

 ```

-   - The mobile application provides the reference image when initializing the SDK.
+   - The mobile application provides the reference image when initializing the SDK. This is not a supported scenario in the web solution.

 ```kotlin
 val singleFaceImageSource = VisionSource.fromFile("/path/to/image.jpg")
@@ -227,7 +232,7 @@ The high-level steps involved in liveness with verification orchestration are il
 --header 'Content-Type: multipart/form-data' \
 --header 'apim-recognition-model-preview-1904: true' \
 --header 'Authorization: Bearer.<session-authorization-token> \
---form 'Content=@"video.webp"' \
+--form 'Content=@"content.bin"' \
 --form 'Metadata="<insert-metadata>"

 Response:
@@ -255,7 +260,7 @@ The high-level steps involved in liveness with verification orchestration are il
 "width": 409,
 "height": 395
 },
-"fileName": "video.webp",
+"fileName": "content.bin",
 "timeOffsetWithinFile": 0,
 "imageType": "Color"
 },
@@ -291,6 +296,7 @@ See the Azure AI Vision SDK reference to learn about other options in the livene

 - [Kotlin (Android)](https://aka.ms/liveness-sample-java)
 - [Swift (iOS)](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
+- [JavaScript (Web)](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-web-readme)

 See the Session REST API reference to learn more about the features available to orchestrate the liveness solution.

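The orchestration steps in the liveness article above describe a token hand-off pattern: the app server creates the session and keeps the session id, while the frontend only ever receives the short-lived session-authorization-token. A minimal sketch of that pattern (the `create_liveness_session` stub stands in for the real Azure AI Face create-session call documented in the linked reference; all names here are hypothetical):

```python
import secrets
from dataclasses import dataclass

@dataclass
class LivenessSession:
    session_id: str   # kept server-side, used later to query the result
    auth_token: str   # short-lived token handed to the frontend

# Server-side registry of open sessions (the app server queries results by id).
SESSIONS: dict[str, LivenessSession] = {}

def create_liveness_session() -> LivenessSession:
    # Hypothetical stand-in for the Azure AI Face create-session call;
    # the real request/response shapes are in the "Liveness Create Session
    # Operation" reference linked in the article.
    return LivenessSession(session_id=secrets.token_hex(8),
                           auth_token=secrets.token_urlsafe(24))

def token_for_frontend() -> str:
    """App-server handler: create a session, remember its id,
    and return only the session-authorization-token to the caller."""
    session = create_liveness_session()
    SESSIONS[session.session_id] = session
    return session.auth_token

token = token_for_frontend()
print(f"issued token of length {len(token)}; {len(SESSIONS)} open session(s)")
```

The key design point mirrored here is that the session id never leaves the app server, so the frontend cannot query or tamper with the liveness result directly.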
articles/ai-services/computer-vision/includes/quickstarts-sdk/identity-python-sdk.md

Lines changed: 24 additions & 25 deletions

@@ -15,12 +15,12 @@ ms.author: pafarley

 Get started with facial recognition using the Face client library for Python. Follow these steps to install the package and try out the example code for basic tasks. The Face service provides you with access to advanced algorithms for detecting and recognizing human faces in images. Follow these steps to install the package and try out the example code for basic face identification using remote images.

-[Reference documentation](/python/api/overview/azure/cognitiveservices/face-readme) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/cognitiveservices/azure-cognitiveservices-vision-face) | [Package (PiPy)](https://pypi.org/project/azure-cognitiveservices-vision-face/) | [Samples](/samples/browse/?products=azure&term=face)
+[Reference documentation](/python/api/azure-ai-vision-face/azure.ai.vision.face) | [Library source code](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/face/azure-ai-vision-face/azure/ai/vision/face) | [Package (PiPy)](https://aka.ms/azsdk-python-face-pkg) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/face/azure-ai-vision-face/samples)

 ## Prerequisites

 * Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services/)
-* [Python 3.x](https://www.python.org/)
+* [Python 3.8+](https://www.python.org/)
 * Your Python installation should include [pip](https://pip.pypa.io/en/stable/). You can check if you have pip installed by running `pip --version` on the command line. Get pip by installing the latest version of Python.
 * [!INCLUDE [contributor-requirement](../../../includes/quickstarts/contributor-requirement.md)]
 * Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesFace" title="Create a Face resource" target="_blank">create a Face resource</a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.
@@ -38,7 +38,7 @@ Get started with facial recognition using the Face client library for Python. Fo
 After installing Python, you can install the client library with:

 ```console
-pip install --upgrade azure-cognitiveservices-vision-face
+python -m pip install azure-ai-vision-face
 ```

 1. Create a new Python application
@@ -65,25 +65,28 @@ Get started with facial recognition using the Face client library for Python. Fo
 ## Output

 ```console
-Person group: c8e679eb-0b71-43b4-aa91-ab8200cae7df
-face 861d769b-d014-40e8-8b4a-7fd3bc9b425b added to person f80c1cfa-b8cb-46f8-9f7f-e72fbe402bc3
-face e3c356a4-1ac3-4c97-9219-14648997f195 added to person f80c1cfa-b8cb-46f8-9f7f-e72fbe402bc3
-face f9119820-c374-4c4d-b795-96ae2fec5069 added to person be4084a7-0c7b-4cf9-9463-3756d2e28e17
-face 67d626df-3f75-4801-9364-601b63c8296a added to person be4084a7-0c7b-4cf9-9463-3756d2e28e17
-face 19e2e8cc-5029-4087-bca0-9f94588fb850 added to person 3ff07c65-6193-4d3e-bf18-d7c106393cd5
-face dcc61e80-16b1-4241-ae3f-9721597bae4c added to person 3ff07c65-6193-4d3e-bf18-d7c106393cd5
-pg resource is c8e679eb-0b71-43b4-aa91-ab8200cae7df
-<msrest.pipeline.ClientRawResponse object at 0x00000240DAD47310>
-Training status: running.
-
-Training status: succeeded.
-
+Person group: dbd92bf0-8b74-43fc-a27a-b127c1bb1b66
+face 1d09b50e-0fb6-430c-a47c-9bb235761c17 added to person ea92a5d5-5250-44db-88fb-3b32e1a1ecaf
+face 74e1807a-6c86-4c74-b497-a3bcdda8c631 added to person ea92a5d5-5250-44db-88fb-3b32e1a1ecaf
+face 512cc8ff-e18a-4702-9413-3c83af9a0915 added to person f03219b3-c2dc-4ad6-b00b-bd71792686ac
+face 899bbe8e-2d03-4941-8221-d087911df21b added to person f03219b3-c2dc-4ad6-b00b-bd71792686ac
+face dfc0d142-36b0-4d90-982b-b51570ead5a8 added to person 8697d263-be7b-4d78-ba40-b55305dbbeb6
+face 29939a66-9da2-46f2-b572-abbe4e0d754a added to person 8697d263-be7b-4d78-ba40-b55305dbbeb6
+Train the person group dbd92bf0-8b74-43fc-a27a-b127c1bb1b66
+The person group dbd92bf0-8b74-43fc-a27a-b127c1bb1b66 is trained successfully.
 Pausing for 10 seconds to avoid triggering rate limit on free account...
 Identifying faces in image
-Person for face ID 40582995-d3a8-41c4-a9d1-d17ae6b46c5c is identified in image, with a confidence of 0.96725.
-Person for face ID 7a0368a2-332c-4e7a-81c4-2db3d74c78c5 is identified in image, with a confidence of 0.96921.
-No person identified for face ID c4a3dd28-ef2d-457e-81d1-a447344242c4 in image.
-Person for face ID 360edf1a-1e8f-402d-aa96-1734d0c21c1c is identified in image, with a confidence of 0.92886.
+Person is identified for face ID 5779a986-238c-499d-b22a-d2a7cec92e88 in image, with a confidence of 0.96725.
+verification result: True. confidence: 0.96725
+Person is identified for face ID a28a4997-600e-4595-be39-d7a7d0f8afc8 in image, with a confidence of 0.96921.
+verification result: True. confidence: 0.96921
+No person identified for face ID 02a56d35-f3a4-43eb-a295-f23a1b772de9 in image.
+Person is identified for face ID 5de2019a-c4d3-4021-b8d0-9a3b86adceb7 in image, with a confidence of 0.92886.
+verification result: True. confidence: 0.92886
+
+The person group dbd92bf0-8b74-43fc-a27a-b127c1bb1b66 is deleted.
+
+End of quickstart.
 ```

@@ -95,10 +98,6 @@ If you want to clean up and remove an Azure AI services subscription, you can de
 * [Portal](../../../multi-service-resource.md?pivots=azportal#clean-up-resources)
 * [Azure CLI](../../../multi-service-resource.md?pivots=azcli#clean-up-resources)

-To delete the **PersonGroup** you created in this quickstart, run the following code in your script:
-
-[!code-python[](~/cognitive-services-quickstart-code/python/Face/FaceQuickstart.py?name=snippet_deletegroup)]
-
 ## Next steps

 In this quickstart, you learned how to use the Face client library for Python to do basic face identification. Next, learn about the different face detection models and how to specify the right model for your use case.
@@ -107,4 +106,4 @@ In this quickstart, you learned how to use the Face client library for Python to
 > [Specify a face detection model version](../../how-to/specify-detection-model.md)

 * [What is the Face service?](../../overview-identity.md)
-* More extensive sample code can be found on [GitHub](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/python/Face/FaceQuickstart.py).
+* More extensive sample code can be found on [GitHub](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/face/azure-ai-vision-face/samples).

articles/ai-services/computer-vision/overview-identity.md

Lines changed: 1 addition & 0 deletions

@@ -91,6 +91,7 @@ Concepts
 Face liveness SDK reference docs:
 - [Java (Android)](https://aka.ms/liveness-sdk-java)
 - [Swift (iOS)](https://aka.ms/liveness-sdk-ios)
+- [JavaScript (Web)](https://aka.ms/liveness-sdk-web)

 ## Face recognition

articles/ai-services/computer-vision/toc.yml

Lines changed: 2 additions & 0 deletions

@@ -378,6 +378,8 @@ items:
 href: https://aka.ms/azure-ai-vision-face-liveness-client-sdk-android-api-reference
 - name: Swift (iOS)
 href: https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-api-reference
+- name: JavaScript (Web)
+href: https://aka.ms/azure-ai-vision-face-liveness-client-sdk-web-api-reference
 - name: Video Analysis
 items:
 - name: Video Analysis overview
