articles/ai-services/computer-vision/Tutorials/liveness.md
- Once you have your Azure subscription, <ahref="https://portal.azure.com/#create/Microsoft.CognitiveServicesFace"title="Create a Face resource"target="_blank">create a Face resource</a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.
- You need the key and endpoint from the resource you create to connect your application to the Face service. You'll paste your key and endpoint into the code later in the quickstart.
- You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
- Access to the Azure AI Vision Face Client SDK for mobile (iOS and Android) and web. To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.
## Perform liveness detection
The liveness solution integration involves two different components: a frontend mobile/web application and an app server/orchestrator.
### Integrate liveness into mobile application
Once you have access to the SDK, follow the instructions in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile or web application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications, and JavaScript for web applications:
- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
- For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
- For JavaScript Web, follow the instructions in the [Web sample](https://aka.ms/liveness-sample-web)
Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
The high-level steps involved in liveness orchestration are illustrated below:
:::image type="content" source="../media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="../media/liveness/liveness-diagram.jpg":::
1. The frontend application starts the liveness check and notifies the app server.
1. The app server creates a new liveness session with Azure AI Face Service. The service creates a liveness session and responds with a session-authorization-token. For more information about each request parameter involved in creating a liveness session, see [Liveness Create Session Operation](https://aka.ms/face-api-reference-createlivenesssession).
```json
Request:
}
```
1. The app server provides the session-authorization-token back to the frontend application.
1. The frontend application provides the session-authorization-token during the Azure AI Vision SDK’s initialization.
1. The SDK then starts the camera, guides the user to position correctly and then prepares the payload to call the liveness detection service endpoint.
1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the frontend application that the liveness check has been completed.
1. The frontend application relays the liveness check completion to the app server.
1. The app server can now query for the liveness detection result from the Azure AI Vision Face service.
      "width": 409,
      "height": 395
    },
    "fileName": "content.bin",
    "timeOffsetWithinFile": 0,
    "imageType": "Color"
  },
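The app server's side of the orchestration above (creating the session in step 2, handing only the token to the frontend, and querying the result at the end) can be sketched as follows. This is a minimal illustration, not an official sample: the REST path, API version, and payload fields are assumptions inferred from this article, so verify them against the Liveness Create Session reference before relying on them. The helpers only build request descriptions, which keeps the flow testable without a live Face resource.

```python
# Hypothetical sketch of the app-server side of liveness orchestration.
# The URL paths and payload fields below are illustrative assumptions,
# not an authoritative copy of the Face API surface.

def build_create_session_request(endpoint: str, api_key: str,
                                 device_correlation_id: str) -> dict:
    """Describe the 'create liveness session' call (step 2)."""
    return {
        "method": "POST",
        "url": f"{endpoint}/face/v1.1-preview.1/detectLiveness/singleModal/sessions",
        "headers": {
            "Ocp-Apim-Subscription-Key": api_key,  # key from your Face resource
            "Content-Type": "application/json",
        },
        "json": {
            # Assumed parameters; see the Create Session reference for the real list.
            "livenessOperationMode": "Passive",
            "deviceCorrelationId": device_correlation_id,
        },
    }


def build_get_result_request(endpoint: str, api_key: str, session_id: str) -> dict:
    """Describe the 'query liveness result' call (final step)."""
    return {
        "method": "GET",
        "url": f"{endpoint}/face/v1.1-preview.1/detectLiveness/singleModal/sessions/{session_id}",
        "headers": {"Ocp-Apim-Subscription-Key": api_key},
    }
```

An app server would send these requests with any HTTP client, return only the session-authorization-token from the create response to the frontend application (never the API key), and fetch the session result by its id once the frontend reports completion.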
Use the following tips to ensure that your input images give the most accurate results.
The high-level steps involved in liveness with verification orchestration are illustrated below:
1. Provide the verification reference image by either of the following two methods:
- The app server provides the reference image when creating the liveness session. For more information about each request parameter involved in creating a liveness session with verification, see [Liveness With Verify Create Session Operation](https://aka.ms/face-api-reference-createlivenesswithverifysession).
```json
Request:
```
- The mobile application provides the reference image when initializing the SDK. This scenario isn't supported in the web solution.
```kotlin
val singleFaceImageSource = VisionSource.fromFile("/path/to/image.jpg")
```
articles/ai-services/disable-local-auth.md
You can use PowerShell to determine whether the local authentication policy is currently enabled.
## Re-enable local authentication
To enable local authentication, execute the PowerShell cmdlet **[Set-AzCognitiveServicesAccount](/powershell/module/az.cognitiveservices/set-azcognitiveservicesaccount)** with the parameter `-DisableLocalAuth $false`. Allow a few minutes for the service to accept the change to allow local authentication requests.
## Next steps
- [Authenticate requests to Azure AI services](./authentication.md)
> * There are separate URLs for Document Intelligence Studio sovereign cloud regions.
> * Azure for US Government: [Document Intelligence Studio (Azure Fairfax cloud)](https://formrecognizer.appliedai.azure.us/studio)
> * Microsoft Azure operated by 21Vianet: [Document Intelligence Studio (Azure in China)](https://formrecognizer.appliedai.azure.cn/studio)
[Document Intelligence Studio](https://documentintelligence.ai.azure.com/) is an online tool for visually exploring, understanding, and integrating features from the Document Intelligence service into your applications. Use the Document Intelligence Studio to:
* Learn more about the different capabilities in Document Intelligence.
- Re-create a custom project with the migrated Document Intelligence resource and specify the same storage account.
  - question: |
      Are there separate URL endpoints for Document Intelligence sovereign cloud regions?
    answer: |
      Yes. Document Intelligence Studio has separate URL endpoints for sovereign cloud regions:

      - URL for the Azure US Government cloud (Azure Fairfax): [Document Intelligence Studio US Government](https://formrecognizer.appliedai.azure.us/studio)
      - URL for Microsoft Azure operated by 21Vianet (Azure in China): [Document Intelligence Studio China](https://formrecognizer.appliedai.azure.cn/studio)