The liveness solution integration involves two distinct components: a frontend mobile/web application and an app server/orchestrator.
:::image type="content" source="../media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="../media/liveness/liveness-diagram.jpg":::
- **Frontend application**: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end users accurately through the liveness detection process.
- **App server**: The app server serves as a backend server to create liveness detection sessions and obtain an authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, grant authorization to the frontend application, and view the results of the liveness detection process.
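This split of responsibilities can be sketched with a toy in-memory example (all class and method names here are hypothetical, for illustration only; the real calls come from the Face service and the client SDKs described later in this tutorial). The key point is that the API key stays on the app server, and the frontend receives only a short-lived session token:

```python
import secrets


class AppServer:
    """Backend orchestrator: holds the Face API key; the frontend never sees it."""

    def __init__(self, face_api_key: str):
        self._face_api_key = face_api_key  # stays server-side
        self._sessions = {}  # session_id -> session state

    def create_liveness_session(self) -> dict:
        # In the real flow this call goes to the Face service, which returns
        # a session ID and a short-lived authorization token.
        session_id = secrets.token_hex(8)
        auth_token = secrets.token_hex(16)
        self._sessions[session_id] = {"auth_token": auth_token, "result": None}
        # Only the session ID and token are handed to the frontend.
        return {"session_id": session_id, "auth_token": auth_token}

    def get_result(self, session_id: str):
        # Only the app server may query the outcome of the check.
        return self._sessions[session_id]["result"]


class FrontendApp:
    """Mobile/web client: drives the camera flow using only the session token."""

    def __init__(self, session_id: str, auth_token: str):
        self.session_id = session_id
        self.auth_token = auth_token

    def run_liveness_check(self) -> str:
        # The real SDK starts the camera, guides the user, and uploads the
        # liveness payload to the Face service using self.auth_token.
        return "payload-uploaded"


server = AppServer(face_api_key="<server-side secret>")
grant = server.create_liveness_session()
client = FrontendApp(grant["session_id"], grant["auth_token"])
print(client.run_liveness_check())  # prints "payload-uploaded"
```

Note that `create_liveness_session` returns the token but not the API key, which is the property the real token handoff is designed to guarantee.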
Additionally, we combine face verification with liveness detection to verify whether the person is the specific person you designated.

| Scenario | Description |
| -- | -- |
| Liveness detection | Determines whether an input is real or fake. Only the app server has the authority to start the liveness check and query the result. |
| Liveness detection with face verification | Determines whether an input is real or fake and verifies the identity of the person based on a reference image you provided. Either the app server or the frontend application can provide the reference image. Only the app server has the authority to initiate the liveness check and query the result. |
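In the second scenario, the final accept/reject decision combines both signals. A toy helper shows the shape of that combination (the decision strings, parameter names, and the 0.5 threshold here are illustrative assumptions, not values defined by the Face API):

```python
from typing import Optional


def accept_user(liveness_decision: str,
                verify_is_identical: Optional[bool] = None,
                verify_confidence: Optional[float] = None,
                confidence_threshold: float = 0.5) -> bool:
    """Combine liveness and (optional) verification into one decision.

    liveness_decision: e.g. "realface" or "spoofface" (illustrative values).
    verify_*: present only in the liveness-with-verification scenario.
    """
    if liveness_decision != "realface":
        return False  # fake input: reject regardless of identity match
    if verify_is_identical is None:
        return True  # liveness-only scenario
    return verify_is_identical and (verify_confidence or 0.0) >= confidence_threshold


print(accept_user("realface", verify_is_identical=True, verify_confidence=0.8))  # True
print(accept_user("spoofface"))  # False
```

Rejecting on liveness first means a spoofed input is refused even when the face matches the reference image.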
## Get started
This tutorial demonstrates how to operate a frontend application and an app server to perform [liveness detection](#perform-liveness-detection) and [liveness detection with face verification](#perform-liveness-detection-with-face-verification) across various language SDKs.
### Prerequisites
- Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services/)
- Your Azure account must have a **Cognitive Services Contributor** role assigned in order for you to agree to the responsible AI terms and create a resource. To get this role assigned to your account, follow the steps in the [Assign roles](/azure/role-based-access-control/role-assignments-steps) documentation, or contact your administrator.
- You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
- Access to the Azure AI Vision Face Client SDK for mobile (iOS and Android) and web. To get started, you need to apply for the [Face Recognition Limited Access features](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUQjA5SkYzNDM4TkcwQzNEOE1NVEdKUUlRRCQlQCN0PWcu) to get access to the SDK. For more information, see the [Face Limited Access](/legal/cognitive-services/computer-vision/limited-access-identity?context=%2Fazure%2Fcognitive-services%2Fcomputer-vision%2Fcontext%2Fcontext) page.
### Set up frontend applications and app servers to perform liveness detection
We provide SDKs in different languages for frontend applications and app servers. They're available in these languages:
| Kotlin |-|✔|
| Swift |-|✔|
#### Integrate liveness into mobile application
Once you have access to the SDK, follow the instructions in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications, and JavaScript for web applications:
- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
#### Download Azure AI Face client library for an app server
The app server/orchestrator is responsible for controlling the lifecycle of a liveness session. The app server has to create a session before performing liveness detection, and then it can query the result and delete the session when the liveness check is finished. We offer client libraries in several languages to make implementing your app server easier. Follow these steps to install the package you want:
- For C#, follow the instructions in the [dotnet readme](https://github.com/Azure/azure-sdk-for-net/tree/main/sdk/face/Azure.AI.Vision.Face/README.md)
- For Java, follow the instructions in the [Java readme](https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/face/azure-ai-vision-face/README.md)
- For Python, follow the instructions in the [Python readme](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/face/azure-ai-vision-face/README.md)
- For JavaScript, follow the instructions in the [JavaScript readme](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/face/ai-vision-face-rest/README.md)
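Whichever library you pick, the lifecycle the app server drives is the same: create a session, let the frontend run the check, query the result, then delete the session. The steps can be sketched with a toy in-memory store (all names and the "realface" decision string are illustrative assumptions, not the actual client library API):

```python
from enum import Enum


class SessionStatus(Enum):
    NOT_STARTED = "NotStarted"
    RESULT_AVAILABLE = "ResultAvailable"


class LivenessSessionStore:
    """Toy stand-in for the Face service's session store (hypothetical names)."""

    def __init__(self):
        self._sessions = {}
        self._next_id = 0

    def create(self) -> str:
        # Step 1: the app server creates a session before any liveness check runs.
        self._next_id += 1
        session_id = f"session-{self._next_id}"
        self._sessions[session_id] = {"status": SessionStatus.NOT_STARTED,
                                      "decision": None}
        return session_id

    def record_decision(self, session_id: str, decision: str) -> None:
        # Step 2: the frontend's check completes and a decision is recorded.
        self._sessions[session_id] = {"status": SessionStatus.RESULT_AVAILABLE,
                                      "decision": decision}

    def query(self, session_id: str) -> dict:
        # Step 3: only the app server queries the result.
        return self._sessions[session_id]

    def delete(self, session_id: str) -> None:
        # Step 4: delete the session once the liveness check is finished.
        del self._sessions[session_id]


store = LivenessSessionStore()
sid = store.create()
store.record_decision(sid, "realface")
print(store.query(sid)["decision"])  # prints "realface"
store.delete(sid)
```

Deleting the session at the end keeps tokens single-use, so a captured token can't be replayed for a later check.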
There are two parts to integrating liveness with verification:
:::image type="content" source="../media/liveness/liveness-verify-diagram.jpg" alt-text="Diagram of the liveness-with-face-verification workflow of Azure AI Face." lightbox="../media/liveness/liveness-verify-diagram.jpg":::