articles/ai-services/computer-vision/Tutorials/liveness.md
@@ -12,9 +12,9 @@ ms.date: 11/06/2023
# Tutorial: Detect liveness in faces
Face Liveness detection can be used to determine if a face in an input video stream is real (live) or fake (spoofed). It's an important building block in a biometric authentication system to prevent imposters from gaining access to the system using a photograph, video, mask, or other means to impersonate another person.
The goal of liveness detection is to ensure that the system is interacting with a physically present live person at the time of authentication. Such systems are increasingly important with the rise of digital finance, remote access control, and online identity verification processes.
The Azure AI Face liveness detection solution successfully defends against various spoof types, including paper printouts, 2D/3D masks, and spoof presentations on phones and laptops. Liveness detection is an active area of research, and continuous improvements are rolled out to the client and service components over time to counteract increasingly sophisticated spoofing attacks.
@@ -34,7 +34,7 @@ The liveness solution integration involves two distinct components: a frontend m
Additionally, we combine face verification with liveness detection to verify whether the person is the specific person you designated. The following table describes the liveness detection features:
| Feature | Description |
| -- | -- |
| Liveness detection | Determines whether an input is real or fake. Only the app server has the authority to start the liveness check and query the result. |
| Liveness detection with face verification | Determines whether an input is real or fake, and verifies the identity of the person based on a reference image you provide. Either the app server or the frontend application can provide the reference image, but only the app server has the authority to initiate the liveness check and query the result. |
@@ -52,14 +52,14 @@ This tutorial demonstrates how to operate a frontend application and an app serv
## Set up frontend applications and app servers to perform liveness detection
We provide SDKs in different languages for frontend applications and app servers. See the following instructions to set up your frontend applications and app servers.
Once you have access to the SDK, follow the instructions in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications, and JavaScript for web applications:
- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
- For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
- For JavaScript Web, follow the instructions in the [Web sample](https://aka.ms/liveness-sample-web)
Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user in adjusting their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
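The sequence the frontend SDK and the app server carry out can be sketched with stubs. The function names below are hypothetical, and the `"realface"`/`"spoofface"` decision strings are illustrative placeholders; the point is the division of authority — the app server starts the session and reads the result, while the frontend only performs the capture.

```python
def app_server_create_session() -> str:
    """Stub: the app server asks the Face service for a liveness session and
    receives a short-lived authorization token to hand to the frontend."""
    return "session-authorization-token"

def frontend_run_liveness(session_token: str) -> bool:
    """Stub: the frontend SDK uses the token to start the camera, guide the
    user, compose the liveness payload, and send it to the service."""
    return bool(session_token)  # the frontend never learns the final decision

def app_server_query_result(session_completed: bool) -> str:
    """Stub: only the app server queries the session for the outcome."""
    return "realface" if session_completed else "spoofface"  # placeholder values

token = app_server_create_session()
done = frontend_run_liveness(token)
print(app_server_query_result(done))  # -> realface
```

Keeping the decision on the app server, as the tutorial's table requires, means a compromised client cannot forge or read a liveness verdict.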
### Download Azure AI Face client library for an app server
@@ -226,7 +226,7 @@ The high-level steps involved in liveness orchestration are illustrated below: