
Commit 676da06

drafting - add all faq content
1 parent 41deb4a commit 676da06

2 files changed (+66, −27 lines)


articles/ai-services/computer-vision/concept-face-liveness-detection.md

Lines changed: 37 additions & 7 deletions
@@ -16,25 +16,46 @@ feedback_help_link_url: https://learn.microsoft.com/answers/tags/156/azure-face
# Face liveness detection

-This article explains the concept of Face liveness detection, its input and output schema, and related concepts.
+This article explains the concept of Face liveness detection, its input and output schema, and related concepts.
+
+## What it does
+
+Face liveness detection is used to determine whether a face in an input video stream is real (live) or fake (spoofed). It's an important building block in a biometric authentication system, preventing imposters from gaining access by using a photograph, video, mask, or other means to impersonate another person.
+
+The goal of liveness detection is to ensure that the system is interacting with a physically present, live person at the time of authentication. Such systems are increasingly important with the rise of digital finance, remote access control, and online identity verification.
+
+The Azure AI Face liveness detection solution defends against a variety of spoof types, ranging from paper printouts and 2D/3D masks to spoof presentations on phones and laptops. Liveness detection is an active area of research, and improvements are continuously rolled out to both the client and service components as the overall solution becomes more robust to new and increasingly sophisticated attacks.
+
+The Azure AI Face liveness detection API is [conformant to ISO/IEC 30107-3 PAD (Presentation Attack Detection) standards](https://www.ibeta.com/wp-content/uploads/2023/11/230622-Microsoft-PAD-Level-2-Confirmation-Letter.pdf), as validated by iBeta level 1 and level 2 conformance testing.

## How it works

-TBD
+The liveness solution integration involves two distinct components: a frontend mobile/web application and an app server/orchestrator.
+
+:::image type="content" source="./media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="./media/liveness/liveness-diagram.jpg":::

-The Face Liveness Check is conformant to ISO/IEC 30107-3 PAD (Presentation Attack Detection) standards as validated by iBeta level 1 and level 2 conformance testing; the report is [here](https://www.ibeta.com/wp-content/uploads/2023/11/230622-Microsoft-PAD-Level-2-Confirmation-Letter.pdf).
+- **Frontend application**: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end users accurately through the liveness detection process.
+- **App server**: The app server is a backend server that creates liveness detection sessions and obtains an authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, grant authorization to the frontend application, and view the results of the liveness detection process.
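+As a rough sketch of this split of responsibilities, the app server could create the session and hand only the short-lived token to the frontend, as below. The `create_liveness_session` helper and the field names are illustrative assumptions, not the exact Face SDK surface; check the client-library reference for the real calls.
+
+```python
+# Illustrative sketch of the app-server role: it holds the Face API key,
+# creates the session, and returns only the short-lived auth token to the
+# frontend. `create_liveness_session` is a hypothetical stand-in for the
+# real Face client-library call.
+
+def create_liveness_session(operation_mode: str, device_correlation_id: str) -> dict:
+    # A real app server would call the Face service here with its API key;
+    # this canned response only mimics the shape of the result.
+    return {
+        "sessionId": "session-0001",
+        "authToken": "eyJ...token-for-this-session-only",
+    }
+
+def start_liveness_for_client(device_correlation_id: str) -> dict:
+    """What the app server returns to the frontend application."""
+    session = create_liveness_session("PassiveActive", device_correlation_id)
+    # The frontend only ever sees the session token, never the Face API key.
+    return {"authToken": session["authToken"]}
+
+response = start_liveness_for_client("device-1234")
+print(response["authToken"])
+```
+
+The key design point, per the description above, is that session creation and result queries stay on the app server; the frontend receives only the per-session token.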

## Liveness detection modes

-Azure Face liveness detection API includes the options for both passive and passive-active detection modes.
+The Azure AI Face liveness detection API includes options for both Passive and Passive-Active detection modes.
-The **Passive mode** requires a non-bright lighting environment to succeed and will fail in bright lighting environments with an "Environment not supported" error. This mode can be chosen if you prefer minimal end-user interaction and expect end-users to primarily be in non-bright environments. This mode utilizes a passive liveness technique that requires no additional actions from the user. It also requires high screen brightness for optimal performance which is configured automatically in the Mobile (iOS and Android) solutions.
+**Passive mode** uses a passive liveness technique that requires no additional actions from the user. It requires a non-bright lighting environment to succeed and fails in bright lighting environments with an "Environment not supported" error. It also requires high screen brightness for optimal performance, which is configured automatically in the mobile (iOS and Android) solutions. Choose this mode if you prefer minimal end-user interaction and expect end users to be primarily in non-bright environments. A Passive mode check takes around 12 seconds on average to complete.
-The **Passive-Active mode** will still behave the same as the Passive mode in non-bright lighting environments and only trigger the Active mode in bright lighting environments. This mode can be chosen if you want the liveness-check to work in any lighting environment. This mode is preferable on Web browser solutions due to the lack of automatic screen brightness control available on browsers which hinders the Passive mode's operational envelope.
+**Passive-Active mode** behaves the same as Passive mode in non-bright lighting environments and triggers Active mode only in bright lighting environments. This mode is preferable in web browser solutions, because browsers lack the automatic screen brightness control that limits Passive mode's operational envelope. Choose this mode if you want the liveness check to work in any lighting environment. If the Active check is triggered by a bright lighting environment, the total completion time can take up to 20 seconds on average.
-This setting can be set during the Session-Creation step (see step 2 of [Perform liveness detection](#perform-liveness-detection)).
+You can set the detection mode during the session creation step (see [Perform liveness detection](./tutorials/liveness.md#perform-liveness-detection)).
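+For illustration, a session-creation body that selects the mode might be built as in the sketch below. The `livenessOperationMode` values mirror the two mode names above, but the exact field names and accepted values are assumptions to verify against the session REST API reference.
+
+```python
+import json
+
+def build_session_body(mode: str, device_correlation_id: str) -> str:
+    """Build an illustrative session-creation JSON body that selects a mode."""
+    # "Passive" and "PassiveActive" mirror the two modes described above;
+    # verify the exact accepted values against the REST reference.
+    if mode not in ("Passive", "PassiveActive"):
+        raise ValueError(f"unknown liveness mode: {mode}")
+    return json.dumps({
+        "livenessOperationMode": mode,
+        "deviceCorrelationId": device_correlation_id,
+    })
+
+body = build_session_body("PassiveActive", "device-1234")
+print(body)
+```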

+
+## Optional face verification
+
+You can combine face verification with liveness detection to verify that the face in question belongs to the designated person. The following table describes details of the liveness detection features:
+
+| Feature | Description |
+| -- |--|
+| Liveness detection | Determine whether an input is real or fake. Only the app server has the authority to start the liveness check and query the result. |
+| Liveness detection with face verification | Determine whether an input is real or fake, and verify the identity of the person based on a reference image you provide. Either the app server or the frontend application can provide the reference image. Only the app server has the authority to initiate the liveness check and query the result. |

## Input requirements

@@ -48,13 +69,22 @@ TBD

We do not store any images or videos from the Face Liveness Check. No image/video data is stored in the liveness service after the liveness session has been concluded. Moreover, the image/video uploaded during the liveness check is only used to perform the liveness classification to determine if the user is real or a spoof (and optionally to perform a match against a reference image in the liveness-with-verify-scenario), and it cannot be viewed by any human and will not be used for any AI model improvements.

+
+#### Do you include any runtime application self-protections (RASP)?
+
+Yes, we include additional RASP protections in our mobile SDKs (iOS and Android), provided by [GuardSquare](https://www.guardsquare.com/blog/why-guardsquare).
+
## Output format

The liveness detection API returns a JSON object with the following information:
- A Real or a Spoof Face Liveness Decision. We handle the underlying accuracy and thresholding, so you don’t have to worry about interpreting “confidence scores” or making inferences yourself. This makes integration easier and more seamless for developers.
- Optionally a Face Verification result can be obtained if the liveness check is performed with verification (see [Perform liveness detection with face verification](#perform-liveness-detection-with-face-verification)).
- A quality filtered "session-image" that can be used to store for auditing purposes or for human review or to perform further analysis using the Face service APIs.
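+As an illustration of consuming this result on the app server, the sketch below reads the decision from a response shaped like the output described above. The field names (`livenessDecision`, `verifyResult`, `sessionImageId`) and the decision values are assumptions to verify against the REST reference.
+
+```python
+# Hypothetical result payload shaped like the output described above.
+result = {
+    "livenessDecision": "realface",          # assumed value; or "spoofface"
+    "verifyResult": {"isIdentical": True},   # present only with verification
+    "sessionImageId": "img-0001",            # usable for auditing or further analysis
+}
+
+def is_live(result: dict) -> bool:
+    """Treat anything other than an explicit real-face decision as a failure."""
+    return result.get("livenessDecision") == "realface"
+
+print(is_live(result))
+```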

+
+## Support options
+
+In addition to using the main [Azure AI services support options](../../cognitive-services-support-options.md), you can also post your questions in the [issues](https://github.com/Azure-Samples/azure-ai-vision-sdk/issues) section of the SDK repo.
+

## Next steps

articles/ai-services/computer-vision/tutorials/liveness.md

Lines changed: 29 additions & 20 deletions
@@ -14,31 +14,15 @@ feedback_help_link_url: https://learn.microsoft.com/answers/tags/156/azure-face
# Tutorial: Detect liveness in faces

-Face Liveness detection is used to determine if a face in an input video stream is real (live) or fake (spoofed). It's an important building block in a biometric authentication system to prevent imposters from gaining access to the system using a photograph, video, mask, or other means to impersonate another person.
-
-The goal of liveness detection is to ensure that the system is interacting with a physically present, live person at the time of authentication. These systems are increasingly important with the rise of digital finance, remote access control, and online identity verification processes.
-
-The Azure AI Face liveness detection solution successfully defends against various spoof types ranging from paper printouts, 2D/3D masks, and spoof presentations on phones and laptops. Liveness detection is an active area of research, with continuous improvements being made to counteract increasingly sophisticated spoofing attacks. Continuous improvements are rolled out to the client and the service components over time as the overall solution gets more robust to new types of attacks.
+In this tutorial, you learn how to detect liveness in faces by using a combination of server-side code and a client-side mobile application. For general information about face liveness detection, see the [conceptual guide](../concept-face-liveness-detection.md).

[!INCLUDE [liveness-sdk-gate](../includes/liveness-sdk-gate.md)]

-## Introduction
-
-The liveness solution integration involves two distinct components: a frontend mobile/web application and an app server/orchestrator.
-
-:::image type="content" source="../media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="../media/liveness/liveness-diagram.jpg":::
-
-- **Frontend application**: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end-users accurately through the liveness detection process.
-- **App server**: The app server serves as a backend server to create liveness detection sessions and obtain an authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, to grant authorization for frontend application, and to view the results of the liveness detection process.
-
-Additionally, we combine face verification with liveness detection to verify whether the person is the specific person you designated. The following table describes details of the liveness detection features:
+This tutorial demonstrates how to operate a frontend application and an app server to perform [liveness detection](#perform-liveness-detection), including the optional step of [face verification](#perform-liveness-detection-with-face-verification), across various language SDKs.

-| Feature | Description |
-| -- |--|
-| Liveness detection | Determine an input is real or fake, and only the app server has the authority to start the liveness check and query the result. |
-| Liveness detection with face verification | Determine an input is real or fake and verify the identity of the person based on a reference image you provided. Either the app server or the frontend application can provide a reference image. Only the app server has the authority to initial the liveness check and query the result. |

-This tutorial demonstrates how to operate a frontend application and an app server to perform [liveness detection](#perform-liveness-detection) and [liveness detection with face verification](#perform-liveness-detection-with-face-verification) across various language SDKs.
+> [!TIP]
+> After you complete the prerequisites, you can get started faster by building and running a complete frontend sample (on iOS, Android, or the web) from the [SDK samples folder](https://github.com/Azure-Samples/azure-ai-vision-sdk/tree/main/samples).

## Prerequisites

@@ -62,6 +46,11 @@ Once you have access to the SDK, follow instructions in the [azure-ai-vision-sdk

Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user in adjusting their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.

+> [!TIP]
+> SDK versions
+>
+> You can monitor the [Releases section](https://github.com/Azure-Samples/azure-ai-vision-sdk/releases) of the SDK repo for new SDK version updates.
+
### Download Azure AI Face client library for app server

The app server/orchestrator is responsible for controlling the lifecycle of a liveness session. The app server has to create a session before performing liveness detection, and then it can query the result and delete the session when the liveness check is finished. We offer a library in various languages for easily implementing your app server. Follow these steps to install the package you want:
@@ -802,13 +791,32 @@ The high-level steps involved in liveness with verification orchestration are il

---

+## (Optional) Perform other face operations after the liveness check
+
+Optionally, you can perform further face analysis (for example, age estimation) and face identity operations after the liveness check:
+
+- To enable this, set the "enableSessionImage" parameter to "true" during the session creation step (see step 2 of [Perform liveness detection](#perform-liveness-detection)).
+- After the session completes, extract the "sessionImageId" from the session get-result step (see step 8 of [Perform liveness detection](#perform-liveness-detection)).
+- You can then either download the session image (see the [Liveness Get Session Image operation](/rest/api/face/liveness-session-operations/get-session-image)), or provide the "sessionImageId" in the [/detect](/rest/api/face/face-detection-operations/detect-from-session-image-id) operation to perform other face analysis or face identity operations (see [Face detection](../concept-face-detection.md) and [Face recognition](../concept-face-recognition.md)).
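+The steps above can be sketched on the app server side as follows. Only the `enableSessionImage` and `sessionImageId` names come from the steps above; the endpoint placeholder and URL path are illustrative assumptions to check against the Face REST reference.
+
+```python
+# Sketch of wiring the steps together: opt in at session creation, read the
+# session image ID from the result, then reference it in a later detect call
+# instead of re-uploading the image. The URL path is an assumption.
+FACE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
+
+def session_create_body() -> dict:
+    # Step 1: opt in to session-image capture at session creation.
+    return {"livenessOperationMode": "Passive", "enableSessionImage": True}
+
+def detect_request(session_result: dict) -> tuple:
+    # Step 2: read the sessionImageId from the session result.
+    image_id = session_result["sessionImageId"]
+    # Step 3: reference it in a subsequent detect call.
+    url = f"{FACE_ENDPOINT}/face/detect"  # illustrative path, not the real route
+    return url, {"sessionImageId": image_id}
+
+url, params = detect_request({"sessionImageId": "img-0001"})
+print(url, params)
+```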
+
+
## Clean up resources

If you want to clean up and remove an Azure AI services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it.

* [Azure portal](../../multi-service-resource.md?pivots=azportal#clean-up-resources)
* [Azure CLI](../../multi-service-resource.md?pivots=azcli#clean-up-resources)

+## Support options
+
+In addition to using the main [Azure AI services support options](../../cognitive-services-support-options.md), you can also post your questions in the [issues](https://github.com/Azure-Samples/azure-ai-vision-sdk/issues) section of the SDK repo.
+

## Related content

To learn about other options in the liveness APIs, see the Azure AI Vision SDK reference.
@@ -820,3 +828,4 @@ To learn about other options in the liveness APIs, see the Azure AI Vision SDK r
To learn more about the features available to orchestrate the liveness solution, see the Session REST API reference.

- [Liveness Session Operations](/rest/api/face/liveness-session-operations)
+