This article explains the concept of Face liveness detection, its input and output schema, and related concepts.
## Introduction
Face Liveness detection is used to determine if a face in an input video stream is real (live) or fake (spoofed). It's an important building block in a biometric authentication system to prevent imposters from gaining access to the system using a photograph, video, mask, or other means to impersonate another person.
## Output format
The liveness detection API returns a JSON object with the following information:

- The liveness classification result, indicating whether the user is real or a spoof.
- Optionally, a face verification result can be obtained if the liveness check is performed with verification (see [Perform liveness detection with face verification](#perform-liveness-detection-with-face-verification)).
- A quality-filtered "session image" that you can store for auditing, human review, or further analysis using the Face service APIs.
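As a minimal sketch, a client might consume a result shaped like the list above. The field names used here (`livenessDecision`, `verifyResult`, `sessionImageId`) are hypothetical illustrations, not the documented schema; consult the Face API reference for the actual response shape.

```python
# Hypothetical sketch of consuming a liveness-session result.
# All field names are illustrative only, not the actual Face API schema.
import json

def summarize_liveness_result(raw: str) -> str:
    """Return a short human-readable summary of a liveness session result."""
    result = json.loads(raw)
    # Liveness classification, e.g. real vs. spoof (hypothetical field name).
    decision = result.get("livenessDecision", "unknown")
    # Verification result is present only in the liveness-with-verify scenario.
    verify = result.get("verifyResult")
    summary = f"liveness: {decision}"
    if verify is not None:
        summary += f", verified: {verify.get('isIdentical')}"
    return summary

# Example with a fabricated payload:
sample = json.dumps({
    "livenessDecision": "realface",
    "verifyResult": {"isIdentical": True},
    "sessionImageId": "abc123",
})
print(summarize_liveness_result(sample))  # liveness: realface, verified: True
```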
### Data privacy
We do not store any images or videos from the Face Liveness Check. No image or video data is retained in the liveness service after the liveness session has concluded. Moreover, the image or video uploaded during the liveness check is used only to perform the liveness classification, that is, to determine whether the user is real or a spoof (and optionally to perform a match against a reference image in the liveness-with-verify scenario). It cannot be viewed by any human and will not be used for any AI model improvements.
## Security
We include additional runtime application self-protections (RASP), provided by [GuardSquare](https://www.guardsquare.com/blog/why-guardsquare), in our Mobile SDKs (iOS and Android).
## Support options
In addition to using the main [Azure AI services support options](../../cognitive-services-support-options.md), you can also post your questions in the [issues](https://github.com/Azure-Samples/azure-ai-vision-sdk/issues) section of the SDK repo.
## Next step
Now that you're familiar with liveness detection concepts, implement liveness detection in your app.
For more information on these operations, see [Face detection concepts](../concept-face-detection.md) and [Face recognition concepts](../concept-face-recognition.md).