
Commit 3890148

Merge pull request #5498 from JinyuID/main
move liveness quick link to tutorial and refine liveness concept page
2 parents 2099752 + e13fc74 commit 3890148

File tree

4 files changed

+10
-13
lines changed


articles/ai-services/computer-vision/concept-face-liveness-detection.md

Lines changed: 5 additions & 5 deletions
@@ -34,17 +34,17 @@ The liveness solution integration involves two distinct components: a frontend m
3434

3535
:::image type="content" source="./media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="./media/liveness/liveness-diagram.jpg":::
3636

37-
- **Frontend application**: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end-users accurately through the liveness detection process.
38-
- **App server**: The app server serves as a backend server to create liveness detection sessions and obtain an authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, to grant authorization for frontend application, and to view the results of the liveness detection process.
39-
37+
- **Orchestrate the Azure AI service in the app server**: The app server acts as a backend that creates liveness detection sessions and obtains a short-lived authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, grant authorization to the frontend application, and view the results of the liveness detection process.
38+
- **Integrate the Azure AI Vision SDK into the frontend application**: The frontend application should embed the Azure AI Vision Face SDK (iOS, Android, or JavaScript). The SDK opens the camera, guides the user through the passive or passive-active flow, encrypts video frames, and streams them, together with the short-lived liveness-session token received from your server, directly to the Azure AI Face endpoint.
39+
- **Optional quick link path**: You can avoid embedding the client SDK. The backend service can swap the same session token for a one-time Liveness Quick Link (`https://liveness.face.azure.com/?s=…`). Redirect the user to that URL, and Azure hosts the entire capture experience in the browser and then notifies your server of the completion through an optional callback. This option lowers integration cost and keeps you on Azure's always-up-to-date capture experience.
4040
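The app-server responsibility described above (create a session, hand only the short-lived token to the frontend) can be sketched as follows. This is a minimal sketch, not the official sample: the route, API version, and field names (`livenessOperationMode`, `deviceCorrelationId`) are assumptions to verify against the current Face API reference.

```python
import json

# Hypothetical resource endpoint -- replace with your Face resource's endpoint.
FACE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"

def build_session_request(mode: str = "PassiveActive",
                          device_id: str = "device-123"):
    """Build the URL and JSON body the app server would POST to create a
    liveness session. Route and field names are assumptions; check the
    current Face API reference for the exact version and schema."""
    url = FACE_ENDPOINT + "/face/v1.2/detectLiveness-sessions"
    body = {
        "livenessOperationMode": mode,     # "Passive" or "PassiveActive"
        "deviceCorrelationId": device_id,  # ties the session to one device
    }
    return url, json.dumps(body)

url, body = build_session_request()
# Send this with any HTTP client, authenticated with your Face API key.
# The response contains a short-lived authToken that the app server hands
# to the frontend; the frontend never sees the API key itself.
```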

4141
## Liveness detection modes
4242

4343
Azure Face liveness detection API includes options for both Passive and Passive-Active detection modes.
4444

45-
The **Passive mode** utilizes a passive liveness technique that requires no additional actions from the user. It requires a non-bright lighting environment to succeed and will fail in bright lighting environments with an "Environment not supported" error. It also requires high screen brightness for optimal performance which is configured automatically in the Mobile (iOS and Android) solutions. This mode can be chosen if you prefer minimal end-user interaction and expect end-users to primarily be in non-bright environments. A Passive mode check takes around 12 seconds on an average to complete.
45+
The **Passive mode** utilizes a passive liveness technique that requires no extra actions from the user. It requires a non-bright lighting environment to succeed and might fail in bright lighting environments with an "Environment not supported" error. It also requires high screen brightness for optimal performance, which is configured automatically in the mobile (iOS and Android) solutions. Choose this mode if you prefer minimal end-user interaction and expect end-users to primarily be in non-bright environments. A Passive mode check takes about 12 seconds on average to complete.
4646

47-
The **Passive-Active mode** will behave the same as the Passive mode in non-bright lighting environments and only trigger the Active mode in bright lighting environments. This mode is preferable on Web browser solutions due to the lack of automatic screen brightness control available on browsers which hinders the Passive mode's operational envelope. This mode can be chosen if you want the liveness-check to work in any lighting environment. If the Active check is triggered due to a bright lighting environment, then the total completion time may take up to 20 seconds on average.
47+
The **Passive-Active mode** behaves the same as the Passive mode in non-bright lighting environments and triggers the Active mode only in bright lighting environments. This mode is preferable in web browser solutions because browsers lack automatic screen brightness control, which limits the Passive mode's operational envelope. Choose this mode if you want the liveness check to work in any lighting environment. If the Active check is triggered by a bright lighting environment, the total completion time can take up to 20 seconds on average.
4848

4949
You can set the detection mode during the session creation step (see [Perform liveness detection](./tutorials/liveness.md#perform-liveness-detection)).
5050
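The guidance above reduces to a simple choice at session-creation time. A minimal sketch, assuming the mode values are spelled `Passive` and `PassiveActive` as in the Face API's `livenessOperationMode` setting (verify the exact names against the API reference):

```python
# Mode names assumed to match the Face API's livenessOperationMode values.
PASSIVE = "Passive"
PASSIVE_ACTIVE = "PassiveActive"

def choose_mode(expect_bright_environments: bool) -> str:
    """Pick Passive only when end-users are expected to be in non-bright
    environments; otherwise use Passive-Active, which also works in bright
    lighting at the cost of a longer check (up to ~20 s on average)."""
    return PASSIVE_ACTIVE if expect_bright_environments else PASSIVE
```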

-76.9 KB

articles/ai-services/computer-vision/toc.yml

Lines changed: 2 additions & 2 deletions
@@ -223,8 +223,6 @@ items:
223223
href: concept-face-liveness-detection.md
224224
- name: Face liveness abuse monitoring
225225
href: concept-liveness-abuse-monitoring.md
226-
- name: Face liveness quick link (preview)
227-
href: concept-face-liveness-quick-link.md
228226

229227
- name: How-to guides
230228
items:
@@ -262,6 +260,8 @@ items:
262260
href: Tutorials/liveness.md
263261
- name: Add users to a Face identification app
264262
href: Tutorials/build-enrollment-app.md
263+
- name: Face liveness quick link (preview)
264+
href: Tutorials/liveness-quick-link.md
265265
- name: Samples
266266
href: https://aka.ms/FaceSamples
267267

articles/ai-services/computer-vision/concept-face-liveness-quick-link.md renamed to articles/ai-services/computer-vision/tutorials/liveness-quick-link.md

Lines changed: 3 additions & 6 deletions
@@ -19,7 +19,7 @@ This article explains the concept of Face liveness quick link, its usage flow, a
1919

2020
## Introduction
2121

22-
Azure Face Liveness quick link is an optional integration path for [Face liveness detection](concept-face-liveness-detection.md). It exchanges a liveness session’s session-authorization-token for a single-use URL that hosts the face capture experience on an Azure-operated page. The service returns to a developer-supplied callback endpoint after finishing the operation.
22+
Azure Face Liveness quick link is an optional integration path for [Face liveness detection](../concept-face-liveness-detection.md). It exchanges a liveness session’s session-authorization-token for a single-use URL that hosts the face capture experience on an Azure-operated page. The service returns to a developer-supplied callback endpoint after finishing the operation.
2323

2424
Azure Liveness quick link provides multiple benefits to customers:
2525
- You don't need to embed the liveness client SDK. That allows for easier integration on the application side.
@@ -29,7 +29,7 @@ Azure Liveness quick link provides multiple benefits to customers:
2929

3030
You can use the liveness quick link website, `liveness.face.azure.com`, to turn a liveness session into a shareable, single use link:
3131

32-
:::image type="content" source="media/liveness/liveness-quick-link-diagram.png" alt-text="A diagram illustrates liveness quick link work flow.":::
32+
:::image type="content" source="../media/liveness/liveness-quick-link-diagram.png" alt-text="Diagram of the liveness quick link workflow.":::
3333

3434
1. Start a session with your server-side code. Your application backend requests a new liveness session from the Face API and receives a short-lived authorization token that represents that session.
3535
2. Swap the session token for a link. Your application backend sends the token to the quick link service, which creates a one-time URL connected to the session. Here are examples of the post request:
@@ -114,7 +114,6 @@ You can use the liveness quick link website, `liveness.face.azure.com`, to turn
114114
--header 'authorization: Bearer <session-authorization-token>'
115115
```
116116

117-
118117
The following is an example response:
119118

120119
```json
@@ -123,9 +122,7 @@ The following is an example response:
123122
}
124123
```
125124

126-
Use that value to construct the liveness quick link web page: `https://liveness.face.azure.com/?s=60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f`
127-
128-
3. Send the link to the user. You can redirect the browser, show a button, or display a QR code—anything that lets the user open the link on a camera-enabled device.
125+
3. Compose the link and send it to the user. Use the URL response value to construct the liveness quick link web page. Optionally, you can also add a callback URL: `https://liveness.face.azure.com/?s=60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f&callbackUrl=<encoded url>`. You can redirect the browser or show a button; anything that lets the user open the link on a device works.
129126
4. Azure hosts the capture experience. When the link opens, the Azure-operated page guides the user through the liveness check sequence using the latest Liveness web client.
130127
5. Get the outcome callback. As soon as the check finishes—or if the user abandons or times out—the quick link service notifies your callback endpoint so your application can decide what happens next.
131128
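Steps 3 through 5 above hinge on composing the link correctly; in particular, the callback URL must be percent-encoded. A minimal sketch in Python: the `s` and `callbackUrl` parameter names come from the example URL on this page, and `https://example.com/liveness-done` is a placeholder for your own endpoint.

```python
from urllib.parse import urlencode

QUICK_LINK_BASE = "https://liveness.face.azure.com/"

def build_quick_link(session_id: str, callback_url: str = "") -> str:
    """Compose the single-use quick link from the id returned by the
    quick-link service, percent-encoding the optional callback URL."""
    params = {"s": session_id}
    if callback_url:
        params["callbackUrl"] = callback_url  # urlencode percent-encodes it
    return QUICK_LINK_BASE + "?" + urlencode(params)

link = build_quick_link(
    "60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f",
    callback_url="https://example.com/liveness-done",
)
```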
