
Commit 7e1794d

Author: Jinyu Li
Commit message: refine the document structure
1 parent 882ecff commit 7e1794d

File tree

2 files changed (+4 -6 lines)


articles/ai-services/computer-vision/concept-face-liveness-detection.md

Lines changed: 3 additions & 3 deletions
@@ -34,9 +34,9 @@ The liveness solution integration involves two distinct components: a frontend m
:::image type="content" source="./media/liveness/liveness-diagram.jpg" alt-text="Diagram of the liveness workflow in Azure AI Face." lightbox="./media/liveness/liveness-diagram.jpg":::

-- **Frontend application**: The frontend application receives authorization from the app server to initiate liveness detection. Its primary objective is to activate the camera and guide end-users accurately through the liveness detection process.
-- **App server**: The app server serves as a backend server to create liveness detection sessions and obtain an authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, to grant authorization for frontend application, and to view the results of the liveness detection process.
-
+- **Orchestrate the Azure AI service in the app server**: The app server acts as a backend service that creates liveness detection sessions and obtains a short-lived authorization token from the Face service for a particular session. This token authorizes the frontend application to perform liveness detection. The app server's objectives are to manage the sessions, to grant authorization for the frontend application, and to view the results of the liveness detection process.
+- **Integrate the Azure AI Vision SDK into the frontend application**: The frontend application should embed the Azure AI Vision Face SDK (iOS, Android, or JavaScript). The SDK opens the camera, guides the user through the passive or passive-active flow, encrypts video frames, and streams them, together with the short-lived liveness-session token received from your app server, directly to the Azure AI Face endpoint.
+- **Optional quick link path**: To avoid embedding the client SDK, the app server can swap the same session token for a one-time Liveness Quick Link (https://liveness.face.azure.com/?s=…). Redirect the user to that URL; Azure hosts the entire capture experience in the browser and then notifies your service of completion through an optional callback. This option lowers integration cost and automatically keeps you on Azure's always-up-to-date experience.

## Liveness detection modes

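The first added bullet above boils down to one server-side call: the app server creates a liveness session with its Face resource key and hands only the short-lived token to the frontend. The following is a minimal sketch, assuming a Node.js 18+ app server written in TypeScript; the route, API version, and request payload shown are illustrative assumptions rather than the documented reference contract.

```typescript
// Minimal app-server sketch (Node.js 18+, TypeScript, built-in fetch).
// The route, API version, and payload shape below are assumptions for
// illustration; consult the Face liveness session REST reference for
// the exact contract.
const FACE_ENDPOINT = process.env.FACE_ENDPOINT!; // e.g. https://<resource>.cognitiveservices.azure.com
const FACE_KEY = process.env.FACE_APIKEY!;        // never shipped to the frontend

async function createLivenessSession(): Promise<{ sessionId: string; authToken: string }> {
  const res = await fetch(`${FACE_ENDPOINT}/face/v1.2/detectLiveness-sessions`, { // assumed route
    method: "POST",
    headers: {
      "Ocp-Apim-Subscription-Key": FACE_KEY,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ livenessOperationMode: "PassiveActive" }), // assumed payload
  });
  if (!res.ok) throw new Error(`Liveness session creation failed: ${res.status}`);
  const session = await res.json();
  // Only the session id and the short-lived token go back to the frontend,
  // which uses the token with the Azure AI Vision Face SDK to run the check.
  return { sessionId: session.sessionId, authToken: session.authToken };
}
```

The design point the bullets make is that the Face API key stays on the app server; the frontend only ever handles the per-session token.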
articles/ai-services/computer-vision/concept-face-liveness-quick-link.md renamed to articles/ai-services/computer-vision/tutorials/liveness-quick-link.md

Lines changed: 1 addition & 3 deletions
@@ -123,9 +123,7 @@ The following is an example response:
}
```

-Use that value to construct the liveness quick link web page: `https://liveness.face.azure.com/?s=60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f`
-
-3. Send the link to the user. You can redirect the browser, show a button, or display a QR code—anything that lets the user open the link on a camera-enabled device.
+3. Compose the link and send it to the user. Use the url response value to construct the liveness quick link web page; optionally, you can also append a callback URL: `https://liveness.face.azure.com/?s=60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f&callbackUrl=<encoded url>`. You can redirect the browser or show a button, anything that lets the user open the link on a device.

4. Azure hosts the capture experience. When the link opens, the Azure-operated page guides the user through the liveness check sequence using the latest Liveness web client.

5. Get the outcome callback. As soon as the check finishes—or if the user abandons or times out—the quick link service notifies your callback endpoint so your application can decide what happens next.

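Step 3 above is plain URL composition: take the value returned when the session was created, pass it as the `s` query parameter, and optionally append a URL-encoded `callbackUrl`. A minimal TypeScript sketch follows; the function name and the callback address are hypothetical placeholders, while the base address and the `callbackUrl` parameter come from the step above.

```typescript
// Minimal sketch: composing the Liveness Quick Link from the value returned
// when the session was created. `buildQuickLink` and the callback address are
// illustrative placeholders, not part of the documented API surface.
function buildQuickLink(sessionValue: string, callbackUrl?: string): string {
  let link = `https://liveness.face.azure.com/?s=${sessionValue}`;
  if (callbackUrl) {
    // The callback URL must be URL-encoded before it is appended.
    link += `&callbackUrl=${encodeURIComponent(callbackUrl)}`;
  }
  return link;
}

// Example: redirect the browser to this link, render it as a button, or encode it as a QR code.
const quickLink = buildQuickLink(
  "60c3980c-d9f6-4b16-a7f5-f1f4ad2b506f",
  "https://contoso.example/liveness/outcome"
);
```

Since the callback is optional, the app server can also retrieve the session result itself, as described in the orchestration bullet in the concept article above.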