Commit 91e975a

Remove the language table for SDK

1 parent 4fd21fe

File tree

1 file changed: +4 −14 lines

  • articles/ai-services/computer-vision/Tutorials/liveness.md

articles/ai-services/computer-vision/Tutorials/liveness.md

Lines changed: 4 additions & 14 deletions
````diff
@@ -53,19 +53,9 @@ This tutorial demonstrates how to operate a frontend application and an app serv
 
 ### Setup frontend applications and app servers to perform liveness detection
 
-We provide SDKs in different languages for frontend applications and app servers. They're available in these languages:
+We provide SDKs in different languages for frontend applications and app servers. See the following instructions to setup your frontend applications and app servers.
 
-| Language | App server | Frontend application |
-| -------- | -- | -- |
-| C# |✔|-|
-| Java |✔|-|
-| Python |✔|-|
-| JavaScript |✔|✔ (Not support providing a reference image.)|
-| Restful API |✔|-|
-| Kotlin |-|✔|
-| Swift |-|✔|
-
-#### Integrate liveness into mobile application
+#### Integrate liveness into frontend application
 
 Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk](https://github.com/Azure-Samples/azure-ai-vision-sdk) GitHub repository to integrate the UI and the code into your native mobile application. The liveness SDK supports Java/Kotlin for Android mobile applications, Swift for iOS mobile applications and JavaScript for web applications:
 - For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
@@ -241,7 +231,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
 
 1. The SDK then starts the camera, guides the user to position correctly and then prepares the payload to call the liveness detection service endpoint.
 
-1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the mobile application that the liveness check has been completed.
+1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the frontend application that the liveness check has been completed.
 
 1. The frontend application relays the liveness check completion to the app server.
 
@@ -605,7 +595,7 @@ The high-level steps involved in liveness with verification orchestration are il
 }
 ```
 
-- The mobile application provides the reference image when initializing the SDK. This scenario is not supported in the web solution.
+- The frontend application provides the reference image when initializing the SDK. This scenario is not supported in the web solution.
 
 #### [Android](#tab/mobile-kotlin)
 ```kotlin
````
