articles/ai-services/computer-vision/Tutorials/liveness.md (7 additions & 7 deletions)
@@ -16,7 +16,7 @@ Face Liveness detection can be used to determine if a face in an input video str
The goal of liveness detection is to ensure that the system is interacting with a physically present live person at the time of authentication. Such systems have become increasingly important with the rise of digital finance, remote access control, and online identity verification processes.
- The liveness detection solution successfully defends against a variety of spoof types ranging from paper printouts, 2d/3d masks, and spoof presentations on phones and laptops. Liveness detection is an active area of research, with continuous improvements being made to counteract increasingly sophisticated spoofing attacks over time. Continuous improvements will be rolled out to the client and the service components over time as the overall solution gets more robust to new types of attacks.
+ The liveness detection solution successfully defends against various spoof types ranging from paper printouts, 2d/3d masks, and spoof presentations on phones and laptops. Liveness detection is an active area of research, with continuous improvements being made to counteract increasingly sophisticated spoofing attacks over time. Continuous improvements will be rolled out to the client and the service components over time as the overall solution gets more robust to new types of attacks.
@@ -40,7 +40,7 @@ Once you have access to the SDK, follow instruction in the [azure-ai-vision-sdk]
- For Swift iOS, follow the instructions in the [iOS sample](https://aka.ms/azure-ai-vision-face-liveness-client-sdk-ios-readme)
- For Kotlin/Java Android, follow the instructions in the [Android sample](https://aka.ms/liveness-sample-java)
- Once you've added the code into your application, the SDK will handle starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
+ Once you've added the code into your application, the SDK handles starting the camera, guiding the end-user to adjust their position, composing the liveness payload, and calling the Azure AI Face cloud service to process the liveness payload.
### Orchestrate the liveness solution
@@ -86,7 +86,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
1. The SDK then starts the camera, guides the user to position correctly and then prepares the payload to call the liveness detection service endpoint.
- 1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK will notify the mobile application that the liveness check has been completed.
+ 1. The SDK calls the Azure AI Vision Face service to perform the liveness detection. Once the service responds, the SDK notifies the mobile application that the liveness check has been completed.
1. The mobile application relays the liveness check completion to the app server.
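The orchestration steps above begin with the app server creating a liveness session and handing the resulting session-authorization token to the mobile application. The following Python sketch assembles (but does not send) that first request; the endpoint path, preview API version, and body field names are assumptions to verify against the Face API reference for your SDK version.

```python
import json

# Placeholder values -- substitute your own Face resource endpoint and key.
ENDPOINT = "https://<your-face-resource>.cognitiveservices.azure.com"
API_VERSION = "v1.1-preview.1"  # assumed preview API version


def build_liveness_session_request(endpoint: str, api_key: str,
                                   device_correlation_id: str) -> dict:
    """Return the pieces of the session-creation HTTP request (not sent here)."""
    return {
        "method": "POST",
        # Path is an assumption based on the liveness preview API.
        "url": f"{endpoint}/face/{API_VERSION}/detectLiveness/singleModal/sessions",
        "headers": {
            "Ocp-Apim-Subscription-Key": api_key,
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "livenessOperationMode": "Passive",           # assumed mode name
            "deviceCorrelationId": device_correlation_id,  # ties session to a device
        }),
    }


req = build_liveness_session_request(ENDPOINT, "<api-key>", "device-123")
```

The service response to this call would contain the session-authorization token that the app server relays to the mobile application in step 3 of the orchestration.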
@@ -110,7 +110,7 @@ The high-level steps involved in liveness orchestration are illustrated below:
@@ -162,12 +162,12 @@ Use the following tips to ensure that your input images give the most accurate r
#### Composition requirements:
- Photo is clear and sharp, not blurry, pixelated, distorted, or damaged.
- Photo is not altered to remove face blemishes or face appearance.
- - Photo must be in an RGB color supported format (JPEG, PNG, WEBP, BMP). Recommended Face size is 200 pixels x 200 pixels. Face sizes larger than 200 pixels x 200 pixels will not result in better AI quality, and no larger than 6MB in size.
+ - Photo must be in an RGB color supported format (JPEG, PNG, WEBP, BMP). Recommended Face size is 200 pixels x 200 pixels. Face sizes larger than 200 pixels x 200 pixels will not result in better AI quality, and no larger than 6 MB in size.
- User is not wearing glasses, masks, hats, headphones, head coverings, or face coverings. Face should be free of any obstructions.
- Facial jewelry is allowed provided it doesn't hide your face.
- Only one face should be visible in the photo.
- Face should be in neutral front-facing pose with both eyes open, mouth closed, with no extreme facial expressions or head tilt.
- - Face should be free of any shadows or red eyes. Please retake photo if either of these occur.
+ - Face should be free of any shadows or red eyes. Retake photo if either of these occur.
- Background should be uniform and plain, free of any shadows.
- Face should be centered within the image and fill at least 50% of the image.
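Two of the composition requirements above — supported format and the 6 MB size cap — can be pre-checked on the client before upload. A minimal standard-library Python sketch (face-size and single-face checks would need a face detector and are omitted):

```python
# Pre-submission checks for two composition requirements:
# supported RGB format (JPEG, PNG, WEBP, BMP) and the 6 MB size cap.
MAX_BYTES = 6 * 1024 * 1024  # 6 MB cap from the requirements above

# Magic-number prefixes for the supported formats.
SIGNATURES = {
    "JPEG": b"\xff\xd8\xff",
    "PNG": b"\x89PNG\r\n\x1a\n",
    "BMP": b"BM",
    "WEBP": b"RIFF",  # RIFF container; bytes 8-11 must additionally be b"WEBP"
}


def check_photo(data: bytes) -> tuple:
    """Return (ok, format_or_reason) for the raw photo bytes."""
    if len(data) > MAX_BYTES:
        return False, "file exceeds 6 MB"
    for name, sig in SIGNATURES.items():
        if data.startswith(sig):
            if name == "WEBP" and data[8:12] != b"WEBP":
                continue  # a RIFF file that isn't WEBP (e.g. WAV)
            return True, name
    return False, "unsupported format (use JPEG, PNG, WEBP, or BMP)"
```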
@@ -243,7 +243,7 @@ The high-level steps involved in liveness with verification orchestration are il
articles/ai-services/computer-vision/how-to/shelf-analyze.md (1 addition & 1 deletion)
@@ -25,7 +25,7 @@ The fastest way to start using Product Recognition is to use the built-in pretra
* Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="create a Vision resource" target="_blank">create a Vision resource</a> in the Azure portal. It must be deployed in the **East US** or **West US 2** region. After it deploys, select **Go to resource**.
* You'll need the key and endpoint from the resource you create to connect your application to the Azure AI Vision service. You'll paste your key and endpoint into the code below later in the guide.
* An Azure Storage resource with a blob storage container. [Create one](/azure/storage/common/storage-account-create?tabs=azure-portal)
- * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Postman, Swagger, or the REST Client extension for VS Code.
+ * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Swagger or the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) extension for VS Code.
* A shelf image. You can download our [sample image](https://github.com/Azure-Samples/cognitive-services-sample-data-files/blob/master/ComputerVision/shelf-analysis/shelf.png) or bring your own images. The maximum file size per image is 20 MB.
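As a sketch of the request the cURL step in this guide issues, the following Python builds (but does not send) a Product Recognition run request against the built-in pretrained model. The `ms-pretrained-product-detection` path segment and the `api-version` value are assumptions; substitute the exact values shown in the guide's own examples.

```python
from urllib.parse import urlencode


def build_analyze_request(endpoint: str, key: str, run_id: str,
                          image_url: str) -> dict:
    """Assemble a Product Recognition run request (constructed, not sent)."""
    query = urlencode({"api-version": "2023-04-01-preview"})  # assumed version
    return {
        "method": "PUT",
        # Path segment for the pretrained model is an assumption.
        "url": (f"{endpoint}/computervision/productrecognition/"
                f"ms-pretrained-product-detection/runs/{run_id}?{query}"),
        "headers": {
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
        # The image URL is typically a SAS URL to the shelf image in blob storage.
        "body": {"url": image_url},
    }
```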
articles/ai-services/computer-vision/how-to/shelf-modify-images.md (1 addition & 1 deletion)
@@ -24,7 +24,7 @@ This guide also shows you how to use the **Rectification API** to correct for pe
* Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision" title="create a Vision resource" target="_blank">create a Vision resource</a> in the Azure portal. It must be deployed in the **East US** or **West US 2** region. After it deploys, select **Go to resource**.
* You'll need the key and endpoint from the resource you create to connect your application to the Azure AI Vision service. You'll paste your key and endpoint into the code below later in the quickstart.
* An Azure Storage resource with a blob storage container. [Create one](/azure/storage/common/storage-account-create?tabs=azure-portal)
- * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Postman, Swagger, or the REST Client extension for VS Code.
+ * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Swagger or the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) extension for VS Code.
* A set of photos that show adjacent parts of the same shelf. A 50% overlap between images is recommended. You can download and use the sample "unstitched" images from [GitHub](https://github.com/Azure-Samples/cognitive-services-sample-data-files/tree/master/ComputerVision/shelf-analysis).
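For the rectification step mentioned above, the request body pairs the image URL with four corner control points in normalized 0–1 coordinates. A hedged Python sketch that assembles and sanity-checks such a body (the `controlPoints` field name and corner names are assumptions to confirm against the Rectification API reference):

```python
def build_rectify_body(image_url: str, corners: dict) -> dict:
    """Assemble a rectification request body from four corner control points.

    Each corner is a {"x": float, "y": float} point with coordinates
    normalized to the 0-1 range relative to the image dimensions.
    """
    required = {"topLeft", "topRight", "bottomLeft", "bottomRight"}
    missing = required - corners.keys()
    if missing:
        raise ValueError(f"missing corners: {sorted(missing)}")
    for name, pt in corners.items():
        if not (0 <= pt["x"] <= 1 and 0 <= pt["y"] <= 1):
            raise ValueError(f"{name} must use normalized 0-1 coordinates")
    # Field names here are assumed, not confirmed against the API contract.
    return {"url": image_url, "controlPoints": corners}
```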
articles/ai-services/computer-vision/how-to/shelf-planogram.md (1 addition & 1 deletion)
@@ -22,7 +22,7 @@ A planogram is a diagram that indicates the correct placement of retail products
## Prerequisites
* You must have already set up and run basic [Product Understanding analysis](./shelf-analyze.md) with the Product Understanding API.
- * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Postman, Swagger, or the REST Client extension for VS Code.
+ * [cURL](https://curl.haxx.se/) installed. Or, you can use a different REST platform, like Swagger or the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) extension for VS Code.
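Beyond the Product Understanding results required above, planogram matching also needs a planogram document describing products, fixtures, and positions. The Python sketch below shows one minimal illustrative document plus a consistency check; every field name and value here is a hypothetical placeholder, not the API contract — confirm the schema against the Planogram Matching reference.

```python
# Hypothetical minimal planogram document (all field names are placeholders).
planogram = {
    "width": 100.0,   # planogram units, not pixels
    "height": 50.0,
    "products": [
        {"id": "sku-1", "name": "Cereal 500g"},  # hypothetical product
    ],
    "fixtures": [
        {"id": "shelf-1", "w": 100.0, "h": 10.0, "x": 0.0, "y": 40.0},
    ],
    "positions": [
        # Place sku-1 on shelf-1 at the left edge.
        {"id": "pos-1", "productId": "sku-1", "fixtureId": "shelf-1",
         "x": 0.0, "y": 40.0},
    ],
}


def validate_planogram(doc: dict) -> bool:
    """Every position must reference a known product and a known fixture."""
    product_ids = {p["id"] for p in doc["products"]}
    fixture_ids = {f["id"] for f in doc["fixtures"]}
    return all(p["productId"] in product_ids and p["fixtureId"] in fixture_ids
               for p in doc["positions"])
```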
articles/ai-services/custom-vision-service/copy-move-projects.md (1 addition & 1 deletion)
@@ -14,7 +14,7 @@ ms.author: pafarley
After you've created and trained a Custom Vision project, you may want to copy your project to another resource. If your app or business depends on a Custom Vision project, we recommend you copy your model to another Custom Vision account in another region. Then if a regional outage occurs, you can access your project in the region where it was copied.
- The **[ExportProject](https://westus2.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)** and **[ImportProject](https://westus2.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc7548b571998fddee3)** APIs enable this scenario by allowing you to copy projects from one Custom Vision account into others. This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service like Postman to issue the requests.
+ The **[ExportProject](https://westus2.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc6548b571998fddeb3)** and **[ImportProject](https://westus2.dev.cognitive.microsoft.com/docs/services/Custom_Vision_Training_3.3/operations/5eb0bcc7548b571998fddee3)** APIs enable this scenario by allowing you to copy projects from one Custom Vision account into others. This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service, like the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) for Visual Studio Code, to issue the requests.
> [!TIP]
> For an example of this scenario using the Python client library, see the [Move Custom Vision Project](https://github.com/Azure-Samples/custom-vision-move-project/tree/master/) repository on GitHub.
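As a sketch of the two calls, the following Python assembles (but does not send) the ExportProject and ImportProject requests. The v3.3 paths and the `Training-Key` header are assumptions based on the linked API reference; in practice the export response contains the token that the import request consumes.

```python
from urllib.parse import quote


def build_export_request(endpoint: str, training_key: str,
                         project_id: str) -> dict:
    """Request that asks the source account for an export token (not sent here)."""
    return {
        "method": "GET",
        "url": f"{endpoint}/customvision/v3.3/Training/projects/{project_id}/export",
        "headers": {"Training-Key": training_key},  # key of the SOURCE resource
    }


def build_import_request(endpoint: str, training_key: str, token: str) -> dict:
    """Request that imports the project into the target account using the token."""
    return {
        "method": "POST",
        # The token from the export response must be URL-encoded.
        "url": (f"{endpoint}/customvision/v3.3/Training/projects/import"
                f"?token={quote(token, safe='')}"),
        "headers": {"Training-Key": training_key},  # key of the TARGET resource
    }
```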
articles/ai-services/custom-vision-service/storage-integration.md (1 addition & 1 deletion)
@@ -17,7 +17,7 @@ You can integrate your Custom Vision project with an Azure blob storage queue to
You can also use Azure storage to store backup copies of your published models.
- This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service like Postman to make the requests.
+ This guide shows you how to use these REST APIs with cURL. You can also use an HTTP request service, like the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) for Visual Studio Code, to make the requests.
> [!NOTE]
> Push notifications depend on the optional _notificationQueueUri_ parameter in the **CreateProject** API, and model backups require that you also use the optional _exportModelContainerUri_ parameter. This guide will use both for the full set of features.
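As a sketch of how the two optional parameters described in the note attach to **CreateProject**, the following Python builds the request URL with both SAS URIs percent-encoded into the query string. The v3.3 path and the placement of these values as query parameters are assumptions to verify against the Training API reference.

```python
from urllib.parse import urlencode


def build_create_project_url(endpoint: str, name: str,
                             queue_uri: str, container_uri: str) -> str:
    """Build the CreateProject URL with both optional integration parameters."""
    query = urlencode({
        "name": name,
        # Queue that receives push notifications about training/export events.
        "notificationQueueUri": queue_uri,
        # Blob container that receives backup copies of exported models.
        "exportModelContainerUri": container_uri,
    })
    return f"{endpoint}/customvision/v3.3/Training/projects?{query}"
```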
articles/azure-government/documentation-government-overview-wwps.md (2 additions & 2 deletions)
@@ -158,7 +158,7 @@ Data encryption provides isolation assurances that are tied directly to encrypti
Proper protection and management of encryption keys is essential for data security. **[Azure Key Vault](../key-vault/index.yml) is a cloud service for securely storing and managing secrets.** The Key Vault service supports two resource types:
- - **[Vault](../key-vault/general/overview.md)** supports software-protected and hardware security module (HSM)-protected [secrets, keys, and certificates](../key-vault/general/about-keys-secrets-certificates.md). Vaults provide a multi-tenant, low-cost, easy to deploy, zone-resilient (where available), and highly available key management solution suitable for most common cloud application scenarios. The corresponding HSMs have [FIPS 140 Level 2](/azure/compliance/offerings/offering-fips-140-2) validation.
+ - **[Vault](../key-vault/general/overview.md)** supports software-protected and hardware security module (HSM)-protected [secrets, keys, and certificates](../key-vault/general/about-keys-secrets-certificates.md). Vaults provide a multi-tenant, low-cost, easy to deploy, zone-resilient (where available), and highly available key management solution suitable for most common cloud application scenarios. The corresponding HSMs have [FIPS 140 validation](/azure/key-vault/keys/about-keys#compliance).
- **[Managed HSM](../key-vault/managed-hsm/overview.md)** supports only HSM-protected cryptographic keys. It provides a single-tenant, fully managed, highly available, zone-resilient (where available) HSM as a service to store and manage your cryptographic keys. It's most suitable for applications and usage scenarios that handle high value keys. It also helps you meet the most stringent security, compliance, and regulatory requirements. Managed HSM uses [FIPS 140 Level 3](/azure/compliance/offerings/offering-fips-140-2) validated HSMs to protect your cryptographic keys.
Key Vault enables you to store your encryption keys in hardware security modules (HSMs) that are FIPS 140 validated. With Azure Key Vault, you can import or generate encryption keys in HSMs, ensuring that keys never leave the HSM protection boundary to support *bring your own key* (BYOK) scenarios. **Keys generated inside the Azure Key Vault HSMs aren't exportable – there can be no clear-text version of the key outside the HSMs.** This binding is enforced by the underlying HSM.
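To illustrate the non-exportable, HSM-backed key scenario described above, the following Python assembles (but does not send) a Key Vault *create key* REST request asking for an HSM-protected key. The `api-version` value and exact body shape are assumptions to check against the Key Vault REST reference, and authentication (a bearer token) is omitted.

```python
import json


def build_create_key_request(vault_url: str, key_name: str) -> dict:
    """Assemble a Key Vault create-key request for an HSM-protected key."""
    return {
        "method": "POST",
        "url": f"{vault_url}/keys/{key_name}/create?api-version=7.4",  # assumed version
        # In practice an Authorization: Bearer <token> header is also required.
        "headers": {"Content-Type": "application/json"},
        # "RSA-HSM" requests a key generated inside the HSM boundary,
        # so the private key material is never exportable.
        "body": json.dumps({"kty": "RSA-HSM", "key_size": 2048}),
    }
```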
@@ -385,7 +385,7 @@ Listed below are key enabling technologies and services that you may find helpfu
- All recommended technologies used for Unclassified data, especially services such as [Virtual Network](../virtual-network/virtual-networks-overview.md) (VNet), [Microsoft Defender for Cloud](../defender-for-cloud/index.yml), and [Azure Monitor](../azure-monitor/index.yml).
- Public IP addresses are disabled allowing only traffic through private connections, including [ExpressRoute](../expressroute/index.yml) and [Virtual Private Network](../vpn-gateway/index.yml) (VPN) gateway.
- - Data encryption is recommended with customer-managed keys (CMK) in [Azure Key Vault](../key-vault/index.yml) backed by multi-tenant hardware security modules (HSMs) that have FIPS 140 Level 2 validation.
+ - Data encryption is recommended with customer-managed keys (CMK) in [Azure Key Vault](../key-vault/index.yml) backed by multi-tenant hardware security modules (HSMs) that have [FIPS 140 validation](/azure/key-vault/keys/about-keys#compliance).
- Only services that support [VNet integration](../virtual-network/virtual-network-for-azure-services.md) options are enabled. Azure VNet enables you to place Azure resources in a non-internet routable network, which can then be connected to your on-premises network using VPN technologies. VNet integration gives web apps access to resources in the virtual network.
- You can use [Azure Private Link](../private-link/index.yml) to access Azure PaaS services over a private endpoint in your VNet, ensuring that traffic between your VNet and the service travels across the Microsoft global backbone network, which eliminates the need to expose the service to the public Internet.
- [Customer Lockbox](../security/fundamentals/customer-lockbox-overview.md) for Azure enables you to approve/deny elevated access requests for your data in support scenarios. It's an extension of the Just-in-Time (JIT) workflow that comes with full audit logging enabled.