Commit 1a58b93

fix code snippets and links
1 parent 211d345 commit 1a58b93

File tree

6 files changed: +15 additions, −26 deletions


articles/ai-services/computer-vision/concept-face-recognition-data-structures.md

Lines changed: 0 additions & 4 deletions

@@ -19,10 +19,6 @@ ms.author: pafarley
 
 This article explains the data structures used in the Face service for face recognition operations. These data structures hold data on faces and persons.
 
-You can try out the capabilities of face recognition quickly and easily using Vision Studio.
-> [!div class="nextstepaction"]
-> [Try Vision Studio](https://portal.vision.cognitive.azure.com/)
-
 [!INCLUDE [Gate notice](./includes/identity-gate-notice.md)]
 
 ## Data structures used with Identify

articles/ai-services/computer-vision/concept-face-recognition.md

Lines changed: 0 additions & 4 deletions

@@ -19,10 +19,6 @@ ms.author: pafarley
 
 This article explains the concept of Face recognition, its related operations, and the underlying data structures. Broadly, face recognition is the process of verifying or identifying individuals by their faces. Face recognition is important in implementing the identification scenario, which enterprises and apps can use to verify that a (remote) user is who they claim to be.
 
-You can try out the capabilities of face recognition quickly and easily using Vision Studio.
-> [!div class="nextstepaction"]
-> [Try Vision Studio](https://portal.vision.cognitive.azure.com/)
-
 
 ## Face recognition operations

articles/ai-services/computer-vision/how-to/identity-access-token.md

Lines changed: 3 additions & 3 deletions

@@ -33,9 +33,9 @@ If the ISV learns that a client is using the LimitedAccessToken for non-approved
 
 ## Prerequisites
 
-* [cURL](https://curl.haxx.se/) installed (or another tool that can make HTTP requests).
-* The ISV needs to have either an [Azure AI Face](https://ms.portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/Face) resource or an [Azure AI services multi-service](https://ms.portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/AllInOne) resource.
-* The client needs to have an [Azure AI Face](https://ms.portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/Face) resource.
+* [cURL](https://curl.se/) installed (or another tool that can make HTTP requests).
+* The ISV needs to have either an [Azure AI Face](https://portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/Face) resource or an [Azure AI services multi-service](https://portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/AllInOne) resource.
+* The client needs to have an [Azure AI Face](https://portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/Face) resource.
 
 ## Step 1: ISV obtains client's Face resource ID
articles/ai-services/computer-vision/how-to/specify-detection-model.md

Lines changed: 3 additions & 3 deletions

@@ -64,7 +64,7 @@ A request URL for the [Detect] REST API will look like this:
 If you are using the client library, you can assign the value for `detectionModel` by passing in an appropriate string. If you leave it unassigned, the API will use the default model version (`detection_01`). See the following code example for the .NET client library.
 
 ```csharp
-string imageUrl = "https://news.microsoft.com/ceo/assets/photos/06_web.jpg";
+string imageUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-sample-data-files/master/Face/images/detection1.jpg";
 var faces = await faceClient.Face.DetectWithUrlAsync(url: imageUrl, returnFaceId: false, returnFaceLandmarks: false, recognitionModel: "recognition_04", detectionModel: "detection_03");
 ```

@@ -81,7 +81,7 @@ await faceClient.PersonGroup.CreateAsync(personGroupId, "My Person Group Name",
 
 string personId = (await faceClient.PersonGroupPerson.CreateAsync(personGroupId, "My Person Name")).PersonId;
 
-string imageUrl = "https://news.microsoft.com/ceo/assets/photos/06_web.jpg";
+string imageUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-sample-data-files/master/Face/images/detection1.jpg";
 await client.PersonGroupPerson.AddFaceFromUrlAsync(personGroupId, personId, imageUrl, detectionModel: "detection_03");
 ```

@@ -97,7 +97,7 @@ You can also specify a detection model when you add a face to an existing **Face
 ```csharp
 await faceClient.FaceList.CreateAsync(faceListId, "My face collection", recognitionModel: "recognition_04");
 
-string imageUrl = "https://news.microsoft.com/ceo/assets/photos/06_web.jpg";
+string imageUrl = "https://raw.githubusercontent.com/Azure-Samples/cognitive-services-sample-data-files/master/Face/images/detection1.jpg";
 await client.FaceList.AddFaceFromUrlAsync(faceListId, imageUrl, detectionModel: "detection_03");
 ```
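The snippets above pass `detectionModel` and `recognitionModel` through the .NET client library; on the wire these become query parameters on the Face Detect REST call. A minimal Python sketch of composing that URL (the endpoint host is a placeholder, and `build_detect_url` is a hypothetical helper, not part of any SDK):

```python
# Sketch: composing the Face Detect request URL with explicit model
# versions. The endpoint value below is a placeholder, not a real resource.
from urllib.parse import urlencode

def build_detect_url(endpoint: str,
                     detection_model: str = "detection_03",
                     recognition_model: str = "recognition_04",
                     return_face_id: bool = False) -> str:
    """Build the Detect URL with detectionModel/recognitionModel set."""
    query = urlencode({
        "detectionModel": detection_model,
        "recognitionModel": recognition_model,
        "returnFaceId": str(return_face_id).lower(),  # REST expects lowercase
    })
    return f"{endpoint}/face/v1.0/detect?{query}"

url = build_detect_url("https://example.cognitiveservices.azure.com")
print(url)
```

Leaving `detection_model` at its default here mirrors the library behavior described above: an unassigned value falls back to `detection_01` on the service side.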

articles/ai-services/computer-vision/how-to/use-large-scale.md

Lines changed: 7 additions & 7 deletions

@@ -92,13 +92,13 @@ private static async Task TrainLargeFaceList(
     int timeIntervalInMilliseconds = 1000)
 {
     // Trigger a train call.
-    await FaceClient.LargeTrainLargeFaceListAsync(largeFaceListId);
+    await faceClient.LargeFaceList.TrainAsync(largeFaceListId);
 
     // Wait for training finish.
     while (true)
     {
-        Task.Delay(timeIntervalInMilliseconds).Wait();
-        var status = await faceClient.LargeFaceList.TrainAsync(largeFaceListId);
+        await Task.Delay(timeIntervalInMilliseconds);
+        var status = await faceClient.LargeFaceList.GetTrainingStatusAsync(largeFaceListId);
 
         if (status.Status == Status.Running)
         {

@@ -123,7 +123,7 @@ Previously, a typical use of **FaceList** with added faces and **FindSimilar** l
 const string FaceListId = "myfacelistid_001";
 const string FaceListName = "MyFaceListDisplayName";
 const string ImageDir = @"/path/to/FaceList/images";
-faceClient.FaceList.CreateAsync(FaceListId, FaceListName).Wait();
+await faceClient.FaceList.CreateAsync(FaceListId, FaceListName);
 
 // Add Faces to the FaceList.
 Parallel.ForEach(

@@ -141,7 +141,7 @@ const string QueryImagePath = @"/path/to/query/image";
 var results = new List<SimilarPersistedFace[]>();
 using (Stream stream = File.OpenRead(QueryImagePath))
 {
-    var faces = faceClient.Face.DetectWithStreamAsync(stream).Result;
+    var faces = await faceClient.Face.DetectWithStreamAsync(stream);
     foreach (var face in faces)
     {
         results.Add(await faceClient.Face.FindSimilarAsync(face.FaceId, FaceListId, 20));

@@ -156,7 +156,7 @@ When migrating it to **LargeFaceList**, it becomes the following:
 const string LargeFaceListId = "mylargefacelistid_001";
 const string LargeFaceListName = "MyLargeFaceListDisplayName";
 const string ImageDir = @"/path/to/FaceList/images";
-faceClient.LargeFaceList.CreateAsync(LargeFaceListId, LargeFaceListName).Wait();
+await faceClient.LargeFaceList.CreateAsync(LargeFaceListId, LargeFaceListName);
 
 // Add Faces to the LargeFaceList.
 Parallel.ForEach(

@@ -178,7 +178,7 @@ const string QueryImagePath = @"/path/to/query/image";
 var results = new List<SimilarPersistedFace[]>();
 using (Stream stream = File.OpenRead(QueryImagePath))
 {
-    var faces = faceClient.Face.DetectWithStreamAsync(stream).Result;
+    var faces = await faceClient.Face.DetectWithStreamAsync(stream);
     foreach (var face in faces)
     {
         results.Add(await faceClient.Face.FindSimilarAsync(face.FaceId, largeFaceListId: LargeFaceListId));
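The first hunk above fixes the training loop twice: the delay is awaited rather than blocked on, and each iteration polls a status call instead of re-issuing the train request. That control flow can be sketched independently of the SDK; here `train_large_face_list` and `FakeClient` are illustrative names invented for this sketch, and the fake client stands in for the Face service so the flow runs offline:

```python
# Sketch of the corrected train-and-poll pattern: one train call, then
# repeated status polls with a wait between them.
import time

def train_large_face_list(client, large_face_list_id,
                          interval_s=1.0, max_polls=100):
    """Trigger training once, then poll status until it settles."""
    client.train(large_face_list_id)      # single train call, not one per poll
    for _ in range(max_polls):
        time.sleep(interval_s)            # wait *between* polls
        status = client.get_training_status(large_face_list_id)
        if status != "running":           # "succeeded" or "failed"
            return status
    raise TimeoutError("training did not finish in time")

class FakeClient:
    """Stand-in client: reports 'running' twice, then 'succeeded'."""
    def __init__(self):
        self.polls = 0
    def train(self, _list_id):
        pass
    def get_training_status(self, _list_id):
        self.polls += 1
        return "running" if self.polls < 3 else "succeeded"

print(train_large_face_list(FakeClient(), "mylargefacelistid_001",
                            interval_s=0.0))
# prints: succeeded
```

Polling the status endpoint (rather than calling train again, as the removed line did) matters because repeated train requests would restart or queue work on the service instead of observing the existing operation.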

articles/ai-services/computer-vision/how-to/use-persondirectory.md

Lines changed: 2 additions & 5 deletions

@@ -56,7 +56,7 @@ var client = new HttpClient();
 // Request headers
 client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "{subscription key}");
 
-var addPersonUri = "https:// {endpoint}/face/v1.0-preview/persons";
+var addPersonUri = "https://{endpoint}/face/v1.0-preview/persons";
 
 HttpResponseMessage response;
 

@@ -113,10 +113,7 @@ Stopwatch s = Stopwatch.StartNew();
 string status = "notstarted";
 do
 {
-    if (status == "succeeded")
-    {
-        await Task.Delay(500);
-    }
+    await Task.Delay(500);
 
     var operationResponseMessage = await client.GetAsync(operationLocation);
 
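The first hunk above removes a single stray space from the URL template. A small Python sketch of why that space breaks the request (the host value is a placeholder invented for this sketch):

```python
# Sketch: a space inside a URL template survives substitution and yields
# an invalid request URL. The host below is a placeholder.
endpoint = "example.cognitiveservices.azure.com"

broken = "https:// {endpoint}/face/v1.0-preview/persons".format(endpoint=endpoint)
fixed = "https://{endpoint}/face/v1.0-preview/persons".format(endpoint=endpoint)

assert " " in broken       # the space is still there after substitution
assert " " not in fixed    # the corrected template produces a valid URL
print(fixed)
```

HTTP client libraries typically reject or mis-parse a URL whose authority component starts with a space, so the request fails before ever reaching the service.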
