
Commit 949ee2f

Merge branch 'MicrosoftDocs:main' into main
2 parents: 1217050 + cc6bb27


586 files changed: +4,730 / -4,391 lines


.openpublishing.redirection.ai-services-from-cog.json

Lines changed: 0 additions & 5 deletions
@@ -1670,11 +1670,6 @@
       "redirect_url": "/azure/ai-services/speech-service/how-to-configure-openssl-linux",
       "redirect_document_id": true
     },
-    {
-      "source_path_from_root": "/articles/cognitive-services/speech-service/how-to-configure-rhel-centos-7.md",
-      "redirect_url": "/azure/ai-services/speech-service/how-to-configure-rhel-centos-7",
-      "redirect_document_id": true
-    },
     {
       "source_path_from_root": "/articles/cognitive-services/speech-service/how-to-control-connections.md",
       "redirect_url": "/azure/ai-services/speech-service/how-to-control-connections",

.openpublishing.redirection.json

Lines changed: 36 additions & 1 deletion
@@ -1,5 +1,35 @@
 {
   "redirections": [
+    {
+      "source_path": "articles/storage/files/files-samples-dotnet-v11.md",
+      "redirect_url": "/previous-versions/azure/storage/files/files-samples-dotnet-v11",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/storage/files/files-samples-java-v8.md",
+      "redirect_url": "/previous-versions/azure/storage/files/files-samples-java-v8",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/storage/files/files-samples-python-v2.md",
+      "redirect_url": "/previous-versions/azure/storage/files/files-samples-python-v2",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/storage/files/storage-c-plus-plus-how-to-use-files.md",
+      "redirect_url": "/previous-versions/azure/storage/files/storage-c-plus-plus-how-to-use-files",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/storage/queues/queues-v11-samples-dotnet.md",
+      "redirect_url": "/previous-versions/azure/storage/queues/queues-v11-samples-dotnet",
+      "redirect_document_id": false
+    },
+    {
+      "source_path": "articles/storage/queues/queues-v2-samples-python.md",
+      "redirect_url": "/previous-versions/azure/storage/queues/queues-v2-samples-python",
+      "redirect_document_id": false
+    },
     {
       "source_path": "articles/storage/files/storage-files-migration-storsimple-1200.md",
       "redirect_url": "/previous-versions/azure/storage/files/storage-files-migration-storsimple-1200",
@@ -5049,6 +5079,11 @@
       "source_path_from_root": "/articles/xplat-cli-install.md",
       "redirect_url": "/cli/azure/install-azure-cli",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/virtual-network/template-samples.md",
+      "redirect_url": "/samples/browse/?expanded=azure&products=azure-resource-manager&terms=virtual%20network",
+      "redirect_document_id": false
     }
   ]
-}
+}

articles/ai-services/.openpublishing.redirection.ai-services.json

Lines changed: 10 additions & 0 deletions
@@ -384,6 +384,16 @@
       "source_path_from_root": "/articles/ai-services/speech-service/faq-voice-assistants.yml",
       "redirect_url": "/azure/ai-services/speech-service/custom-commands",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/cognitive-services/speech-service/how-to-configure-rhel-centos-7.md",
+      "redirect_url": "/azure/ai-services/speech-service/quickstarts/setup-platform",
+      "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/ai-services/speech-service/how-to-configure-rhel-centos-7.md",
+      "redirect_url": "/azure/ai-services/speech-service/quickstarts/setup-platform",
+      "redirect_document_id": false
     }
 
   ]

articles/ai-services/computer-vision/how-to/identity-access-token.md

Lines changed: 29 additions & 39 deletions
@@ -112,61 +112,51 @@ curl -X POST 'https://<client-endpoint>/face/v1.0/identify' \
 
 #### [C#](#tab/csharp)
 
-The following code snippets show you how to use an access token with the [Face SDK for C#](https://www.nuget.org/packages/Microsoft.Azure.CognitiveServices.Vision.Face).
+The following code snippets show you how to use an access token with the [Face SDK for C#](https://aka.ms/azsdk-csharp-face-pkg).
 
-The following class uses an access token to create a **ServiceClientCredentials** object that can be used to authenticate a Face API client object. It automatically adds the access token as a header in every request that the Face client will make.
+The following class uses an access token to create a **HttpPipelineSynchronousPolicy** object that can be used to authenticate a Face API client object. It automatically adds the access token as a header in every request that the Face client will make.
 
 ```csharp
-public class LimitedAccessTokenWithApiKeyClientCredential : ServiceClientCredentials
+public class LimitedAccessTokenPolicy : HttpPipelineSynchronousPolicy
 {
-    /// <summary>
-    /// Creates a new instance of the LimitedAccessTokenWithApiKeyClientCredential class
-    /// </summary>
-    /// <param name="apiKey">API Key for the Face API or CognitiveService endpoint</param>
-    /// <param name="limitedAccessToken">LimitedAccessToken to bypass the limited access program, requires ISV sponsership.</param>
-
-    public LimitedAccessTokenWithApiKeyClientCredential(string apiKey, string limitedAccessToken)
-    {
-        this.ApiKey = apiKey;
-        this.LimitedAccessToken = limitedAccessToken;
+    /// <summary>
+    /// Creates a new instance of the LimitedAccessTokenPolicy class
+    /// </summary>
+    /// <param name="limitedAccessToken">LimitedAccessToken to bypass the limited access program, requires ISV sponsership.</param>
+    public LimitedAccessTokenPolicy(string limitedAccessToken)
+    {
+        _limitedAccessToken = limitedAccessToken;
     }
 
-    private readonly string ApiKey;
-    private readonly string LimitedAccesToken;
-
-    /// <summary>
-    /// Add the Basic Authentication Header to each outgoing request
-    /// </summary>
-    /// <param name="request">The outgoing request</param>
-    /// <param name="cancellationToken">A token to cancel the operation</param>
-    public override Task ProcessHttpRequestAsync(HttpRequestMessage request, CancellationToken cancellationToken)
-    {
-        if (request == null)
-            throw new ArgumentNullException("request");
-        request.Headers.Add("Ocp-Apim-Subscription-Key", ApiKey);
-        request.Headers.Add("LimitedAccessToken", $"Bearer {LimitedAccesToken}");
-
-        return Task.FromResult<object>(null);
-    }
-}
+    private readonly string _limitedAccessToken;
+
+    /// <summary>
+    /// Add the authentication header to each outgoing request
+    /// </summary>
+    /// <param name="message">The outgoing message</param>
+    public override void OnSendingRequest(HttpMessage message)
+    {
+        message.Request.Headers.Add("LimitedAccessToken", $"Bearer {_limitedAccessToken}");
+    }
+}
 ```
 
 In the client-side application, the helper class can be used like in this example:
 
 ```csharp
-static void Main(string[] args)
-{
+static void Main(string[] args)
+{
     // create Face client object
-    var faceClient = new FaceClient(new LimitedAccessTokenWithApiKeyClientCredential(apiKey: "<client-face-key>", limitedAccessToken: "<token>"));
-
-    faceClient.Endpoint = "https://mytest-eastus2.cognitiveservices.azure.com";
+    var clientOptions = new AzureAIVisionFaceClientOptions();
+    clientOptions.AddPolicy(new LimitedAccessTokenPolicy("<token>"), HttpPipelinePosition.PerCall);
+    FaceClient faceClient = new FaceClient(new Uri("<client-endpoint>"), new AzureKeyCredential("<client-face-key>"), clientOptions);
 
     // use Face client in an API call
-    using (var stream = File.OpenRead("photo.jpg"))
+    using (var stream = File.OpenRead("photo.jpg"))
     {
-        var result = faceClient.Face.DetectWithStreamAsync(stream, detectionModel: "Detection_03", recognitionModel: "Recognition_04", returnFaceId: true).Result;
+        var response = faceClient.Detect(BinaryData.FromStream(stream), FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, returnFaceId: true);
 
-        Console.WriteLine(JsonConvert.SerializeObject(result));
+        Console.WriteLine(JsonConvert.SerializeObject(response.Value));
     }
 }
 ```

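Taken together, the added lines in the hunk above amount to roughly the following self-contained program. This is a sketch for orientation only: the `using` directives and the `Program` wrapper are assumptions (they are not part of the diff), while the policy class, client options, and `Detect` call mirror the added lines.

```csharp
// Sketch assembled from the added (+) lines above; the namespaces are assumptions
// based on the Azure.AI.Vision.Face, Azure.Core, and Newtonsoft.Json packages.
using System;
using System.IO;
using Azure;
using Azure.AI.Vision.Face;
using Azure.Core;
using Azure.Core.Pipeline;
using Newtonsoft.Json;

// Pipeline policy from the diff: stamps the LimitedAccessToken header onto every outgoing request.
public class LimitedAccessTokenPolicy : HttpPipelineSynchronousPolicy
{
    private readonly string _limitedAccessToken;

    public LimitedAccessTokenPolicy(string limitedAccessToken)
    {
        _limitedAccessToken = limitedAccessToken;
    }

    public override void OnSendingRequest(HttpMessage message)
    {
        message.Request.Headers.Add("LimitedAccessToken", $"Bearer {_limitedAccessToken}");
    }
}

public class Program
{
    public static void Main(string[] args)
    {
        // Register the policy per call and authenticate the client with the resource key.
        var clientOptions = new AzureAIVisionFaceClientOptions();
        clientOptions.AddPolicy(new LimitedAccessTokenPolicy("<token>"), HttpPipelinePosition.PerCall);
        var faceClient = new FaceClient(new Uri("<client-endpoint>"), new AzureKeyCredential("<client-face-key>"), clientOptions);

        // Detect faces in a local image, as in the updated article text.
        using (var stream = File.OpenRead("photo.jpg"))
        {
            var response = faceClient.Detect(BinaryData.FromStream(stream), FaceDetectionModel.Detection03, FaceRecognitionModel.Recognition04, returnFaceId: true);
            Console.WriteLine(JsonConvert.SerializeObject(response.Value));
        }
    }
}
```
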
articles/ai-services/computer-vision/how-to/specify-detection-model.md

Lines changed: 3 additions & 2 deletions
@@ -73,7 +73,7 @@ var faces = response.Value;
 
 The Face service can extract face data from an image and associate it with a **Person** object through the [Add Person Group Person Face] API. In this API call, you can specify the detection model in the same way as in [Detect].
 
-See the following code example for the .NET client library.
+See the following .NET code example.
 
 ```csharp
 // Create a PersonGroup and add a person with face detected by "detection_03" model
@@ -110,7 +110,7 @@ This code creates a **PersonGroup** with ID `mypersongroupid` and adds a **Perso
 
 ## Add face to FaceList with specified model
 
-You can also specify a detection model when you add a face to an existing **FaceList** object. See the following code example for the .NET client library.
+You can also specify a detection model when you add a face to an existing **FaceList** object. See the following .NET code example.
 
 ```csharp
 using (var content = new ByteArrayContent(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(new Dictionary<string, object> { ["name"] = "My face collection", ["recognitionModel"] = "recognition_04" }))))
@@ -139,6 +139,7 @@ In this article, you learned how to specify the detection model to use with diff
 
 * [Face .NET SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-csharp%253fpivots%253dprogramming-language-csharp)
 * [Face Python SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-python%253fpivots%253dprogramming-language-python)
+* [Face Java SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-java%253fpivots%253dprogramming-language-java)
 * [Face JavaScript SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-javascript%253fpivots%253dprogramming-language-javascript)
 
 [Detect]: /rest/api/face/face-detection-operations/detect

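The second hunk above shows only the opening line of the article's FaceList snippet. As a rough sketch of the pattern that section describes — passing `detectionModel` as a query parameter when adding a face to an existing **FaceList** — a call in the article's HttpClient/Newtonsoft.Json style might look like the following. The endpoint, key, face-list ID, and image URL are placeholders, and the exact request shape should be checked against the Face REST reference.

```csharp
// Sketch only: add a face to an existing FaceList while specifying the detection model.
// The endpoint, key, face-list ID, and image URL below are placeholders.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

class AddFaceToFaceListSample
{
    static async Task Main()
    {
        string endpoint = "<your-resource-endpoint>"; // for example, https://<name>.cognitiveservices.azure.com
        string key = "<your-resource-key>";
        string faceListId = "my-face-collection";

        using var httpClient = new HttpClient();
        httpClient.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);

        // detectionModel is passed as a query parameter, mirroring how the article specifies models.
        string requestUri = $"{endpoint}/face/v1.0/facelists/{faceListId}/persistedfaces?detectionModel=detection_03";

        // The request body points at the image to add.
        string body = JsonConvert.SerializeObject(new { url = "https://example.com/photo.jpg" });
        using var content = new StringContent(body, Encoding.UTF8, "application/json");

        HttpResponseMessage response = await httpClient.PostAsync(requestUri, content);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```
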
articles/ai-services/computer-vision/how-to/specify-recognition-model.md

Lines changed: 1 addition & 0 deletions
@@ -132,6 +132,7 @@ In this article, you learned how to specify the recognition model to use with di
 
 * [Face .NET SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-csharp%253fpivots%253dprogramming-language-csharp)
 * [Face Python SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-python%253fpivots%253dprogramming-language-python)
+* [Face Java SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-java%253fpivots%253dprogramming-language-java)
 * [Face JavaScript SDK](../quickstarts-sdk/identity-client-library.md?pivots=programming-language-javascript%253fpivots%253dprogramming-language-javascript)
 
 [Detect]: /rest/api/face/face-detection-operations/detect

articles/ai-services/document-intelligence/concept-accuracy-confidence.md

Lines changed: 2 additions & 1 deletion
@@ -16,9 +16,10 @@ ms.author: lajanuar
 
 A confidence score indicates probability by measuring the degree of statistical certainty that the extracted result is detected correctly. The estimated accuracy is calculated by running a few different combinations of the training data to predict the labeled values. In this article, learn to interpret accuracy and confidence scores and best practices for using those scores to improve accuracy and confidence results.
 
-
 ## Confidence scores
+
 > [!NOTE]
+>
 > * Field level confidence is getting update to take into account word confidence score starting with **2024-07-31-preview** API version for **custom models**.
 > * Confidence scores for tables, table rows and table cells are available starting with the **2024-07-31-preview** API version for **custom models**.