articles/active-directory/develop/publisher-verification-overview.md (+2 −2)
@@ -11,7 +11,7 @@ ms.workload: identity
 ms.date: 06/01/2021
 ms.author: ryanwi
 ms.custom: aaddev
-ms.reviewer: jesakowi
+ms.reviewer: ardhanap, jesakowi
 ---

 # Publisher verification
@@ -58,7 +58,7 @@ There are a few pre-requisites for publisher verification, some of which will ha
 - In Partner Center this user must have one of the following [roles](/partner-center/permissions-overview): MPN Admin, Accounts Admin, or a Global Admin (this is a shared role mastered in Azure AD).

-- The user performing verification must sign in using [multifactor authentication](../authentication/howto-mfa-getstarted.md).
+- The user performing verification must sign in using [multi-factor authentication](../authentication/howto-mfa-getstarted.md).

 - The publisher agrees to the [Microsoft identity platform for developers Terms of Use](/legal/microsoft-identity-platform/terms-of-use).
articles/aks/aks-resource-health.md (+2 −1)
@@ -44,6 +44,7 @@ Resource Health receives signals for your managed cluster to determine the clust
 - **Degraded**: When there is a health issue requiring your action, Resource Health reports your cluster as *Degraded*.

+Note that the Resource Health for an AKS cluster is different than the Resource Health of its individual resources (*Virtual Machines, ScaleSet Instances, Load Balancer, etc.*).
 For additional details on what each health status indicates, visit [Resource Health overview](../service-health/resource-health-overview.md#health-status).

 ### View historical data
@@ -52,4 +53,4 @@ You can also view the past 30 days of historical Resource Health information in
 ## Next steps

-Run checks on your cluster to further troubleshoot cluster issues by using [AKS Diagnostics](./concepts-diagnostics.md).
+Run checks on your cluster to further troubleshoot cluster issues by using [AKS Diagnostics](./concepts-diagnostics.md).
articles/applied-ai-services/form-recognizer/concept-custom.md (+11 −9)
@@ -23,7 +23,9 @@ Custom models can be one of two types, [**custom template**](concept-custom-temp
 ### Custom template model

-The custom template or custom form model relies on a consistent visual template to extract the labeled data. The accuracy of your model is affected by variances in the visual structure of your documents. Structured forms such as questionnaires or applications are examples of consistent visual templates. Your training set will consist of structured documents where the formatting and layout are static and constant from one document instance to the next. Custom template models support key-value pairs, selection marks, tables, signature fields and regions and can be trained on documents in any of the [supported languages](language-support.md). For more information, *see* [custom template models](concept-custom-template.md).
+The custom template or custom form model relies on a consistent visual template to extract the labeled data. The accuracy of your model is affected by variances in the visual structure of your documents. Structured forms such as questionnaires or applications are examples of consistent visual templates.
+
+Your training set will consist of structured documents where the formatting and layout are static and constant from one document instance to the next. Custom template models support key-value pairs, selection marks, tables, signature fields, and regions. Template models can be trained on documents in any of the [supported languages](language-support.md). For more information, *see* [custom template models](concept-custom-template.md).

 > [!TIP]
 >
@@ -33,15 +35,15 @@ Custom models can be one of two types, [**custom template**](concept-custom-temp
 ### Custom neural model

-The custom neural (custom document) model is a deep learning model type that relies on a base model trained on a large collection of documents. This model is then fine-tuned or adapted to your data when you train the model with a labeled dataset. Custom neural models support structured, semi-structured, and unstructured documents to extract fields. Custom neural models currently support English-language documents. When you're choosing between the two model types, start with a neural model if it meets your functional needs. See [neural models](concept-custom-neural.md) to learn more about custom document models.
+The custom neural (custom document) model uses deep learning models and a base model trained on a large collection of documents. This model is then fine-tuned or adapted to your data when you train the model with a labeled dataset. Custom neural models support structured, semi-structured, and unstructured documents to extract fields. Custom neural models currently support English-language documents. When you're choosing between the two model types, start with a neural model if it meets your functional needs. See [neural models](concept-custom-neural.md) to learn more about custom document models.

 ## Build mode

 The build custom model operation has added support for the *template* and *neural* custom models. Previous versions of the REST API and SDKs only supported a single build mode that is now known as the *template* mode.

 * Template models only accept documents that have the same basic page structure—a uniform visual appearance—or the same relative positioning of elements within the document.

-* Neural models support documents that have the same information, but different page structures. Examples of these documents include United States W2 forms, which share the same information, but may vary in appearance by the company that created the document. Neural models currently only support English text.
+* Neural models support documents that have the same information, but different page structures. Examples of these documents include United States W2 forms, which share the same information, but may vary in appearance across companies. Neural models currently only support English text.

 This table provides links to the build mode programming language SDK references and code samples on GitHub:
@@ -68,15 +70,15 @@ The table below compares custom template and custom neural features:
 The following tools are supported by Form Recognizer v2.1:
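The build-mode distinction described in this file's hunks (template vs. neural) maps to a single parameter in the v3 preview .NET SDK. The following is a minimal, hedged sketch only — the endpoint, key, and SAS URL placeholders are hypothetical, and exact type and method names (`StartBuildModelAsync`, `DocumentModel`) vary across `Azure.AI.FormRecognizer.DocumentAnalysis` preview versions:

```csharp
using System;
using Azure;
using Azure.AI.FormRecognizer.DocumentAnalysis;

// Hypothetical placeholder values; substitute your own resource endpoint and key.
string endpoint = "https://<your-resource>.cognitiveservices.azure.com/";
string key = "<your-key>";

var adminClient = new DocumentModelAdministrationClient(
    new Uri(endpoint), new AzureKeyCredential(key));

// SAS URI of the blob container holding your labeled training documents.
Uri trainingFilesUri = new Uri("<your-sas-url>");

// Pass DocumentBuildMode.Template for forms with a fixed visual layout,
// or DocumentBuildMode.Neural for documents whose structure varies.
BuildModelOperation operation = await adminClient.StartBuildModelAsync(
    trainingFilesUri, DocumentBuildMode.Template);
DocumentModel model = (await operation.WaitForCompletionAsync()).Value;

Console.WriteLine($"Model ID: {model.ModelId}");
```

Choosing `DocumentBuildMode.Neural` here is the only change needed to train a neural model on the same labeled dataset.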
articles/applied-ai-services/form-recognizer/quickstarts/try-v3-csharp-sdk.md (+4 −4)
@@ -152,7 +152,7 @@ Analyze and extract text, tables, structure, key-value pairs, and named entities
 > * We've added the file URI value to the `Uri fileUri` variable at the top of the script.
 > * For simplicity, not all of the entity fields that the service returns are shown here. To see the list of all supported fields and corresponding types, see the [General document](../concept-general-document.md#named-entity-recognition-ner-categories) concept page.

-### Add the following code to the Program.cs file:
+**Add the following code sample to the Program.cs file:**

 ```csharp
 using Azure;
@@ -268,7 +268,7 @@ for (int i = 0; i < result.Tables.Count; i++)
 ### General document model output

 Visit the Azure samples repository on GitHub to view the [general document model output](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/dotnet/FormRecognizer/v3-csharp-sdk-general-document-output.md).
-
+
 ___

 ## Layout model
@@ -280,7 +280,7 @@ Extract text, selection marks, text styles, table structures, and bounding regio
 > * We've added the file URI value to the `Uri fileUri` variable at the top of the script.
 > * To extract the layout from a given file at a URI, use the `StartAnalyzeDocumentFromUri` method and pass `prebuilt-layout` as the model ID. The returned value is an `AnalyzeResult` object containing data from the submitted document.

-#### Add the following code to the Program.cs file:
+**Add the following code sample to the Program.cs file:**

 ```csharp
 using Azure;
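The layout call this quickstart hunk describes — `StartAnalyzeDocumentFromUri` with `prebuilt-layout` returning an `AnalyzeResult` — can be sketched roughly as follows. This is an illustrative fragment, not the quickstart's full listing; the endpoint, key, and document URL placeholders are hypothetical, and exact names may differ between SDK preview versions:

```csharp
using System;
using Azure;
using Azure.AI.FormRecognizer.DocumentAnalysis;

// Hypothetical placeholder values; substitute your own resource endpoint and key.
string endpoint = "https://<your-resource>.cognitiveservices.azure.com/";
string key = "<your-key>";
var client = new DocumentAnalysisClient(new Uri(endpoint), new AzureKeyCredential(key));

Uri fileUri = new Uri("<your-document-url>");

// "prebuilt-layout" is the model ID; the completed operation yields an AnalyzeResult.
AnalyzeDocumentOperation operation =
    await client.StartAnalyzeDocumentFromUriAsync("prebuilt-layout", fileUri);
await operation.WaitForCompletionAsync();
AnalyzeResult result = operation.Value;

// Walk the extracted layout: pages, lines, and selection marks.
foreach (DocumentPage page in result.Pages)
{
    Console.WriteLine(
        $"Page {page.PageNumber}: {page.Lines.Count} lines, " +
        $"{page.SelectionMarks.Count} selection marks");
}
```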
@@ -383,7 +383,7 @@ Analyze and extract common fields from specific document types using a prebuilt
 > * To analyze a given file at a URI, use the `StartAnalyzeDocumentFromUri` method and pass `prebuilt-invoice` as the model ID. The returned value is an `AnalyzeResult` object containing data from the submitted document.
 > * For simplicity, not all of the key-value pairs that the service returns are shown here. To see the list of all supported fields and corresponding types, see our [Invoice](../concept-invoice.md#field-extraction) concept page.

-#### Add the following code to your Program.cs file:
+**Add the following code sample to your Program.cs file:**
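The prebuilt-invoice call in this hunk follows the same pattern as the layout call, with typed document fields on the result. A minimal sketch under the same assumptions (placeholder endpoint, key, and URL; field names such as `VendorName` and `InvoiceTotal` are taken from the invoice field-extraction reference and should be verified against the concept page linked above):

```csharp
using System;
using Azure;
using Azure.AI.FormRecognizer.DocumentAnalysis;

// Hypothetical placeholder values; substitute your own resource endpoint and key.
string endpoint = "https://<your-resource>.cognitiveservices.azure.com/";
string key = "<your-key>";
var client = new DocumentAnalysisClient(new Uri(endpoint), new AzureKeyCredential(key));

Uri fileUri = new Uri("<your-invoice-url>");

// Pass "prebuilt-invoice" as the model ID to get invoice-specific fields.
AnalyzeDocumentOperation operation =
    await client.StartAnalyzeDocumentFromUriAsync("prebuilt-invoice", fileUri);
await operation.WaitForCompletionAsync();
AnalyzeResult result = operation.Value;

// Each analyzed document exposes its extracted key-value pairs as Fields.
AnalyzedDocument invoice = result.Documents[0];
if (invoice.Fields.TryGetValue("VendorName", out DocumentField vendorName))
    Console.WriteLine($"Vendor: {vendorName.Content}");
if (invoice.Fields.TryGetValue("InvoiceTotal", out DocumentField total))
    Console.WriteLine($"Total: {total.Content}");
```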