> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
> * See our [**REST API**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) or [**C#**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**Java**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**JavaScript**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), or [**Python**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) SDK quickstarts to get started with the v3.0 version.
### Continuous deployment
After you've created your web app, you can enable the continuous deployment option:
* From the left pane, choose **Container settings**.
* In the main window, navigate to Continuous deployment and toggle between the **On** and **Off** buttons to set your preference:
In this article, you'll use the Form Recognizer REST API with the Sample Labeling tool to train a custom model.
* Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services)
* Once you have your Azure subscription, <a href="https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer" title="Create a Form Recognizer resource" target="_blank">create a Form Recognizer resource</a> in the Azure portal to get your key and endpoint. After it deploys, select **Go to resource**.
* You'll need the key and endpoint from the resource you create to connect your application to the Form Recognizer API. You'll paste your key and endpoint into the code later in the quickstart.
* You can use the free pricing tier (`F0`) to try the service, and upgrade later to a paid tier for production.
* A set of at least six forms of the same type. You'll use this data to train the model and test a form. You can use a [sample data set](https://go.microsoft.com/fwlink/?linkid=2090451) (download and extract *sample_data.zip*) for this quickstart. Upload the training files to the root of a blob storage container in a standard-performance-tier Azure Storage account.
You'll use the Docker engine to run the Sample Labeling tool. Follow these steps to set up the Docker container.
Install Docker on your machine by following the appropriate instructions for your operating system:
```console
docker run -it -p 3000:80 mcr.microsoft.com/azure-cognitive-services/custom-form/labeltool:latest-2.1 eula=accept
```
This command will make the sample-labeling tool available through a web browser. Go to `http://localhost:3000`.
> [!NOTE]
> You can also label documents and train models using the Form Recognizer REST API. To train and analyze with the REST API, see [Train with labels using the REST API and Python](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/python/FormRecognizer/rest/python-labeled-data.md).
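As the note mentions, training can also be driven directly over REST. The sketch below only assembles such a request; the `/formrecognizer/v2.1/custom/models` route, headers, and body fields are assumptions based on the v2.1 API shape, so verify them against the REST reference before use.

```python
import json

def build_train_request(endpoint, key, sas_url, use_label_file=True):
    """Assemble a v2.1 'train custom model' request (not sent here).

    The route and body shape are assumptions based on the Form
    Recognizer v2.1 REST API; check the official reference.
    """
    url = f"{endpoint.rstrip('/')}/formrecognizer/v2.1/custom/models"
    headers = {
        "Ocp-Apim-Subscription-Key": key,  # key from your Form Recognizer resource
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "source": sas_url,               # SAS URL of the training-data container
        "useLabelFile": use_label_file,  # True: train with the label files you created
    })
    return url, headers, body

# Hypothetical placeholder values, for illustration only.
url, headers, body = build_train_request(
    "https://<region>.api.cognitive.microsoft.com",
    "<your-key>",
    "https://<account>.blob.core.windows.net/<container>?<sas-token>")
```

Send the request with any HTTP client; training is asynchronous, so you then poll the operation the service returns until it completes.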
Next, you'll create tags (labels) and apply them to the text elements that you want the model to analyze.
1. Press Enter to save the tag.
1. In the main editor, select words from the highlighted text elements or a region you drew in.
1. Select the tag you want to apply, or press the corresponding keyboard key. The number keys are assigned as hotkeys for the first 10 tags. You can reorder your tags using the up and down arrow icons in the tag editor pane.
1. Follow these steps to label at least five of your forms.
> [!Tip]
> Keep the following tips in mind when you're labeling your forms:
>
:::image type="content" source="media/label-tool/main-editor-2-1.png" alt-text="Main editor window of Sample Labeling tool.":::
### Specify tag value types
You can set the expected data type for each tag. Open the context menu to the right of a tag and select a type from the menu. This feature allows the detection algorithm to make assumptions that will improve the text-detection accuracy. It also ensures that the detected values will be returned in a standardized format in the final JSON output. Value type information is saved in the **fields.json** file in the same path as your label files.
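For illustration, here's what reading such a file might look like. The exact schema is an assumption (inspect the **fields.json** your own project writes), so the content below is hypothetical:

```python
import json

# Hypothetical fields.json content; the real file the tool writes may
# include additional properties -- inspect your project's copy.
FIELDS_JSON = """
{
  "fields": [
    { "fieldKey": "Total",       "fieldType": "number" },
    { "fieldKey": "InvoiceDate", "fieldType": "date" }
  ]
}
"""

# Map each tag to its declared value type.
fields = json.loads(FIELDS_JSON)["fields"]
types_by_tag = {f["fieldKey"]: f["fieldType"] for f in fields}
```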
Choose the Train icon on the left pane to open the Training page. Then select the **Train** button to begin training with the labeled data.
After training finishes, examine the **Average Accuracy** value. If it's low, you should add more input documents and repeat the labeling steps. The documents you've already labeled will remain in the project index.
> [!TIP]
> You can also run the training process with a REST API call. To learn how to do this, see [Train with labels using Python](https://github.com/Azure-Samples/cognitive-services-quickstart-code/blob/master/python/FormRecognizer/rest/python-labeled-data.md).
With Model Compose, you can compose up to 100 models into a single model ID. When you call Analyze with the composed `modelID`, Form Recognizer first classifies the form you submitted, chooses the best-matching model, and then returns results for that model. This operation is useful when incoming forms may belong to one of several templates.
* To compose models in the Sample Labeling tool, select the Model Compose (merging arrow) icon from the navigation bar.
* Select the models you wish to compose together. Models with the arrows icon are already composed models.
* Choose the **Compose** button. In the pop-up, name your new composed model and select **Compose**.
* When the operation completes, your newly composed model should appear in the list.
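The steps above can also be performed over REST. This sketch only assembles the request; the `/compose` route and body fields are assumptions based on the v2.1 API shape, so verify them against the REST reference.

```python
import json

def build_compose_request(endpoint, key, model_ids, model_name):
    """Assemble a v2.1 'compose models' request (not sent here).

    Route and body assumed from the v2.1 REST API shape; a composed
    model holds at most 100 member models.
    """
    if len(model_ids) > 100:
        raise ValueError("a composed model holds at most 100 models")
    url = f"{endpoint.rstrip('/')}/formrecognizer/v2.1/custom/models/compose"
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    body = json.dumps({"modelIds": model_ids, "modelName": model_name})
    return url, headers, body
```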
Select the Analyze icon from the navigation bar to test your model. Select the **Local file** source, then browse for and select a file from the sample dataset that you unzipped in the test folder. Choose the **Run analysis** button to get key/value pair, text, and table predictions for the form. The tool applies tags in bounding boxes and reports the confidence of each tag.
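The tags and confidences the tool displays come back in the Analyze result JSON. The layout sketched below (`documentResults` with a per-field `confidence`) is an assumption based on the v2.1 response shape, and the sample data is hypothetical; a real response carries many more properties (bounding boxes, page info, raw OCR lines).

```python
# Trimmed, hypothetical v2.1 Analyze result for illustration.
sample_result = {
    "analyzeResult": {
        "documentResults": [
            {"fields": {
                "Total":    {"text": "$123.45", "confidence": 0.97},
                "Merchant": {"text": "Contoso", "confidence": 0.88},
            }}
        ]
    }
}

def field_confidences(result):
    """Map each tagged field to its reported confidence."""
    out = {}
    for doc in result["analyzeResult"]["documentResults"]:
        for name, field in doc["fields"].items():
            out[name] = field["confidence"]
    return out

confidences = field_confidences(sample_result)
```

A low confidence on a field is a hint to label more examples of that field and retrain.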
Go to your project settings page (slider icon) and take note of the security token name.
### Restore project credentials
When you want to resume your project, you first need to create a connection to the same blob storage container. To do so, repeat the steps. Then, go to the application settings page (gear icon) and see if your project's security token is there. If it isn't, add a new security token and copy over your token name and key from the previous step. Select **Save** to retain your settings.
> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the API migration guide for detailed information about migrating from v2.1 to v3.0.
> * See our [**REST API**](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) or [**C#**](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**Java**](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**JavaScript**](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), or [**Python**](get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) SDK quickstarts to get started with the v3.0 version.
Use the tags editor pane to create a new tag you'd like to identify:
1. In the main editor, select the total value from the highlighted text elements.
1. Select the Total tag to apply to the value, or press the corresponding keyboard key. The number keys are assigned as hotkeys for the first 10 tags. You can reorder your tags using the up and down arrow icons in the tag editor pane. Follow these steps to label all five forms in the sample dataset.
> [!Tip]
> Keep the following tips in mind when you're labeling your forms:
> * To remove an applied tag without deleting the tag itself, select the tagged rectangle on the document view and press the delete key.
>
:::image type="content" source="../media/label-tool/custom-1.jpg" alt-text="Label the samples.":::
#### Train a custom model
Choose the Train icon on the left pane to open the Training page. Then select the **Train** button to begin training with the labeled data.
#### Analyze a custom form
1. Select the **Analyze** icon from the navigation bar to test your model.
1. Select source **Local file** and browse for a file to select from the sample dataset that you unzipped in the test folder.
> * For an enhanced experience and advanced model quality, try the [Form Recognizer v3.0 Studio](https://formrecognizer.appliedai.azure.com/studio).
> * The v3.0 Studio supports any model trained with v2.1 labeled data.
> * You can refer to the [API migration guide](v3-migration-guide.md) for detailed information about migrating from v2.1 to v3.0.
> * See our [**REST API**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) or [**C#**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**Java**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), [**JavaScript**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true), or [**Python**](quickstarts/get-started-sdks-rest-api.md?view=form-recog-3.0.0&preserve-view=true) SDK quickstarts to get started with v3.0.