articles/ai-services/document-intelligence/prebuilt/batch-analysis.md
9 additions & 0 deletions
@@ -73,12 +73,14 @@ The batch API supports two options for specifying the files to be processed.
}
```
+
* If you don't want to process all the files in a container or folder, but rather specific files in that container or folder, use the `azureBlobFileListSource` object. This operation requires a file list JSONL file that lists the files to be processed. Store the JSONL file in the root folder of the container. Here's an example JSONL file with two files listed:
```json
{"file": "Adatum Corporation.pdf"}
{"file": "Best For You Organics Company.pdf"}
```
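Because the file list is just newline-delimited JSON objects, it can be generated programmatically. A minimal sketch using the two file names from the example above (storing the result as a `.jsonl` file in the container's root folder, per the article; the exact file name you choose is up to you):

```python
# Build the file-list JSONL consumed by azureBlobFileListSource:
# one JSON object per line, each naming a file in the source container.
import json

files = ["Adatum Corporation.pdf", "Best For You Organics Company.pdf"]
jsonl = "\n".join(json.dumps({"file": name}) for name in files)

# Upload this string as a .jsonl file to the container's root folder.
print(jsonl)
```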
+
Use a file list `JSONL` file with the following conditions:
* When you need to process specific files instead of all files in a container;
@@ -98,8 +100,10 @@ Use a file list `JSONL` file with the following conditions:
}
```
+
A container URL or a container SAS URL is required in both options. Use a container URL if you're using a managed identity to access your storage container. If you're using a shared access signature (SAS), use a SAS URL.
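As an illustrative aside (not part of the API), the two URL styles are easy to tell apart: a SAS URL carries a signed query string, while a plain container URL has none. The URLs below are hypothetical:

```python
# Distinguish a plain container URL (used with managed identity) from a
# SAS URL (shared access signature) by the `sig` query parameter a SAS carries.
from urllib.parse import parse_qs, urlparse

def is_sas_url(container_url: str) -> bool:
    return "sig" in parse_qs(urlparse(container_url).query)

plain_url = "https://myaccount.blob.core.windows.net/source"
sas_url = plain_url + "?sv=2024-05-04&se=2025-01-01&sig=abc123"

print(is_sas_url(plain_url))  # → False
print(is_sas_url(sas_url))    # → True
```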
+
### 2. Specify the results location
* Specify the Azure Blob Storage container URL (or container SAS URL) where you want your results to be stored using the `resultContainerURL` parameter. We recommend using separate containers for source and results to prevent accidental overwriting.
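Putting the source and results settings together, the batch request body is a plain JSON object. A sketch: `azureBlobFileListSource`, `resultContainerURL`, and `overwriteExisting` appear in this article, but the `containerUrl` and `fileList` sub-fields are assumptions about the REST payload, so check them against the full request examples before use:

```python
# Hedged sketch of a batch request body; the sub-field names of
# azureBlobFileListSource are assumptions, not confirmed by this article.
import json

body = {
    "azureBlobFileListSource": {
        "containerUrl": "https://myaccount.blob.core.windows.net/source",
        "fileList": "filelist.jsonl",  # hypothetical file name
    },
    # Separate results container, per the article's recommendation.
    "resultContainerURL": "https://myaccount.blob.core.windows.net/results",
    "overwriteExisting": False,
}
print(json.dumps(body, indent=2))
```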
@@ -128,8 +132,10 @@ POST /documentModels/{modelId}:analyzeBatch
}
```
+
This example shows a POST request with `azureBlobFileListSource` and a file list input:
+
```bash
POST /documentModels/{modelId}:analyzeBatch
@@ -182,6 +188,7 @@ For each document processed, a status is assigned, either `succeeded`, `failed`,
Example of a `succeeded` status response:
+
```bash
{
"resultId": "myresultId-",
@@ -210,6 +217,7 @@ For each document processed, a status is assigned, either `succeeded`, `failed`,
}
}
```
+
* Status `failed`: This status is only returned if there are errors in the overall batch request. Once the batch analysis operation starts, the individual document operation status doesn't affect the status of the overall batch job, even if all the files have the status `failed`.
Example of a `failed` status response:
@@ -236,6 +244,7 @@ For each document processed, a status is assigned, either `succeeded`, `failed`,
]
...
```
+
* Status `skipped`: Typically, this status occurs when output for the document is already present in the specified output folder and the `overwriteExisting` Boolean property is set to `false`.
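Since the per-document statuses don't change the overall job status, a client typically tallies them itself when the batch completes. A sketch over a trimmed-down, hypothetical response; only the per-document `status` values (`succeeded`, `failed`, `skipped`) are taken from this article, and the surrounding `result.details` shape is an assumption based on the abbreviated examples above:

```python
# Tally per-document statuses from a batch-analysis response.
# The response shape here is a hypothetical, trimmed-down example.
from collections import Counter

response = {
    "result": {
        "details": [
            {"sourceUrl": "Adatum Corporation.pdf", "status": "succeeded"},
            {"sourceUrl": "Best For You Organics Company.pdf", "status": "skipped"},
        ]
    }
}

counts = Counter(doc["status"] for doc in response["result"]["details"])
print(dict(counts))  # → {'succeeded': 1, 'skipped': 1}
```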