Commit 2259ce7

Browse files
jr-MS and mrbullwinkle authored
Update articles/cognitive-services/Anomaly-Detector/How-to/batch-inference.md
Co-authored-by: Michael <[email protected]>
1 parent 3985da0 commit 2259ce7

File tree

1 file changed: +1 −1 lines changed


articles/cognitive-services/Anomaly-Detector/How-to/batch-inference.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ You could choose the batch inference API, or the streaming inference API for det
 
 To perform batch inference, provide the blob URL containing the inference data, the start time, and end time. For inference data volume, at least `1 sliding window` length and at most **20000** timestamps.
 
-To get a better performance, we recommend you to send out no more than 150,000 data points for a batch inference. *(Data points = Number of variables * Number of timestamps)*
+To get better performance, we recommend you send out no more than 150,000 data points per batch inference. *(Data points = Number of variables * Number of timestamps)*
 
 This inference is asynchronous, so the results aren't returned immediately. Notice that you need to save in a variable the link of the results in the **response header** which contains the `resultId`, so that you may know where to get the results afterwards.
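The changed line states a sizing rule: data points = number of variables × number of timestamps, with a recommended cap of 150,000 data points per batch request and a hard cap of 20,000 timestamps. A minimal sketch of that arithmetic (the helper name and the example sliding-window value are illustrative, not part of the Anomaly Detector SDK):

```python
MAX_TIMESTAMPS = 20_000           # documented per-request timestamp cap
RECOMMENDED_MAX_POINTS = 150_000  # recommended data-point cap for good performance

def within_limits(num_variables: int, num_timestamps: int, sliding_window: int) -> bool:
    """Data points = number of variables * number of timestamps."""
    data_points = num_variables * num_timestamps
    return (
        sliding_window <= num_timestamps <= MAX_TIMESTAMPS
        and data_points <= RECOMMENDED_MAX_POINTS
    )

# 10 variables * 14,000 timestamps = 140,000 data points -> within the recommendation
print(within_limits(10, 14_000, sliding_window=300))   # True
# 10 variables * 16,000 timestamps = 160,000 data points -> exceeds it
print(within_limits(10, 16_000, sliding_window=300))   # False
```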

0 commit comments

Comments
 (0)