articles/ai-services/openai/how-to/batch.md
1 addition & 1 deletion
@@ -85,7 +85,7 @@ In the Studio UI the deployment type will appear as `Global-Batch`.
 > [!TIP]
 > Each line of your input file for batch processing has a `model` attribute that requires a global batch **deployment name**. For a given input file, all names must be the same deployment name. This is different from OpenAI where the concept of model deployments does not exist.
 >
-> For the best performance we recommend submitting large files for patch processing, rather than a large number of small files with only a few lines in each file.
+> For the best performance we recommend submitting large files for batch processing, rather than a large number of small files with only a few lines in each file.
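The tip in this diff concerns the JSONL input file for global batch: every line's `model` field must name the same global batch deployment. A minimal sketch of building such a file is below; the deployment name `my-global-batch-deployment` and the prompts are illustrative placeholders, not values from the changed article.

```python
import json

# Hypothetical global batch deployment name -- every line's "model" field
# must carry the same *deployment name*, not a base model name.
DEPLOYMENT = "my-global-batch-deployment"

prompts = ["Hello!", "Summarize batch processing in one sentence."]

# One request object per line; this mirrors the batch input shape where
# each line carries its own "model" attribute.
requests = [
    {
        "custom_id": f"task-{i}",
        "method": "POST",
        "url": "/chat/completions",
        "body": {
            "model": DEPLOYMENT,
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(prompts)
]

# Serialize as JSONL: one JSON object per line, no trailing comma or array.
jsonl = "\n".join(json.dumps(r) for r in requests)
print(jsonl)
```

Consistent with the doc's advice, you would accumulate many such lines into one large file rather than submitting many tiny files.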