
Commit ac0ffe2

curl fix
1 parent c1ff1e2 commit ac0ffe2

File tree

1 file changed: +11 −8 lines changed

src/content/docs/workers-ai/features/async-batch-api.mdx

Lines changed: 11 additions & 8 deletions
@@ -5,7 +5,7 @@ sidebar:
   order: 2
 ---
 
-import { Render, PackageManagers, WranglerConfig } from "~/components";
+import { Render, PackageManagers, WranglerConfig, CURL } from "~/components";
 
 This guide will walk you through the concepts behind asynchronous batch processing, explain why it matters, and show you how to create and deploy a Cloudflare Worker that leverages the Batch API with the AI binding and also working with REST API instead of a Cloudflare Worker.
 
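For orientation only (not part of this commit): the intro above describes a Worker that calls the Batch API through the AI binding. A minimal TypeScript sketch of that flow, assuming a binding named `AI` in the Wrangler config and that `env.AI.run()` accepts a `queueRequest: true` option mirroring the REST endpoint's `?queueRequest=true` query parameter; the `requests` payload shape is likewise an assumption, and the model name is taken from the doc's later example:

```ts
// Hypothetical Worker sketch: queue a batch of prompts through the AI binding.
// The queueRequest option and the requests payload shape are assumptions, not confirmed API.
export interface Env {
  AI: Ai; // Workers AI binding (assumed binding name "AI" in the Wrangler config)
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const queued = await env.AI.run(
      "@cf/meta/ray-llama-3.3-70b-instruct-fp8-fast", // model from the doc's example
      {
        requests: [
          { prompt: "Summarize the plot of Hamlet in one sentence." },
          { prompt: "Translate 'good morning' into French." },
        ],
      },
      { queueRequest: true }, // ask Workers AI to queue the batch instead of running it inline
    );

    // The queued response is expected to include an identifier to poll for results later.
    return Response.json(queued);
  },
} satisfies ExportedHandler<Env>;
```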
@@ -73,7 +73,7 @@ Send your initial batch inference request by composing a JSON payload containing
 
 :::note[Note]
 
-Ensure that the total payload is under 25 MB.
+Ensure that the total payload is under 10 MB.
 
 :::
 
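For illustration only (not part of this commit): the note above lowers the documented payload cap from 25 MB to 10 MB. A hedged sketch of a client-side pre-flight size check, where the 10 MB figure comes from the doc and the payload shape and helper name are assumptions:

```ts
// Hypothetical pre-flight check: verify the serialized batch payload stays under
// the 10 MB limit from the note before queueing it. Not a documented API.
const MAX_BATCH_PAYLOAD_BYTES = 10 * 1024 * 1024; // 10 MB cap from the doc

function assertBatchPayloadWithinLimit(payload: { requests: Array<{ prompt: string }> }): void {
  const bytes = new TextEncoder().encode(JSON.stringify(payload)).byteLength;
  if (bytes > MAX_BATCH_PAYLOAD_BYTES) {
    throw new Error(`Batch payload is ${bytes} bytes, above the ${MAX_BATCH_PAYLOAD_BYTES}-byte limit.`);
  }
}

// Example usage:
assertBatchPayloadWithinLimit({ requests: [{ prompt: "Hello" }] });
```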
@@ -268,12 +268,15 @@ If you prefer to work directly with the REST API instead of a Cloudflare Worker,
 
 Make a POST request to the following endpoint:
 
-```bash
-curl --request POST \
-  --url "https://api.cloudflare.com/client/v4/accounts/<account-id>/ai/run/@cf/meta/ray-llama-3.3-70b-instruct-fp8-fast?queueRequest=true"
-  --header "Authorization: <token>"
-  --header "Content-Type: application/json"
-```
+<CURL
+  url="https://api.cloudflare.com/client/v4/accounts/<account-id>/ai/run/@cf/meta/ray-llama-3.3-70b-instruct-fp8-fast?queueRequest=true"
+  method="POST"
+  headers={{
+    Authorization: "<token>",
+    "Content-Type": "application/json",
+  }}
+  code={{ mark: "value" }}
+/>
 
 #### Request Payload Example
 
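For illustration only (not part of this commit): the `<CURL>` component above renders the same POST to the `?queueRequest=true` endpoint that the removed curl block made. A hedged TypeScript `fetch` equivalent, with `<account-id>`, `<token>`, and the request body treated as placeholders/assumptions:

```ts
// Hypothetical fetch equivalent of the request rendered by the <CURL> component.
// <account-id> and <token> are placeholders; the body shape is assumed for illustration.
const accountId = "<account-id>";
const apiToken = "<token>";

const response = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/@cf/meta/ray-llama-3.3-70b-instruct-fp8-fast?queueRequest=true`,
  {
    method: "POST",
    headers: {
      Authorization: apiToken, // header formatted as in the doc's example
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      requests: [{ prompt: "Summarize the plot of Hamlet in one sentence." }],
    }),
  },
);

console.log(await response.json());
```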