Commit 70a7215

add batch api example to readme (#331)
1 parent 1b6f4b0 commit 70a7215

1 file changed: +27 −0 lines changed

README.md

### Batch Inference

The batch API lets you submit large inference jobs for completion with a 24-hour turnaround time; the example below walks through the workflow. To learn more, refer to the [docs here](https://docs.together.ai/docs/batch-inference).
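The batch job reads a JSONL input file with one request per line. Here is a minimal sketch of building such a file; the `custom_id`/`body` field names, the model name, and the sample questions are illustrative assumptions, so check the batch inference docs for the exact schema:

```python
import json

# Build a batch input file: one JSON-encoded request per line.
# NOTE: the "custom_id"/"body" schema and the model name below are
# assumptions for illustration; see the batch inference docs for the
# exact format the API expects.
questions = [
    "What is the capital of France?",
    "Who wrote 'The Old Man and the Sea'?",
]

with open("simpleqa_batch_student.jsonl", "w") as f:
    for i, question in enumerate(questions):
        request = {
            "custom_id": f"request-{i}",
            "body": {
                "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo",
                "messages": [{"role": "user", "content": question}],
                "max_tokens": 256,
            },
        }
        f.write(json.dumps(request) + "\n")
```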
```python
from together import Together

client = Together()

# Upload the batch input file
batch_file = client.files.upload(file="simpleqa_batch_student.jsonl", purpose="batch-api")

# Create the batch job
batch = client.batches.create_batch(file_id=batch_file.id, endpoint="/v1/chat/completions")

# Check the batch status
batch_stat = client.batches.get_batch(batch.id)

# List all batches (includes batches from other jobs as well)
client.batches.list_batches()

# Download the output file once the job has completed
if batch_stat.status == 'COMPLETED':
    output_response = client.files.retrieve_content(
        id=batch_stat.output_file_id,
        output="simpleqa_v3_output.jsonl",
    )
```
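`get_batch` reports the status at a single point in time, so in practice you may want to poll until the job reaches a terminal state. Below is a hedged sketch of such a helper; the helper itself and the set of terminal status strings are assumptions, not part of the SDK:

```python
import time

# Assumed set of terminal states; adjust to whatever the batch docs list.
TERMINAL_STATUSES = {"COMPLETED", "FAILED", "EXPIRED", "CANCELLED"}

def wait_for_batch(client, batch_id, interval=30.0, timeout=24 * 60 * 60):
    """Poll client.batches.get_batch() until the job finishes or times out.

    A hypothetical convenience wrapper around the SDK's get_batch call,
    not an official API.
    """
    deadline = time.monotonic() + timeout
    while True:
        batch = client.batches.get_batch(batch_id)
        if str(batch.status).upper() in TERMINAL_STATUSES:
            return batch
        if time.monotonic() >= deadline:
            raise TimeoutError(f"batch {batch_id} still {batch.status} after {timeout}s")
        time.sleep(interval)
```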
## Usage – CLI

### Chat Completions
