Commit 3e0075a

add Azure OpenAI Batch API demo

1 parent 8d58c3a · commit 3e0075a
File tree

7 files changed: +219 −1 lines changed

.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -13,4 +13,4 @@ repos:
     rev: 24.2.0
     hooks:
       - id: black
-        exclude: 'generated/.*|artifacts/.*'
+        exclude: 'generated/.*|artifacts/.*|.jsonl'
```
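Black's `exclude` option is a single regular expression matched against file paths, so the appended `.jsonl` alternative matches "jsonl" preceded by any one character, not just the file extension. A small illustrative sketch of those semantics (the file paths are made up for the example; an anchored form such as `\.jsonl$` would match only the extension):

```python
import re

# The combined exclude regex from the diff above.
pattern = re.compile(r"generated/.*|artifacts/.*|.jsonl")

# Generated code and .jsonl files are excluded, as intended:
assert pattern.search("generated/client.py")
assert pattern.search("data/input.jsonl")

# The unescaped dot also matches a path separator before "jsonl":
assert pattern.search("some/jsonl/module.py")

# Unrelated Python files are still checked:
assert pattern.search("apps/main.py") is None
```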

README.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -31,4 +31,5 @@ Here are the preferred tools for development.
 | [5_streamlit_query_chat_history](./apps/5_streamlit_query_chat_history/README.md) | Search Chat History | ![5_streamlit_query_chat_history](./docs/images/5_streamlit_query_chat_history.main.png) |
 | [6_call_azure_ai_search](./apps/6_call_azure_ai_search/README.md) | Call Azure AI Search from Python | No Image |
 | [7_streamlit_chat_rag](./apps/7_streamlit_chat_rag/README.md) | Add RAG feature to Streamlit chat app | ![7_streamlit_chat_rag](./docs/images/7_streamlit_chat_rag.main.png) |
+| [8_streamlit_azure_openai_batch](./apps/8_streamlit_azure_openai_batch/README.md) | Call Azure OpenAI Batch API with Streamlit | ![8_streamlit_azure_openai_batch](./docs/images/8_streamlit_azure_openai_batch.main.png) |
 | [99_streamlit_examples](./apps/99_streamlit_examples/README.md) | Code samples for Streamlit | ![99_streamlit_examples](./docs/images/99_streamlit_examples.explaindata.png) |
```
apps/8_streamlit_azure_openai_batch/README.md

Lines changed: 37 additions & 0 deletions

# Call Azure OpenAI Batch API with Streamlit

This app demonstrates how to call the Azure OpenAI Batch API from Streamlit.

## Prerequisites

- Python 3.10 or later
- Azure OpenAI Service ([Global batch deployment](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/batch?tabs=standard-input&pivots=programming-language-python))

## Usage

1. Get an Azure OpenAI Service API key
1. Copy [.env.template](../../.env.template) to `.env` in the same directory
1. Set credentials in `.env`
1. Run [main.py](./main.py)

```shell
# Create a virtual environment
$ python -m venv .venv

# Activate the virtual environment
$ source .venv/bin/activate

# Install dependencies
$ pip install -r requirements.txt

# Run the script
$ python -m streamlit run apps/8_streamlit_azure_openai_batch/main.py
```

### Example

![Streamlit Chat](../../docs/images/8_streamlit_azure_openai_batch.main.png)

## References

- [Getting started with Azure OpenAI global batch deployments (preview)](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/batch?tabs=standard-input&pivots=programming-language-python)
apps/8_streamlit_azure_openai_batch/main.py

Lines changed: 174 additions & 0 deletions

```python
import base64
import json
from os import getenv

import streamlit as st
from dotenv import load_dotenv
from openai import AzureOpenAI

load_dotenv()

with st.sidebar:
    azure_openai_endpoint = st.text_input(
        label="AZURE_OPENAI_ENDPOINT",
        value=getenv("AZURE_OPENAI_ENDPOINT"),
        key="AZURE_OPENAI_ENDPOINT",
        type="default",
    )
    azure_openai_api_key = st.text_input(
        label="AZURE_OPENAI_API_KEY",
        # DEBUG
        value=getenv("AZURE_OPENAI_API_KEY"),
        key="AZURE_OPENAI_API_KEY",
        type="password",
    )
    azure_openai_api_version = st.text_input(
        label="AZURE_OPENAI_API_VERSION",
        value=getenv("AZURE_OPENAI_API_VERSION"),
        key="AZURE_OPENAI_API_VERSION",
        type="default",
    )
    azure_openai_gpt_model = st.text_input(
        label="AZURE_OPENAI_GPT_MODEL",
        value=getenv("AZURE_OPENAI_GPT_MODEL"),
        key="AZURE_OPENAI_GPT_MODEL",
        type="default",
    )
    "[Azure Portal](https://portal.azure.com/)"
    "[Azure OpenAI Studio](https://oai.azure.com/resource/overview)"
    "[View the source code](https://github.com/ks6088ts-labs/workshop-azure-openai/blob/main/apps/8_streamlit_azure_openai_batch/main.py)"

client = AzureOpenAI(
    api_key=azure_openai_api_key,
    api_version=azure_openai_api_version,
    azure_endpoint=azure_openai_endpoint,
)

st.title("8_streamlit_azure_openai_batch")

if not azure_openai_api_key or not azure_openai_endpoint or not azure_openai_api_version or not azure_openai_gpt_model:
    st.warning("Please fill in the required fields in the sidebar.")
    st.stop()

# ---------------
# Upload batch file
# ---------------
st.header("Upload batch file")
st.info("Upload a file in JSON lines format (.jsonl)")
uploaded_file = st.file_uploader("Upload an input file in JSON lines format", type=("jsonl"))
if uploaded_file:
    bytes_data = uploaded_file.read()
    st.write(bytes_data.decode().split("\n"))
    submit_button = st.button("Submit", key="submit")
    if submit_button:
        temp_file_path = "tmp.jsonl"
        with open(temp_file_path, "wb") as f:
            f.write(bytes_data)
        with st.spinner("Uploading..."):
            try:
                response = client.files.create(
                    # FIXME: hardcoded for now, use uploaded_file
                    file=open(temp_file_path, "rb"),
                    purpose="batch",
                )
                st.write(response.model_dump())
            except Exception as e:
                st.error(e)
st.markdown("---")

# ---------------
# Track file upload status
# ---------------
st.header("Track file upload status")
st.info("Track the file upload status using the file ID.")
track_file_id = st.text_input(
    label="File ID",
    key="track_file_id",
    help="Enter the file ID to track the file upload status",
)
track_button = st.button("Track")
if track_file_id != "" and track_button:
    with st.spinner("Tracking..."):
        try:
            response = client.files.retrieve(track_file_id)
            st.write(response.model_dump())
            st.write(f"status: {response.status}")
        except Exception as e:
            st.error(e)
st.markdown("---")

# ---------------
# Create batch job
# ---------------
st.header("Create batch job")
st.info("Create a batch job using the file ID")
batch_file_id = st.text_input(
    label="File ID",
    key="batch_file_id",
    help="Enter the file ID to create a batch job from",
)
batch_button = st.button("Create batch job")
if batch_file_id != "" and batch_button:
    with st.spinner("Creating..."):
        try:
            response = client.batches.create(
                input_file_id=batch_file_id,
                endpoint="/chat/completions",
                completion_window="24h",
            )
            st.write(response.model_dump())
        except Exception as e:
            st.error(e)
st.markdown("---")

# ---------------
# Track batch job progress
# ---------------
st.header("Track batch job progress")
st.info("Track the batch job progress using the job ID")
track_batch_job_id = st.text_input(
    label="Batch job ID",
    key="track_batch_job_id",
    help="Enter the batch job ID to track the job progress",
)
track_batch_job_button = st.button("Track batch job")
if track_batch_job_id != "" and track_batch_job_button:
    with st.spinner("Tracking..."):
        try:
            response = client.batches.retrieve(track_batch_job_id)
            st.write(response.model_dump())
            st.write(f"status: {response.status}")
            st.write(f"output_file_id: {response.output_file_id}")
        except Exception as e:
            st.error(e)
st.markdown("---")

# ---------------
# Retrieve batch job output file
# ---------------
st.header("Retrieve batch job output file")
st.info("Retrieve the batch job output file using the output file ID")
output_file_id = st.text_input(
    label="output_file_id",
    key="retrieve_batch_job_id",
    help="Enter the output file ID to retrieve the output file",
)
retrieve_batch_job_button = st.button("Retrieve batch job output file")
if output_file_id != "" and retrieve_batch_job_button:
    with st.spinner("Retrieving..."):
        try:
            file_response = client.files.content(output_file_id)
            raw_responses = file_response.text.strip().split("\n")

            for raw_response in raw_responses:
                json_response = json.loads(raw_response)
                st.write(json_response)

                output_encoded = base64.b64encode(json.dumps(json_response, indent=2).encode()).decode()
                # Generate a link to download the result
                st.markdown(
                    f'<a href="data:file/txt;base64,{output_encoded}" download="{output_file_id}.json">Download Result</a>',
                    unsafe_allow_html=True,
                )
        except Exception as e:
            st.error(e)
```
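The retrieve step above embeds each result as a base64-encoded data URI inside an HTML anchor. That logic can be isolated into a small standalone helper; a sketch for illustration (`make_download_link` is a name introduced here, not a function in the app):

```python
import base64
import json


def make_download_link(payload: dict, file_id: str) -> str:
    """Build an HTML anchor that embeds `payload` as a base64 data URI,
    mirroring the download link generated in the retrieve step."""
    encoded = base64.b64encode(json.dumps(payload, indent=2).encode()).decode()
    return f'<a href="data:file/txt;base64,{encoded}" download="{file_id}.json">Download Result</a>'


link = make_download_link({"custom_id": "task-0", "status": "completed"}, "file-abc")

# The payload round-trips through the data URI:
raw = link.split("base64,")[1].split('"')[0]
assert json.loads(base64.b64decode(raw)) == {"custom_id": "task-0", "status": "completed"}
```

Because the JSON travels inside the `href` itself, no file needs to be written on the server to offer the download.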
New file: batch input sample (JSON lines)

Lines changed: 3 additions & 0 deletions

```jsonl
{"custom_id": "task-0", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "When was Microsoft founded?"}]}}
{"custom_id": "task-1", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "When was the first XBOX released?"}]}}
{"custom_id": "task-2", "method": "POST", "url": "/chat/completions", "body": {"model": "gpt-4o", "messages": [{"role": "system", "content": "You are an AI assistant that helps people find information."}, {"role": "user", "content": "What is Altair Basic?"}]}}
```
docs/images/8_streamlit_azure_openai_batch.main.png (binary image file, 105 KB)

pyproject.toml

Lines changed: 3 additions & 0 deletions

```diff
@@ -36,6 +36,9 @@ build-backend = "poetry.core.masonry.api"
 [tool.ruff]
 line-length = 120
 target-version = "py310"
+exclude = [
+    ".jsonl"
+]
 
 [tool.ruff.lint]
 select = ["E", "F", "I", "UP"]
```
