Commit 40f7cc6

use includes for the quickstart etc

1 parent 211bbf9 commit 40f7cc6

File tree

4 files changed: +160 -101 lines changed

articles/ai-services/openai/concepts/video-generation.md

Lines changed: 4 additions & 1 deletion
@@ -6,7 +6,7 @@ ms.author: pafarley
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: conceptual
-ms.date: 05/22/2025
+ms.date: 5/29/2025
 ---
 
 # Sora video generation (preview)
@@ -49,3 +49,6 @@ Sora has a robust safety stack including content filtering, abuse monitoring, se
 
 Sora doesn't generate scenes with acts of violence but can generate adjacent content, such as realistic war-like footage.
 
+## Related content
+- [Video generation quickstart](../video-generation/video-generation-quickstart.md)
+- [Image generation quickstart](../dall-e-quickstart.md)
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+---
+manager: nitinme
+author: eric-urban
+ms.author: eur
+ms.service: azure-ai-openai
+ms.topic: include
+ms.date: 5/29/2025
+---
+
+In this quickstart, you generate video clips using the Azure OpenAI service. The example uses the Sora model, which is a video generation model that creates realistic and imaginative video scenes from text instructions. This guide shows you how to create a video generation job, poll for its status, and retrieve the generated video.
+
+For more information on video generation, see [Video generation concepts](../concepts/video-generation.md).
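The intro above promises a three-step flow: create a job, poll its status, retrieve the video. As a reference for how those steps address the service, here's a minimal sketch of just the URL construction they share, assuming the `openai/v1/video/generations` preview routes used by the quickstart code in this commit (the helper names are illustrative, not part of any SDK):

```python
from typing import Optional

def job_url(endpoint: str, job_id: Optional[str] = None) -> str:
    """Job URL: POST without job_id creates a job; GET with job_id polls it."""
    base = f"{endpoint}/openai/v1/video/generations/jobs"
    path = f"{base}/{job_id}" if job_id else base
    return f"{path}?api-version=preview"

def generation_url(endpoint: str, generation_id: str) -> str:
    """URL for retrieving a finished generation's download information."""
    return f"{endpoint}/openai/v1/video/generations/{generation_id}?api-version=preview"
```

These mirror the create, poll, and retrieve URLs that appear in the quickstart code elsewhere in this commit.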
Lines changed: 135 additions & 0 deletions
@@ -0,0 +1,135 @@
+---
+manager: nitinme
+author: eric-urban
+ms.author: eur
+ms.service: azure-ai-openai
+ms.topic: include
+ms.date: 5/29/2025
+---
+
+[!INCLUDE [Video generation introduction](video-generation-intro.md)]
+
+## Prerequisites
+
+- An Azure subscription. <a href="https://azure.microsoft.com/free/ai-services" target="_blank">Create one for free</a>.
+- <a href="https://www.python.org/" target="_blank">Python 3.8 or later</a>. We recommend Python 3.10 or later; 3.8 is the minimum. If you don't have a suitable version of Python installed, follow the instructions in the [VS Code Python Tutorial](https://code.visualstudio.com/docs/python/python-tutorial#_install-a-python-interpreter) for the easiest way to install Python on your operating system.
+- An Azure OpenAI resource created in one of the supported regions. For more information about region availability, see the [models and versions documentation](/azure/ai-services/openai/concepts/models#video-generation-models).
+- A `sora` model deployed to your Azure OpenAI resource. For more information, see [Create a resource and deploy a model with Azure OpenAI](../how-to/create-resource.md).
+
+## Microsoft Entra ID prerequisites
+
+For the recommended keyless authentication with Microsoft Entra ID, you need to:
+
+- Install the [Azure CLI](/cli/azure/install-azure-cli) used for keyless authentication with Microsoft Entra ID.
+- Assign the `Cognitive Services User` role to your user account. You can assign roles in the Azure portal under **Access control (IAM)** > **Add role assignment**.
+
+## Set up
+
+1. Create a new folder `video-generation-quickstart` and go to the quickstart folder with the following command:
+
+    ```shell
+    mkdir video-generation-quickstart && cd video-generation-quickstart
+    ```
+
+1. Create a virtual environment. If you already have Python 3.10 or later installed, you can create a virtual environment using the following commands:
+
+    # [Windows](#tab/windows)
+
+    ```bash
+    py -3 -m venv .venv
+    .venv\scripts\activate
+    ```
+
+    # [Linux](#tab/linux)
+
+    ```bash
+    python3 -m venv .venv
+    source .venv/bin/activate
+    ```
+
+    # [macOS](#tab/macos)
+
+    ```bash
+    python3 -m venv .venv
+    source .venv/bin/activate
+    ```
+
+    ---
+
+    Activating the Python environment means that when you run `python` or `pip` from the command line, you use the Python interpreter contained in the `.venv` folder of your application. You can use the `deactivate` command to exit the Python virtual environment and later reactivate it when needed.
+
+    > [!TIP]
+    > We recommend that you create and activate a new Python environment to install the packages you need for this tutorial. Don't install packages into your global Python installation; always use a virtual or conda environment, otherwise you can break your global installation of Python.
+
+1. For the **recommended** keyless authentication with Microsoft Entra ID, install the `azure-identity` package with:
+
+    ```console
+    pip install azure-identity
+    ```
+
+## Retrieve resource information
+
+[!INCLUDE [resource authentication](resource-authentication.md)]
+
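The resource-authentication include referenced above supplies connection values through environment variables. As a hypothetical illustration (this helper is not part of the quickstart), required settings such as `AZURE_OPENAI_ENDPOINT` can be read defensively so a missing value fails with a clear message:

```python
import os
from typing import Mapping, Optional

def read_setting(name: str, env: Optional[Mapping[str, str]] = None) -> str:
    """Read a required setting, failing loudly if it's unset or blank."""
    source = os.environ if env is None else env
    value = source.get(name, "").strip()
    if not value:
        raise KeyError(f"Set the {name} environment variable before running the quickstart.")
    return value
```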
+## Generate video from text input
+
+## [Microsoft Entra ID](#tab/keyless)
+
+1. Create the `sora-quickstart.py` file with the following code:
+
+    ```python
+    import requests
+    import os
+    from azure.identity import DefaultAzureCredential
+
+    # Set environment variables or edit the corresponding values here.
+    endpoint = os.environ['AZURE_OPENAI_ENDPOINT']
+
+    # Keyless authentication
+    credential = DefaultAzureCredential()
+    token = credential.get_token("https://cognitiveservices.azure.com/.default")
+
+    headers = { "Authorization": f"Bearer {token.token}", "Content-Type": "application/json" }
+
+    # Create a video generation job (preview API)
+    create_url = f"{endpoint}/openai/v1/video/generations/jobs?api-version=preview"
+    payload = {"prompt": "A cat playing piano in a jazz bar.", "model": "sora",
+               "width": 1080, "height": 1080, "n_seconds": 10}
+    response = requests.post(create_url, headers=headers, json=payload)
+    response.raise_for_status()
+    print(f"Job created: {response.json()['body']['id']}")
+    ```
+
+1. Run the Python file.
+
+    ```shell
+    python sora-quickstart.py
+    ```
+
+## [API key](#tab/api-key)
+
+1. Create the `sora-quickstart.py` file with the following code:
+
+    ```python
+    import requests
+    import os
+
+    # Set environment variables or edit the corresponding values here.
+    endpoint = os.environ['AZURE_OPENAI_ENDPOINT']
+    api_key = os.environ['AZURE_OPENAI_API_KEY']
+
+    headers = { "api-key": api_key, "Content-Type": "application/json" }
+
+    # Create a video generation job (preview API)
+    create_url = f"{endpoint}/openai/v1/video/generations/jobs?api-version=preview"
+    payload = {"prompt": "A cat playing piano in a jazz bar.", "model": "sora",
+               "width": 1080, "height": 1080, "n_seconds": 10}
+    response = requests.post(create_url, headers=headers, json=payload)
+    response.raise_for_status()
+    print(f"Job created: {response.json()['body']['id']}")
+    ```
+
+1. Run the Python file.
+
+    ```shell
+    python sora-quickstart.py
+    ```
+
+---
+
+Wait a few moments to get the response.
+
+### Output
articles/ai-services/openai/video-generation-quickstart.md

Lines changed: 9 additions & 100 deletions
@@ -7,112 +7,21 @@ ms.service: azure-ai-openai
 ms.topic: quickstart
 author: PatrickFarley
 ms.author: pafarley
-ms.date: 05/22/2025
+ms.date: 5/29/2025
 ---
 
 # Quickstart: Generate a video with Sora (preview)
 
-In this quickstart, you generate video clips using the Azure OpenAI service. The example uses the Sora model, which is a video generation model that creates realistic and imaginative video scenes from text instructions. This guide shows you how to create a video generation job, poll for its status, and retrieve the generated video.
+[!INCLUDE [REST API quickstart](includes/audio-completions-rest.md)]
 
-For more information on video generation, see [Video generation concepts](./concepts/video-generation.md).
+## Clean-up resources
 
+If you want to clean up and remove an Azure OpenAI resource, you can delete the resource. Before deleting the resource, you must first delete any deployed models.
 
-## Prerequisites
+- [Azure portal](../multi-service-resource.md?pivots=azportal#clean-up-resources)
+- [Azure CLI](../multi-service-resource.md?pivots=azcli#clean-up-resources)
 
-- An Azure subscription. <a href="https://azure.microsoft.com/free/ai-services" target="_blank">Create one for free</a>.
-- <a href="https://www.python.org/" target="_blank">Python 3.8 or later version</a>.
-- An Azure OpenAI resource created in a supported region. See [Region availability](/azure/ai-services/openai/concepts/models#model-summary-table-and-region-availability).
-- Then, you need to deploy a `sora` model with your Azure resource. For more information, see [Create a resource and deploy a model with Azure OpenAI](./how-to/create-resource.md).
-- [Python 3.8 or later version](https://www.python.org/).
+## Related content
 
-
-## Setup
-
-### Retrieve key and endpoint
-
-To successfully call the Azure OpenAI APIs, you need the following information about your Azure OpenAI resource:
-
-| Variable | Name | Value |
-|---|---|---|
-| **Endpoint** | `api_base` | The endpoint value is located under **Keys and Endpoint** for your resource in the Azure portal. You can also find the endpoint via the **Deployments** page in Azure AI Foundry portal. An example endpoint is: `https://docs-test-001.openai.azure.com/`. |
-| **Key** | `api_key` | The key value is also located under **Keys and Endpoint** for your resource in the Azure portal. Azure generates two keys for your resource. You can use either value. |
-
-Go to your resource in the Azure portal. On the navigation pane, select **Keys and Endpoint** under **Resource Management**. Copy the **Endpoint** value and an access key value. You can use either the **KEY 1** or **KEY 2** value. Always having two keys allows you to securely rotate and regenerate keys without causing a service disruption.
-
-:::image type="content" source="./media/quickstarts/endpoint.png" alt-text="Screenshot that shows the Keys and Endpoint page for an Azure OpenAI resource in the Azure portal." lightbox="./media/quickstarts/endpoint.png":::
-
-
-[!INCLUDE [environment-variables](./includes/environment-variables.md)]
-
-
-## Create a new Python application
-
-Create a new Python file named `quickstart.py`. Open the new file in your preferred editor or IDE.
-1. Replace the contents of `quickstart.py` with the following code. Change the value of `prompt` to your preferred text.
-
-    ```python
-    import os
-    import time
-    import requests
-
-    # Set these variables with your values
-    endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]  # e.g., "https://docs-test-001.openai.azure.com"
-    api_key = os.environ["AZURE_OPENAI_KEY"]
-    access_token = os.environ.get("AZURE_OPENAI_TOKEN")  # Optional: if using Azure AD auth
-
-    headers = {
-        "Content-Type": "application/json",
-        "api-key": api_key,
-    }
-    if access_token:
-        headers["Authorization"] = f"Bearer {access_token}"
-
-    # 1. Create a video generation job
-    create_url = f"{endpoint}/openai/v1/video/generations/jobs?api-version=preview"
-    payload = {
-        "prompt": "A cat playing piano in a jazz bar.",
-        "model": "sora",
-        "width": 1080,
-        "height": 1080,
-        "n_seconds": 10
-    }
-    response = requests.post(create_url, headers=headers, json=payload)
-    response.raise_for_status()
-    job_id = response.json()["body"]["id"]
-    print(f"Job created: {job_id}")
-
-    # 2. Poll for job status
-    status_url = f"{endpoint}/openai/v1/video/generations/jobs/{job_id}?api-version=preview"
-    while True:
-        status_response = requests.get(status_url, headers=headers)
-        status_response.raise_for_status()
-        status = status_response.json()["body"]["status"]
-        print(f"Job status: {status}")
-        if status == "succeeded":
-            generations = status_response.json()["body"].get("generations", [])
-            if not generations:
-                raise Exception("No generations found in job result.")
-            generation_id = generations[0]["id"]
-            break
-        elif status in ("failed", "cancelled"):
-            raise Exception(f"Job did not succeed. Status: {status}")
-        time.sleep(5)  # Wait before polling again
-
-    # 3. Retrieve the generated video
-    get_video_url = f"{endpoint}/openai/v1/video/generations/{generation_id}?api-version=preview"
-    video_response = requests.get(get_video_url, headers=headers)
-    video_response.raise_for_status()
-    download_url = video_response.json()["body"]["generations"]
-    print(f"Download your video at: {download_url}")
-    ```
-1. Run the application with the `python` command:
-
-    ```console
-    python quickstart.py
-    ```
-
-Wait a few moments to get the response.
-
----
+* Learn more about Azure OpenAI [deployment types](./how-to/deployment-types.md).
+* Learn more about Azure OpenAI [quotas and limits](quotas-limits.md).
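The quickstart content removed in the diff above contained the job-status polling loop, which now lives in an include. As a service-agnostic sketch of that fixed-interval pattern (`fetch_status` is an illustrative stand-in for the GET call to the job-status endpoint, not part of the docs):

```python
import time
from typing import Callable

def poll_until_done(fetch_status: Callable[[], str], interval_s: float = 5, max_polls: int = 120) -> str:
    """Poll a job-status callable until it reaches a terminal state."""
    for _ in range(max_polls):
        status = fetch_status()
        if status == "succeeded":
            return status
        if status in ("failed", "cancelled"):
            raise RuntimeError(f"Job did not succeed. Status: {status}")
        time.sleep(interval_s)  # wait before polling again
    raise TimeoutError("Job didn't reach a terminal state within the polling budget.")
```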
