---
title: How to migrate to OpenAI Python v1.x
titleSuffix: Azure OpenAI Service
description: Learn about migrating to the latest release of the OpenAI Python library with Azure OpenAI
author: mrbullwinkle
ms.author: mbullwin
ms.service: azure-ai-openai
ms.custom:
ms.topic: how-to
ms.date: 11/06/2023
manager: nitinme
---

# Migrating to the OpenAI Python API library 1.x

OpenAI has just released a new version of the [OpenAI Python API library](https://github.com/openai/openai-python/). This guide is supplemental to [OpenAI's migration guide](https://github.com/openai/openai-python/discussions/631) and will help bring you up to speed on the changes specific to Azure OpenAI.

## Updates

- This is a completely new version of the OpenAI Python API library.
- Starting on November 6, 2023, `pip install openai` and `pip install openai --upgrade` will install `version 1.x` of the OpenAI Python library.
- Upgrading from `version 0.28.1` to `version 1.x` is a breaking change, and you'll need to test and update your code.
- Automatic retries with backoff if there's an error.
- Proper types (for mypy/pyright/editors).
- You can now instantiate a client, instead of using a global default.
- Switch to explicit client instantiation (a minimal sketch follows this list).
- [Name changes](#name-changes)

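
The two client-related bullets above boil down to the following pattern. This is a minimal sketch, assuming your endpoint and key are stored in the `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_KEY` environment variables; `max_retries` is optional and shown only to illustrate the built-in retry with backoff.

```python
import os
from openai import AzureOpenAI

# Explicit client instantiation replaces the old module-level configuration
# (openai.api_type, openai.api_base, openai.api_key, openai.api_version).
client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2023-05-15",
    max_retries=5,  # automatic retries with exponential backoff on transient errors
)
```
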
## Known issues

- The latest release of the [OpenAI Python library](https://pypi.org/project/openai/) doesn't currently support DALL-E when used with Azure OpenAI. DALL-E with Azure OpenAI is still supported with `0.28.1`.
- `embeddings_utils.py`, which was used to provide functionality like cosine similarity for semantic text search, is [no longer part of the OpenAI Python API library](https://github.com/openai/openai-python/issues/676). A minimal replacement sketch follows this list.
- You should also check the active [GitHub Issues](https://github.com/openai/openai-python/issues/703) for the OpenAI Python library.

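
If your code relied on `embeddings_utils.py`, you can reproduce the cosine similarity helper with a few lines of NumPy. This is a minimal sketch (NumPy is not part of the OpenAI library, so install it separately if needed):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```
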
## Test before you migrate

> [!IMPORTANT]
> Automatic migration of your code using `openai migrate` is not supported with Azure OpenAI.

As this is a new version of the library with breaking changes, you should test your code extensively against the new release before migrating any production applications to rely on version 1.x. You should also review your code and internal processes to make sure that you're following best practices and pinning your production code to only versions that you have fully tested.
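
One quick way to confirm which library version a given environment is actually running, so you know whether your code is exercising `0.28.1` or `1.x`:

```console
pip show openai
```
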

To make the migration process easier, we're updating existing code examples in our docs for Python to a tabbed experience:

# [OpenAI Python 0.28.1](#tab/python)

```console
pip install openai==0.28.1
```

# [OpenAI Python 1.x](#tab/python-new)

```console
pip install openai --upgrade
```

---

This provides context for what has changed and allows you to test the new library in parallel while continuing to provide support for version `0.28.1`. If you upgrade to `1.x` and realize you need to temporarily revert to the previous version, you can always `pip uninstall openai` and then reinstall targeting `0.28.1` with `pip install openai==0.28.1`, as shown below.
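
A minimal revert sequence, assuming you want to go back to the last `0.28.1` release:

```console
pip uninstall openai
pip install openai==0.28.1
```
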

## Chat completions

# [OpenAI Python 0.28.1](#tab/python)

You need to set the `engine` variable to the deployment name you chose when you deployed the GPT-3.5-Turbo or GPT-4 models. Entering the model name will result in an error unless you chose a deployment name that is identical to the underlying model name.

```python
import os
import openai

openai.api_type = "azure"
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_version = "2023-05-15"

response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # engine = "deployment_name"
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
        {"role": "user", "content": "Do other Azure AI services support this too?"}
    ]
)

print(response)
print(response['choices'][0]['message']['content'])
```

# [OpenAI Python 1.x](#tab/python-new)

You need to set the `model` variable to the deployment name you chose when you deployed the GPT-3.5-Turbo or GPT-4 models. Entering the model name results in an error unless you chose a deployment name that is identical to the underlying model name.

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2023-05-15"
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # model = "deployment_name"
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."},
        {"role": "user", "content": "Do other Azure AI services support this too?"}
    ]
)

print(response.choices[0].message.content)
```

Additional examples can be found in our [in-depth Chat Completion article](chatgpt.md).

---


## Completions

# [OpenAI Python 0.28.1](#tab/python)

```python
import os
import openai

openai.api_key = os.getenv("AZURE_OPENAI_KEY")
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")  # your endpoint should look like https://YOUR_RESOURCE_NAME.openai.azure.com/
openai.api_type = 'azure'
openai.api_version = '2023-05-15'  # this might change in the future

deployment_name = 'REPLACE_WITH_YOUR_DEPLOYMENT_NAME'  # This will correspond to the custom name you chose for your deployment when you deployed a model.

# Send a completion call to generate an answer
print('Sending a test completion job')
start_phrase = 'Write a tagline for an ice cream shop. '
response = openai.Completion.create(engine=deployment_name, prompt=start_phrase, max_tokens=10)
text = response['choices'][0]['text'].replace('\n', '').replace(' .', '.').strip()
print(start_phrase + text)
```

# [OpenAI Python 1.x](#tab/python-new)

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2023-10-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

deployment_name = 'REPLACE_WITH_YOUR_DEPLOYMENT_NAME'  # This will correspond to the custom name you chose for your deployment when you deployed a model.

# Send a completion call to generate an answer
print('Sending a test completion job')
start_phrase = 'Write a tagline for an ice cream shop. '
response = client.completions.create(model=deployment_name, prompt=start_phrase, max_tokens=10)
print(response.choices[0].text)
```

---


## Embeddings

# [OpenAI Python 0.28.1](#tab/python)

```python
import openai

openai.api_type = "azure"
openai.api_key = "YOUR_API_KEY"
openai.api_base = "https://YOUR_RESOURCE_NAME.openai.azure.com"
openai.api_version = "2023-05-15"

response = openai.Embedding.create(
    input="Your text string goes here",
    engine="YOUR_DEPLOYMENT_NAME"
)
embeddings = response['data'][0]['embedding']
print(embeddings)
```

# [OpenAI Python 1.x](#tab/python-new)

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2023-05-15",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

response = client.embeddings.create(
    input="Your text string goes here",
    model="text-embedding-ada-002"  # model = "deployment_name"
)

print(response.model_dump_json(indent=2))
```
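
In 1.x the response is a typed object rather than a dictionary, so the vector itself is read as an attribute. For example, to get just the embedding from the response above:

```python
embedding = response.data[0].embedding
```
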

Additional examples, including how to handle semantic text search without `embeddings_utils.py`, can be found in our [embeddings tutorial](../tutorials/embeddings.md).

---

## Authentication

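The example below uses `DefaultAzureCredential` and `get_bearer_token_provider` from the `azure-identity` package, which isn't installed with the OpenAI library. If you don't already have it:

```console
pip install azure-identity
```
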
```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")

api_version = "2023-10-01-preview"
endpoint = "https://my-resource.openai.azure.com"

client = AzureOpenAI(
    api_version=api_version,
    azure_endpoint=endpoint,
    azure_ad_token_provider=token_provider,
)

completion = client.chat.completions.create(
    model="deployment-name",  # Use your deployment name, for example a gpt-35-turbo deployment.
    messages=[
        {
            "role": "user",
            "content": "How do I output all files in a directory using Python?",
        },
    ],
)
print(completion.model_dump_json(indent=2))
```

## Name changes

> [!NOTE]
> All `a*` methods have been removed; the async client must be used instead. A minimal sketch follows this note.

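For example, code that called `openai.ChatCompletion.acreate()` in `0.28.1` now uses `AsyncAzureOpenAI`. A minimal sketch, assuming the same environment variables as the earlier examples and a `gpt-35-turbo` deployment name:

```python
import os
import asyncio
from openai import AsyncAzureOpenAI

client = AsyncAzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2023-05-15",
)

async def main():
    # Same method path as the sync client, but awaited on the async client.
    response = await client.chat.completions.create(
        model="gpt-35-turbo",  # model = "deployment_name"
        messages=[{"role": "user", "content": "Does Azure OpenAI support customer managed keys?"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```
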
| OpenAI Python 0.28.1 | OpenAI Python 1.x |
| --------------- | --------------- |
| `openai.api_base` | `openai.base_url` |
| `openai.proxy` | `openai.proxies` |
| `openai.InvalidRequestError` | `openai.BadRequestError` |
| `openai.Audio.transcribe()` | `client.audio.transcriptions.create()` |
| `openai.Audio.translate()` | `client.audio.translations.create()` |
| `openai.ChatCompletion.create()` | `client.chat.completions.create()` |
| `openai.Completion.create()` | `client.completions.create()` |
| `openai.Edit.create()` | `client.edits.create()` |
| `openai.Embedding.create()` | `client.embeddings.create()` |
| `openai.File.create()` | `client.files.create()` |
| `openai.File.list()` | `client.files.list()` |
| `openai.File.retrieve()` | `client.files.retrieve()` |
| `openai.File.download()` | `client.files.retrieve_content()` |
| `openai.FineTune.cancel()` | `client.fine_tunes.cancel()` |
| `openai.FineTune.list()` | `client.fine_tunes.list()` |
| `openai.FineTune.list_events()` | `client.fine_tunes.list_events()` |
| `openai.FineTune.stream_events()` | `client.fine_tunes.list_events(stream=True)` |
| `openai.FineTune.retrieve()` | `client.fine_tunes.retrieve()` |
| `openai.FineTune.delete()` | `client.fine_tunes.delete()` |
| `openai.FineTune.create()` | `client.fine_tunes.create()` |
| `openai.FineTuningJob.create()` | `client.fine_tuning.jobs.create()` |
| `openai.FineTuningJob.cancel()` | `client.fine_tuning.jobs.cancel()` |
| `openai.FineTuningJob.delete()` | `client.fine_tuning.jobs.create()` |
| `openai.FineTuningJob.retrieve()` | `client.fine_tuning.jobs.retrieve()` |
| `openai.FineTuningJob.list()` | `client.fine_tuning.jobs.list()` |
| `openai.FineTuningJob.list_events()` | `client.fine_tuning.jobs.list_events()` |
| `openai.Image.create()` | `client.images.generate()` |
| `openai.Image.create_variation()` | `client.images.create_variation()` |
| `openai.Image.create_edit()` | `client.images.edit()` |
| `openai.Model.list()` | `client.models.list()` |
| `openai.Model.delete()` | `client.models.delete()` |
| `openai.Model.retrieve()` | `client.models.retrieve()` |
| `openai.Moderation.create()` | `client.moderations.create()` |
| `openai.api_resources` | `openai.resources` |
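
As one example of the error-class renames, code that caught `openai.InvalidRequestError` in `0.28.1` now catches `openai.BadRequestError`. A minimal sketch, reusing a `client` configured as in the earlier examples:

```python
import openai

try:
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # model = "deployment_name"
        messages=[{"role": "user", "content": "Hello"}],
    )
except openai.BadRequestError as e:  # was openai.InvalidRequestError in 0.28.1
    print(f"Bad request: {e}")
```
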

### Removed

- `openai.api_key_path`
- `openai.app_info`
- `openai.debug`
- `openai.log`
- `openai.OpenAIError`
- `openai.Audio.transcribe_raw()`
- `openai.Audio.translate_raw()`
- `openai.ErrorObject`
- `openai.Customer`
- `openai.api_version`
- `openai.verify_ssl_certs`
- `openai.api_type`
- `openai.enable_telemetry`
- `openai.ca_bundle_path`
- `openai.requestssession` (OpenAI now uses `httpx`)
- `openai.aiosession` (OpenAI now uses `httpx`)
- `openai.Deployment` (previously used for Azure OpenAI)
- `openai.Engine`
- `openai.File.find_matching_files()`