**PACKAGE_README.md** (64 additions, 2 deletions)
@@ -79,7 +79,9 @@ from openai_priority_loadbalancer import AsyncLoadBalancer, Backend
 *Importing `httpx` lets us use `httpx.Client` and `httpx.AsyncClient`. This avoids having to update `openai` to at least [1.17.0](https://github.com/openai/openai-python/releases/tag/v1.17.0). The `openai` properties `DefaultHttpxClient` and `DefaultAsyncHttpxClient` are mere wrappers for `httpx.Client` and `httpx.AsyncClient`.*

-### Configuring the Backends and Load Balancer
+### Configuring the Backends and Load Balancer with a Token Provider
+
+**We strongly recommend the use of a managed identity in Azure and of `DefaultAzureCredential` locally.** This section details that approach.

 1. Define a list of backends according to the *Load Balancer Backend Configuration* section below.
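The diff doesn't show how the token provider itself is built; in `openai`'s Azure clients, a token provider is just a zero-argument callable returning a bearer-token string (typically produced by `azure.identity`). As a toy stand-in for that contract — `make_token_provider` and the mock token are invented for illustration, not the real Azure AD flow — the caching shape looks like:

```python
import time

def make_token_provider(lifetime_seconds: float = 3600.0):
    """Return a zero-argument callable yielding a (mock) bearer token.

    Real code would obtain tokens via azure.identity's
    get_bearer_token_provider(DefaultAzureCredential(), scope);
    this stand-in only illustrates the callable-returning-a-string contract.
    """
    cache = {"token": None, "expires": 0.0}

    def provider() -> str:
        now = time.monotonic()
        if cache["token"] is None or now >= cache["expires"]:
            cache["token"] = f"mock-token-{int(now)}"  # real code would call Azure AD here
            cache["expires"] = now + lifetime_seconds
        return cache["token"]

    return provider
```

Because the token is cached until expiry, repeated calls within the lifetime return the same string, which is how the real provider avoids hammering Azure AD.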
@@ -88,8 +90,8 @@ from openai_priority_loadbalancer import AsyncLoadBalancer, Backend
 ```python
 backends: List[Backend] = [
     Backend("oai-eastus.openai.azure.com", 1),
-    Backend("oai-eastus.openai.azure.com", 1, "/ai"),
     Backend("oai-southcentralus.openai.azure.com", 1),
+    Backend("oai-westus.openai.azure.com", 1, "/ai"),
 ]
 ```
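The hunk above only defines the backend list; the selection rule itself (lower priority number wins, with traffic spread across equal-priority backends) can be sketched in a few lines. `MockBackend` and `select_backend` are illustrative inventions, not the package's API:

```python
import random
from dataclasses import dataclass

# Toy stand-in for the library's Backend (host, priority, optional path);
# the real class in openai_priority_loadbalancer has a richer API.
@dataclass
class MockBackend:
    host: str
    priority: int     # 1 is the highest priority
    path: str = ""
    healthy: bool = True

def select_backend(backends: list) -> MockBackend:
    """Pick a random healthy backend from the best (lowest) priority tier."""
    available = [b for b in backends if b.healthy]
    if not available:
        raise RuntimeError("no healthy backends available")
    best = min(b.priority for b in available)
    return random.choice([b for b in available if b.priority == best])

backends = [
    MockBackend("oai-eastus.openai.azure.com", 1),
    MockBackend("oai-southcentralus.openai.azure.com", 1),
    MockBackend("oai-westus.openai.azure.com", 2, "/ai"),
]
```

While both priority-1 backends are healthy, the priority-2 backend receives no traffic; mark them unhealthy and selection falls through to priority 2.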
@@ -121,6 +123,53 @@ from openai_priority_loadbalancer import AsyncLoadBalancer, Backend
 )
 ```

+### Configuring the Backends and Load Balancer with Individual Azure OpenAI API Keys
+
+It's best to avoid using the Azure OpenAI instances' keys: you could a) accidentally leave credentials in your source code, and b) the keys differ per instance, requiring maintenance, environment-specific keys, key rotations, etc. However, if you do need to use keys, you can set one per Azure OpenAI backend starting with release `1.1.0`.
+
+When a backend's `api_key` property is set, the `api-key` header is replaced with that backend's `api_key` value before the request is sent to the corresponding Azure OpenAI instance. See the examples below.
+
+1. Define a list of backends according to the *Load Balancer Backend Configuration* section below, including the API key as the last parameter (the values below are mock placeholders).
+
+   *Optionally, a path can be added (e.g. `"/ai"`), which is prepended to the request path. This is rarely needed.*
+
+1. Instantiate the load balancer and inject a new httpx client with the load balancer as the transport.
+
+   **Synchronous**
+
+   ```python
+   lb = LoadBalancer(backends)
+
+   client = AzureOpenAI(
+       azure_endpoint=f"https://{backends[0].host}",  # Must be seeded, so we use the first host; the load balancer overwrites it.
+       api_key="obtain_from_load_balancer",           # The value is not used, but it must be set.
+       api_version="2024-04-01-preview",
+       http_client=httpx.Client(transport=lb)         # Inject the synchronous load balancer as the transport in a new httpx client.
+   )
+   ```
+
+   **Asynchronous**
+
+   ```python
+   lb = AsyncLoadBalancer(backends)
+
+   client = AsyncAzureOpenAI(
+       azure_endpoint=f"https://{backends[0].host}",  # Must be seeded, so we use the first host; the load balancer overwrites it.
+       api_key="obtain_from_load_balancer",           # The value is not used, but it must be set.
+       api_version="2024-04-01-preview",
+       http_client=httpx.AsyncClient(transport=lb)    # Inject the asynchronous load balancer as the transport in a new async httpx client.
+   )
+   ```

 ## Load Balancer Backend Configuration

 At its core, the Load Balancer Backend configuration requires one or more backend hosts and a numeric priority starting at 1. Note that you define a host, not a URL.
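Because the configuration takes a bare host rather than a URL, a small guard can catch the common mistake of pasting a full endpoint. `normalize_host` is an illustrative helper, not part of the package:

```python
def normalize_host(value: str) -> str:
    """Reject URLs and trim stray whitespace/slashes so only a bare host remains."""
    if "://" in value:
        raise ValueError(
            f"expected a host such as 'oai-eastus.openai.azure.com', got a URL: {value!r}"
        )
    return value.strip().strip("/")
```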
@@ -191,6 +240,19 @@ backends = [
 ]
 ```

+### Backend Authentication
+
+While we strongly recommend the use of managed identities, it is possible to use the Azure OpenAI API key of each respective Azure OpenAI instance. Note that you are solely responsible for safeguarding and injecting these keys.

 As these are the only changes to the [OpenAI Python API library](https://github.com/openai/openai-python) implementation, simply execute your Python code.
**README.md** (26 additions, 2 deletions)
@@ -38,7 +38,15 @@ It's also good to have some knowledge of authentication and identities.

 ## Authentication

-Locally, you can log into Azure via the CLI and the steps below and use the `AzureDefaultCredential` (what I use in my example). When deploying this application in Azure, it's recommended to use a managed identity for authentication. It's best to avoid using the Azure OpenAI instances' keys as that could a) accidentally leave credentials in your source code, and b) the keys are different for each instance, which would probably require expanding upon the `Backends` class. Best to just avoid keys.
+**We strongly recommend the use of a managed identity in Azure and of `DefaultAzureCredential` locally.**
+
+Locally, you can log into Azure via the CLI and the steps below and use `DefaultAzureCredential` (what I use in my example). When deploying this application in Azure, it's recommended to use a managed identity for authentication.
+
+### Azure OpenAI Keys
+
+It's best to avoid using the Azure OpenAI instances' keys: you could a) accidentally leave credentials in your source code, and b) the keys differ per instance, requiring maintenance, environment-specific keys, key rotations, etc. However, if you do need to use keys, you can set one per Azure OpenAI backend.
+
+When a backend's `api_key` property is set, the `api-key` header is replaced with that backend's `api_key` value before the request is sent to the corresponding Azure OpenAI instance.

 ## Getting Started
@@ -51,9 +59,12 @@ Locally, you can log into Azure via the CLI and the steps below and use the `Azu

 ### Configuration

+Execute the following git command so that updates to `config.py` are not tracked and therefore not committed. This prevents accidental check-ins of real keys and values:
+
+`git update-index --assume-unchanged config.py`
+
 For the load-balanced approach, please use the same model across all instances.

-1. Open [aoai.py](./aoai.py).
+1. Open [config.py](./config.py).
 1. Replace `<your-aoai-model>` with the name of your Azure OpenAI model.
 1. Replace `<your-aoai-instance>` with the primary/single Azure OpenAI instance.
 1. Replace `<your-aoai-instance-1>`, `<your-aoai-instance-2>`, `<your-aoai-instance-3>` with all the Azure OpenAI instances you want to load-balance across. Delete entries you don't need. See [Load Balancer Backend Configuration](#load-balancer-backend-configuration) for details.
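After the replacements in the steps above, `config.py` might look roughly like this. The variable names are hypothetical (the shipped file defines its own); only the `<...>` placeholder strings come from the instructions:

```python
# Hypothetical sketch of config.py after step 1 -- names are invented here;
# the <...> placeholders are the ones the steps above tell you to replace.
MODEL = "<your-aoai-model>"               # your deployed Azure OpenAI model name
AOAI_INSTANCE = "<your-aoai-instance>"    # primary/single Azure OpenAI instance host
BACKEND_HOSTS = [
    "<your-aoai-instance-1>",
    "<your-aoai-instance-2>",
    "<your-aoai-instance-3>",             # delete entries you don't need
]
```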