Alternatively, you can build from source and install the wheel file:
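A minimal from-source build might look like the following. This is a sketch, not the repository's documented procedure: it assumes the PyPA `build` frontend is installed and uses the `digitalocean/genai-python` repository URL that appears in this diff.

```shell
# Sketch: build the SDK from source and install the resulting wheel.
# Assumes the PyPA `build` package is available (`pip install build`).
git clone https://github.com/digitalocean/genai-python.git
cd genai-python
python -m build --wheel        # writes the wheel into dist/
pip install dist/*.whl
```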
@@ -120,7 +120,7 @@ the changes aren't made through the automated pipeline, you may want to make rel

 ### Publish with a GitHub workflow

-You can release to package managers by using [the `Publish PyPI` GitHub action](https://www.github.com/stainless-sdks/serverless-inference-sdk-prod-python/actions/workflows/publish-pypi.yml). This requires a setup organization or repository secret to be set up.
+You can release to package managers by using [the `Publish PyPI` GitHub action](https://www.github.com/digitalocean/genai-python/actions/workflows/publish-pypi.yml). This requires a setup organization or repository secret to be set up.
@@ -244,9 +244,9 @@ assistant = response.parse() # get the object that `assistants.list()` would ha

 print(assistant.first_id)
 ```

-These methods return an [`APIResponse`](https://github.com/stainless-sdks/serverless-inference-sdk-prod-python/tree/main/src/serverless_inference_sdk_prod/_response.py) object.
+These methods return an [`APIResponse`](https://github.com/digitalocean/genai-python/tree/main/src/serverless_inference_sdk_prod/_response.py) object.

-The async client returns an [`AsyncAPIResponse`](https://github.com/stainless-sdks/serverless-inference-sdk-prod-python/tree/main/src/serverless_inference_sdk_prod/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
+The async client returns an [`AsyncAPIResponse`](https://github.com/digitalocean/genai-python/tree/main/src/serverless_inference_sdk_prod/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.

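As a rough illustration of the raw-response pattern these README lines describe: the raw object keeps HTTP-level details, while `.parse()` yields the value the plain method call would have returned. The stand-in class and field names below are invented for the sketch; the SDK's real `APIResponse` lives in `_response.py`.

```python
# Toy stand-in for the SDK's APIResponse (names are illustrative only).
class APIResponse:
    def __init__(self, status_code: int, parsed):
        self.status_code = status_code  # HTTP-level detail kept on the raw object
        self._parsed = parsed

    def parse(self):
        # Return the typed object the non-raw call would have produced.
        return self._parsed

response = APIResponse(200, {"first_id": "asst_abc123"})
assistant = response.parse()
print(response.status_code, assistant["first_id"])
```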
 #### `.with_streaming_response`
@@ -350,7 +350,7 @@ This package generally follows [SemVer](https://semver.org/spec/v2.0.0.html) con

 We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.

-We are keen for your feedback; please open an [issue](https://www.github.com/stainless-sdks/serverless-inference-sdk-prod-python/issues) with questions, bugs, or suggestions.
+We are keen for your feedback; please open an [issue](https://www.github.com/digitalocean/genai-python/issues) with questions, bugs, or suggestions.
+  errors+=("The SERVERLESS_INFERENCE_SDK_PROD_PYPI_TOKEN secret has not been set. Please set it in either this repository's secrets or your organization secrets.")
+fi
+
+lenErrors=${#errors[@]}
+
+if [[ $lenErrors -gt 0 ]]; then
+  echo -e "Found the following errors in the release environment:\n"
         prefix = f"Expected entry at `{key}`" if key is not None else f"Expected file input `{obj!r}`"
         raise RuntimeError(
-            f"{prefix} to be bytes, an io.IOBase instance, PathLike or a tuple but received {type(obj)} instead. See https://github.com/stainless-sdks/serverless-inference-sdk-prod-python/tree/main#file-uploads"
+            f"{prefix} to be bytes, an io.IOBase instance, PathLike or a tuple but received {type(obj)} instead. See https://github.com/digitalocean/genai-python/tree/main#file-uploads"
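The hunk above builds its error message from a common validation shape: name the offending entry, then list the accepted file-input types. A self-contained sketch of that shape follows; the helper name `validate_file_input` is mine, not the SDK's.

```python
import io
import os

def validate_file_input(obj, key=None):
    # Build the same kind of message as the diff: point at the offending
    # entry (or the raw value) and list the accepted types.
    prefix = f"Expected entry at `{key}`" if key is not None else f"Expected file input `{obj!r}`"
    if not isinstance(obj, (bytes, io.IOBase, os.PathLike, tuple)):
        raise RuntimeError(
            f"{prefix} to be bytes, an io.IOBase instance, PathLike or a tuple "
            f"but received {type(obj)} instead."
        )

validate_file_input(b"raw bytes")          # accepted
validate_file_input(io.BytesIO(b"data"))   # accepted
```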