2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "0.1.0-alpha.5"
".": "0.1.0-alpha.6"
}
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 77
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/digitalocean%2Fgradientai-e8b3cbc80e18e4f7f277010349f25e1319156704f359911dc464cc21a0d077a6.yml
openapi_spec_hash: c773d792724f5647ae25a5ae4ccec208
-config_hash: ecf128ea21a8fead9dabb9609c4dbce8
+config_hash: 9c2519464cf5de240e34bd89b9f65706
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,13 @@
# Changelog

+## 0.1.0-alpha.6 (2025-06-27)
+
+Full Changelog: [v0.1.0-alpha.5...v0.1.0-alpha.6](https://github.com/digitalocean/gradientai-python/compare/v0.1.0-alpha.5...v0.1.0-alpha.6)
+
+### Features
+
+* **api:** manual updates ([04eb1be](https://github.com/digitalocean/gradientai-python/commit/04eb1be35de7db04e1f0d4e1da8719b54a353bb5))
+
## 0.1.0-alpha.5 (2025-06-27)

Full Changelog: [v0.1.0-alpha.4...v0.1.0-alpha.5](https://github.com/digitalocean/gradientai-python/compare/v0.1.0-alpha.4...v0.1.0-alpha.5)
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -36,7 +36,7 @@ $ pip install -r requirements-dev.lock

Most of the SDK is generated code. Modifications to code will be persisted between generations, but may
result in merge conflicts between manual patches and changes from the generator. The generator will never
-modify the contents of the `src/do_gradientai/lib/` and `examples/` directories.
+modify the contents of the `src/gradientai/lib/` and `examples/` directories.

## Adding and running examples

44 changes: 22 additions & 22 deletions README.md
@@ -25,7 +25,7 @@ The full API of this library can be found in [api.md](api.md).

```python
import os
-from do_gradientai import GradientAI
+from gradientai import GradientAI

client = GradientAI(
api_key=os.environ.get("GRADIENTAI_API_KEY"), # This is the default and can be omitted
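)

# The rest of this example is collapsed in the diff. A hedged sketch of a first
# request, borrowing the agents.versions.list() call shown later in this README:
versions = client.agents.versions.list(uuid="REPLACE_ME")
print(versions.agent_versions)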
@@ -55,7 +55,7 @@ Simply import `AsyncGradientAI` instead of `GradientAI` and use `await` with eac
```python
import os
import asyncio
-from do_gradientai import AsyncGradientAI
+from gradientai import AsyncGradientAI

client = AsyncGradientAI(
api_key=os.environ.get("GRADIENTAI_API_KEY"), # This is the default and can be omitted
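)


# The rest of this example is collapsed in the diff. A hedged sketch of awaiting
# a call, reusing the agents.versions.list() endpoint shown later in this README:
async def main() -> None:
    versions = await client.agents.versions.list(uuid="REPLACE_ME")
    print(versions.agent_versions)


asyncio.run(main())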
@@ -96,8 +96,8 @@ Then you can enable it by instantiating the client with `http_client=DefaultAioH
```python
import os
import asyncio
-from do_gradientai import DefaultAioHttpClient
-from do_gradientai import AsyncGradientAI
+from gradientai import DefaultAioHttpClient
+from gradientai import AsyncGradientAI


async def main() -> None:
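    # The body of this example is collapsed in the diff. A hedged sketch that
    # passes the aiohttp-based transport named in the text above:
    async with AsyncGradientAI(
        api_key=os.environ.get("GRADIENTAI_API_KEY"),
        http_client=DefaultAioHttpClient(),
    ) as client:
        versions = await client.agents.versions.list(uuid="REPLACE_ME")
        print(versions.agent_versions)


asyncio.run(main())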
@@ -134,7 +134,7 @@ Typed requests and responses provide autocomplete and documentation within your
Nested parameters are dictionaries, typed using `TypedDict`, for example:

```python
-from do_gradientai import GradientAI
+from gradientai import GradientAI

client = GradientAI()
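# The example body is collapsed in the diff; it evidently ends near the
# print(completion.stream_options) line visible in the next hunk header.
# A hedged sketch of a nested TypedDict parameter; the chat.completions.create()
# endpoint and its arguments are assumptions used only to show the
# dictionary-style nesting:
completion = client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello"}],
    model="REPLACE_ME",
    stream_options={"include_usage": True},
)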

@@ -153,29 +153,29 @@ print(completion.stream_options)

## Handling errors

-When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `do_gradientai.APIConnectionError` is raised.
+When the library is unable to connect to the API (for example, due to network connection problems or a timeout), a subclass of `gradientai.APIConnectionError` is raised.

When the API returns a non-success status code (that is, 4xx or 5xx
-response), a subclass of `do_gradientai.APIStatusError` is raised, containing `status_code` and `response` properties.
+response), a subclass of `gradientai.APIStatusError` is raised, containing `status_code` and `response` properties.

-All errors inherit from `do_gradientai.APIError`.
+All errors inherit from `gradientai.APIError`.

```python
-import do_gradientai
-from do_gradientai import GradientAI
+import gradientai
+from gradientai import GradientAI

client = GradientAI()

try:
client.agents.versions.list(
uuid="REPLACE_ME",
)
-except do_gradientai.APIConnectionError as e:
+except gradientai.APIConnectionError as e:
print("The server could not be reached")
print(e.__cause__) # an underlying Exception, likely raised within httpx.
-except do_gradientai.RateLimitError as e:
+except gradientai.RateLimitError as e:
print("A 429 status code was received; we should back off a bit.")
-except do_gradientai.APIStatusError as e:
+except gradientai.APIStatusError as e:
print("Another non-200-range status code was received")
print(e.status_code)
print(e.response)
@@ -203,7 +203,7 @@ Connection errors (for example, due to a network connectivity problem), 408 Requ
You can use the `max_retries` option to configure or disable retry settings:

```python
-from do_gradientai import GradientAI
+from gradientai import GradientAI

# Configure the default for all requests:
client = GradientAI(
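    # The rest of this block is collapsed in the diff. A hedged sketch using the
    # `max_retries` option described above (0 disables retries entirely):
    max_retries=0,
)

# Or configure per request, assuming the with_options() helper shown later in
# this README:
client.with_options(max_retries=5).agents.versions.list(
    uuid="REPLACE_ME",
)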
@@ -223,7 +223,7 @@ By default requests time out after 1 minute. You can configure this with a `time
which accepts a float or an [`httpx.Timeout`](https://www.python-httpx.org/advanced/timeouts/#fine-tuning-the-configuration) object:

```python
-from do_gradientai import GradientAI
+from gradientai import GradientAI

# Configure the default for all requests:
client = GradientAI(
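    # The rest of this block is collapsed in the diff. A hedged sketch using the
    # `timeout` option described above (a float in seconds, or an httpx.Timeout):
    timeout=20.0,
)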
@@ -277,7 +277,7 @@ if response.my_field is None:
The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,

```py
-from do_gradientai import GradientAI
+from gradientai import GradientAI

client = GradientAI()
response = client.agents.versions.with_raw_response.list(
@@ -289,9 +289,9 @@ version = response.parse() # get the object that `agents.versions.list()` would
print(version.agent_versions)
```

-These methods return an [`APIResponse`](https://github.com/digitalocean/gradientai-python/tree/main/src/do_gradientai/_response.py) object.
+These methods return an [`APIResponse`](https://github.com/digitalocean/gradientai-python/tree/main/src/gradientai/_response.py) object.

-The async client returns an [`AsyncAPIResponse`](https://github.com/digitalocean/gradientai-python/tree/main/src/do_gradientai/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
+The async client returns an [`AsyncAPIResponse`](https://github.com/digitalocean/gradientai-python/tree/main/src/gradientai/_response.py) with the same structure, the only difference being `await`able methods for reading the response content.
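
For reference, a minimal sketch of the same call against the async client, assuming `AsyncAPIResponse.parse()` is awaitable as the note above implies:

```py
import asyncio
from gradientai import AsyncGradientAI


async def main() -> None:
    client = AsyncGradientAI()
    response = await client.agents.versions.with_raw_response.list(
        uuid="REPLACE_ME",
    )
    version = await response.parse()  # assumed awaitable on the async client
    print(version.agent_versions)


asyncio.run(main())
```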

#### `.with_streaming_response`

@@ -355,7 +355,7 @@ You can directly override the [httpx client](https://www.python-httpx.org/api/#c

```python
import httpx
-from do_gradientai import GradientAI, DefaultHttpxClient
+from gradientai import GradientAI, DefaultHttpxClient

client = GradientAI(
# Or use the `GRADIENT_AI_BASE_URL` env var
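    # The rest of this block is collapsed in the diff. A hedged sketch of the
    # usual overrides; the URLs are placeholders and the proxy/transport options
    # are illustrative assumptions:
    base_url="http://my.test.server.example.com:8083",
    http_client=DefaultHttpxClient(
        proxy="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
)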
@@ -378,7 +378,7 @@ client.with_options(http_client=DefaultHttpxClient(...))
By default the library closes underlying HTTP connections whenever the client is [garbage collected](https://docs.python.org/3/reference/datamodel.html#object.__del__). You can manually close the client using the `.close()` method if desired, or with a context manager that closes when exiting.

```py
-from do_gradientai import GradientAI
+from gradientai import GradientAI

with GradientAI() as client:
# make requests here
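    # A hedged example request; the underlying httpx client is closed when the
    # `with` block exits:
    client.agents.versions.list(uuid="REPLACE_ME")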
@@ -406,8 +406,8 @@ If you've upgraded to the latest version but aren't seeing any new features you
You can determine the version that is being used at runtime with:

```py
-import do_gradientai
-print(do_gradientai.__version__)
+import gradientai
+print(gradientai.__version__)
```

## Requirements