
Commit 974f595

First draft of docs
1 parent 4bf1ce2 commit 974f595


docs/dbos.md

Lines changed: 34 additions & 5 deletions
@@ -108,7 +108,7 @@ async def main():
    #> Mexico City (Ciudad de México, CDMX)
    ```

-1. The original `Agent` cannot be used inside a deterministic DBOS workflow, but the `DBOSAgent` can. Workflow function declarations and `DBOSAgent` creations needs to happen before calling `DBOS.launch()` because DBOS requires all workflows to be registered before launch so that recovery can correctly find all workflows.
+1. The original `Agent` cannot be used inside a deterministic DBOS workflow, but the `DBOSAgent` can. Workflow function declarations and `DBOSAgent` creations need to happen before calling `DBOS.launch()` because DBOS requires all workflows to be registered before launch so that recovery can correctly find all workflows.
2. [`DBOSAgent.run()`][pydantic_ai.durable_exec.dbos.DBOSAgent.run] works like [`Agent.run()`][pydantic_ai.Agent.run], but runs inside a DBOS workflow and wraps model requests, decorated tool calls, and MCP communication as DBOS steps.
3. This assumes DBOS is using SQLite. To deploy your agent to production, we recommend using a Postgres server.
4. The agent's `name` is used to uniquely identify its workflows.
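To make the ordering in note 1 concrete, here is a minimal sketch of the setup these notes describe. The model, agent name, and database URL are illustrative, and the `DBOSAgent(agent)` wrapping signature is assumed from the surrounding docs:

```python
from dbos import DBOS, DBOSConfig

from pydantic_ai import Agent
from pydantic_ai.durable_exec.dbos import DBOSAgent

dbos_config: DBOSConfig = {
    'name': 'pydantic_dbos_agent',
    'system_database_url': 'sqlite:///dbostest.sqlite',  # SQLite locally; Postgres in production (note 3)
}
DBOS(config=dbos_config)

agent = Agent('openai:gpt-4o', name='geo_agent')  # `name` uniquely identifies the agent's workflows (note 4)
dbos_agent = DBOSAgent(agent)  # must be created before DBOS.launch() (note 1)

DBOS.launch()  # workflows are now registered and recoverable


async def main():
    result = await dbos_agent.run('What is the capital of Mexico?')  # runs as a DBOS workflow (note 2)
    print(result.output)
```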
@@ -125,13 +125,13 @@ There are a few considerations specific to agents and toolsets when using DBOS f

### Agent and Toolset Requirements

-To ensure that DBOS knows what code to run when a workflow fails or is interrupted and then restarted, each agent instance needs to have a name that's unique.
+To ensure that DBOS knows what code to run when a workflow fails or is interrupted and then restarted, each agent instance needs to have a unique name.

Other than that, any agent and toolset will just work!

### Agent Run Context and Dependencies

-As DBOS checkpoints workflows and steps execution into a database, workflow inputs and outputs, and step outputs need to be serializable (JSON Pickleable). You may also want to keep the inputs and outputs small (usually less than 2MB).
+As DBOS checkpoints workflow and step execution into a database, workflow inputs and outputs, and step outputs need to be serializable (JSON pickleable). You may also want to keep the inputs and outputs small (the maximum size for a single field in PostgreSQL is 1 GB, but usually you want to keep the output size under 2 MB).

### Streaming

@@ -143,8 +143,37 @@ The event stream handler function will receive the agent [run context][pydantic_

## Step Configuration

-TBD
+DBOS step configuration, like retry policies, can be customized by passing [`StepConfig`][pydantic_ai.durable_exec.dbos.StepConfig] objects to the `DBOSAgent` constructor:
+
+- `mcp_step_config`: The DBOS step config to use for MCP server communication. If no config is provided, DBOS step retries are disabled.
+- `model_step_config`: The DBOS step config to use for model request steps. If no config is provided, DBOS step retries are disabled.
+
+For individual tools, you can annotate them with [`@DBOS.step`](https://docs.dbos.dev/python/reference/decorators#step) or [`@DBOS.workflow`](https://docs.dbos.dev/python/reference/decorators#workflow) decorators as needed. Decorated steps behave as normal functions when called outside of a DBOS workflow, so they can also be used in non-DBOS agents.
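A hedged sketch of both options: constructor-level `StepConfig` and a tool annotated with `@DBOS.step`. The `StepConfig` fields are assumed to mirror DBOS's standard step retry parameters (`retries_allowed`, `max_attempts`, etc.), and the tool itself is hypothetical:

```python
from dbos import DBOS

from pydantic_ai import Agent
from pydantic_ai.durable_exec.dbos import DBOSAgent, StepConfig

agent = Agent('openai:gpt-4o', name='geo_agent')


@agent.tool_plain
@DBOS.step(retries_allowed=True, max_attempts=5)  # per-tool step configuration
def fetch_population(city: str) -> int:
    """Fetch a city's population; checkpointed and retried as a DBOS step."""
    return 9_200_000  # placeholder for a real lookup


dbos_agent = DBOSAgent(
    agent,
    model_step_config=StepConfig(retries_allowed=True, max_attempts=3),  # model request steps
    mcp_step_config=StepConfig(retries_allowed=False),  # MCP communication steps
)
```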

## Step Retries

-TBD
+On top of the automatic retries for request failures that DBOS performs, Pydantic AI and various provider API clients also have their own request retry logic. Enabling these at the same time may cause requests to be retried more often than expected, with improper `Retry-After` handling.
+
+When using DBOS, it's recommended not to use [HTTP Request Retries](retries.md) and to turn off your provider API client's own retry logic, for example by setting `max_retries=0` on a [custom `OpenAIProvider` API client](models/openai.md#custom-openai-client).
+
+You can customize DBOS's retry policy using [step configuration](#step-configuration).
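For example, a minimal sketch of handing all retrying over to DBOS by disabling the OpenAI SDK's own retries (the model name is illustrative):

```python
from openai import AsyncOpenAI

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

client = AsyncOpenAI(max_retries=0)  # disable the SDK's retries; let DBOS steps retry instead
model = OpenAIModel('gpt-4o', provider=OpenAIProvider(openai_client=client))
agent = Agent(model, name='geo_agent')
```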
+
+## Observability with Logfire
+
+DBOS generates OpenTelemetry traces and events for each workflow and step execution, and Pydantic AI generates events for each agent run, model request, and tool call. These can be sent to [Pydantic Logfire](logfire.md) to get a complete picture of what's happening in your application.
+
+To disable sending DBOS traces to Logfire, you can pass `disable_otlp=True` to the `DBOS` constructor. For example:
+
+```python {title="dbos_no_traces.py" test="skip"}
+from dbos import DBOS, DBOSConfig
+
+dbos_config: DBOSConfig = {
+    'name': 'pydantic_dbos_agent',
+    'system_database_url': 'sqlite:///dbostest.sqlite',
+    'disable_otlp': True,  # (1)!
+}
+DBOS(config=dbos_config)
+```
+
+1. If `True`, disables OpenTelemetry tracing and logging for DBOS. Defaults to `False`.
