Commit 8043fb0

OpenAI Agents Third-Party model providers (#227)

Authored by jssmith and tconley1428

* update for plugins
* formatting
* reference main branch
* cleanup
* switch to plugins on the runners
* move around samples
* update README files
* formatting update
* formatting
* timeout adjustments
* litellm model provider
* Format

Co-authored-by: Tim Conley <[email protected]>

1 parent dc33729

File tree

4 files changed: +131 -0 lines changed
Lines changed: 38 additions & 0 deletions

# Model Providers Examples

Custom LLM provider integration examples for the OpenAI Agents SDK with Temporal workflows.

*Adapted from [OpenAI Agents SDK model providers examples](https://github.com/openai/openai-agents-python/tree/main/examples/model_providers)*

Before running these examples, be sure to review the [prerequisites and background on the integration](../README.md).

## Running the Examples

### Currently Implemented

#### LiteLLM Auto

Uses built-in LiteLLM support to connect to various model providers.

Start the LiteLLM provider worker:

```bash
# Set the required environment variable for your chosen provider
export ANTHROPIC_API_KEY="your_anthropic_api_key"  # For Anthropic

uv run openai_agents/model_providers/run_worker_litellm_provider.py
```

Then run the example in a separate terminal:

```bash
uv run openai_agents/model_providers/run_litellm_auto_workflow.py
```

The example uses Anthropic Claude by default but can be modified to use other LiteLLM-supported providers.

Find more LiteLLM providers at: https://docs.litellm.ai/docs/providers

## Not Yet Implemented

- **Custom Example Agent** - Custom OpenAI client integration
- **Custom Example Global** - Global default client configuration
- **Custom Example Provider** - Custom ModelProvider pattern
- **LiteLLM Provider** - Interactive model/API key input
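The "Custom ModelProvider pattern" listed above centers on one idea: an object that resolves a model name to a concrete model client. As a rough, framework-free sketch of that shape (all names below are illustrative, not the real Agents SDK API):

```python
from typing import Callable, Dict, Optional


class FakeModel:
    """Stand-in for a concrete model client (illustrative only)."""

    def __init__(self, name: str):
        self.name = name


class ToyModelProvider:
    """Illustrative provider: maps a model-name prefix to a client factory."""

    def __init__(self, factories: Dict[str, Callable[[str], FakeModel]], default: str):
        self._factories = factories
        self._default = default

    def get_model(self, model_name: Optional[str]) -> FakeModel:
        # Fall back to a default model when none is requested,
        # mirroring how a provider supplies a sensible default.
        name = model_name or self._default
        prefix = name.split("/", 1)[0]
        factory = self._factories.get(prefix)
        if factory is None:
            raise KeyError(f"no factory for provider prefix {prefix!r}")
        return factory(name)


provider = ToyModelProvider(
    {"anthropic": FakeModel},
    default="anthropic/claude-3-5-sonnet-20240620",
)
print(provider.get_model(None).name)  # -> anthropic/claude-3-5-sonnet-20240620
```

A real implementation would return an actual model client rather than `FakeModel`; the dispatch-on-name structure is the part the pattern shares with the LiteLLM provider used in these examples.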
Lines changed: 29 additions & 0 deletions

```python
import asyncio

from temporalio.client import Client
from temporalio.contrib.openai_agents import OpenAIAgentsPlugin

from openai_agents.model_providers.workflows.litellm_auto_workflow import (
    LitellmAutoWorkflow,
)


async def main():
    # Connect to the local Temporal server with the OpenAI Agents plugin enabled
    client = await Client.connect(
        "localhost:7233",
        plugins=[
            OpenAIAgentsPlugin(),
        ],
    )

    # Start the workflow and wait for its result
    result = await client.execute_workflow(
        LitellmAutoWorkflow.run,
        "What's the weather in Tokyo?",
        id="litellm-auto-workflow-id",
        task_queue="openai-agents-model-providers-task-queue",
    )
    print(f"Result: {result}")


if __name__ == "__main__":
    asyncio.run(main())
```
Lines changed: 39 additions & 0 deletions

```python
import asyncio
from datetime import timedelta

from agents.extensions.models.litellm_provider import LitellmProvider
from temporalio.client import Client
from temporalio.contrib.openai_agents import ModelActivityParameters, OpenAIAgentsPlugin
from temporalio.worker import Worker

from openai_agents.model_providers.workflows.litellm_auto_workflow import (
    LitellmAutoWorkflow,
)


async def main():
    # Create client connected to server at the given address
    client = await Client.connect(
        "localhost:7233",
        plugins=[
            OpenAIAgentsPlugin(
                model_params=ModelActivityParameters(
                    start_to_close_timeout=timedelta(seconds=30)
                ),
                # Route all model calls through LiteLLM
                model_provider=LitellmProvider(),
            ),
        ],
    )

    # Run a worker hosting the workflow on the shared task queue
    worker = Worker(
        client,
        task_queue="openai-agents-model-providers-task-queue",
        workflows=[
            LitellmAutoWorkflow,
        ],
    )
    await worker.run()


if __name__ == "__main__":
    asyncio.run(main())
```
Lines changed: 25 additions & 0 deletions

```python
from __future__ import annotations

from agents import Agent, Runner, function_tool, set_tracing_disabled
from temporalio import workflow


@workflow.defn
class LitellmAutoWorkflow:
    @workflow.run
    async def run(self, prompt: str) -> str:
        set_tracing_disabled(disabled=True)

        @function_tool
        def get_weather(city: str) -> str:
            return f"The weather in {city} is sunny."

        agent = Agent(
            name="Assistant",
            instructions="You only respond in haikus.",
            model="anthropic/claude-3-5-sonnet-20240620",
            tools=[get_weather],
        )

        result = await Runner.run(agent, prompt)
        return result.final_output
```
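LiteLLM routes the request based on the `provider/model` prefix in the model string, which is why `anthropic/claude-3-5-sonnet-20240620` reaches Anthropic without further configuration. A tiny illustrative sketch of that naming convention (this helper is hypothetical, not part of LiteLLM):

```python
def split_litellm_model(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model name).

    A string with no "/" has no explicit provider prefix, mirroring how a
    bare name is handled by the default provider.
    """
    provider, sep, name = model.partition("/")
    if not sep:
        return ("", model)
    return (provider, name)


print(split_litellm_model("anthropic/claude-3-5-sonnet-20240620"))
# -> ('anthropic', 'claude-3-5-sonnet-20240620')
```

Swapping the prefix (and exporting the matching API key, as shown in the README above) is all that is needed to target a different LiteLLM-supported provider.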
