
Commit 1e7a01a

deploy oai sdk integration
1 parent 10f2de4 commit 1e7a01a

2 files changed: +157, -0 lines changed

Lines changed: 65 additions & 0 deletions
@@ -0,0 +1,65 @@
# Integrating Perplexity’s Sonar API with the OpenAI SDK

This guide demonstrates how to integrate Perplexity’s Sonar API into the OpenAI SDK using a custom asynchronous client. The example shows how to create an agent that uses the custom LLM provider, set up function tools, and run a sample query. This is a good starting point if you want to extend or customize your Sonar API integration for other applications.

## Table of Contents

• Overview
• Prerequisites
• Environment Setup
• Usage
• Running the Example
• Conclusion
• References

## Overview

This example involves (see the sketch after this list):

• Creating an asynchronous OpenAI client configured for Perplexity’s Sonar API.
• Creating a custom model that uses this client.
• Configuring an agent with custom instructions and a function tool that fetches weather.
• Running a sample query to demonstrate the integration.
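
In outline, the integration comes down to pointing the SDK's async client at the Sonar endpoint and handing that client to a chat-completions model. The following is a minimal sketch condensed from the full script later in this commit; the `EXAMPLE_*` environment variables and the `sonar-pro` default mirror that script.

```python
import os

from openai import AsyncOpenAI
from agents import Agent, OpenAIChatCompletionsModel

# Async client pointed at the Sonar endpoint instead of the default OpenAI URL
sonar_client = AsyncOpenAI(
    base_url=os.getenv("EXAMPLE_BASE_URL", "https://api.perplexity.ai"),
    api_key=os.getenv("EXAMPLE_API_KEY"),
)

# Agent whose chat-completions model routes every request through that client
agent = Agent(
    name="Assistant",
    instructions="Be precise and concise.",
    model=OpenAIChatCompletionsModel(
        model=os.getenv("EXAMPLE_MODEL_NAME", "sonar-pro"),
        openai_client=sonar_client,
    ),
)
```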
## Prerequisites

• Python 3.7 or later.
• Required Python packages:
  • openai
  • nest_asyncio
• The custom agents package (ensure this is installed or available in your environment; see the note after the install command below).

Install dependencies via pip if needed:

```bash
pip install openai nest_asyncio
```
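
The `agents` imports in the script (`Agent`, `Runner`, `OpenAIChatCompletionsModel`, and so on) match the OpenAI Agents SDK linked in the References section. Assuming that is the package intended here, it is published on PyPI as `openai-agents`:

```bash
# Assumption: the "agents" package used by the script is the OpenAI Agents SDK,
# published on PyPI as openai-agents.
pip install openai-agents
```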
## Environment Setup

Set the following environment variables or update the code with the appropriate values:

• `EXAMPLE_BASE_URL`: The base URL of the Perplexity API (default: https://api.perplexity.ai).
• `EXAMPLE_API_KEY`: Your API key for accessing the Sonar API.
• `EXAMPLE_MODEL_NAME`: The model name (default: `sonar-pro`).
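
For a quick start from a shell, the variables can be exported before running the script. The key below is only a placeholder; use your own Sonar API key:

```bash
# Placeholder values; substitute your own Sonar API key.
export EXAMPLE_BASE_URL="https://api.perplexity.ai"
export EXAMPLE_API_KEY="pplx-your-key-here"
export EXAMPLE_MODEL_NAME="sonar-pro"
```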
## Usage

Simply run the script to create the agent and test a sample query asking for the weather in Tokyo. The agent answers the query using the custom Sonar API-powered model and calls the registered weather tool when it needs weather data.
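
To ask something other than the built-in Tokyo example, change the prompt passed to `Runner.run` inside `main()` in the script below, for example:

```python
# Inside main() in the script below: swap the sample prompt for your own query.
result = await Runner.run(agent, "What's the weather in Paris?")
print(result.final_output)
```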
## Running the Example

1. Set up environment variables:
   Ensure `EXAMPLE_BASE_URL`, `EXAMPLE_API_KEY`, and `EXAMPLE_MODEL_NAME` are set, either in your shell or within the code.
2. Install dependencies:
   Make sure you have installed the required packages (`openai`, `nest_asyncio`, and `agents`).
3. Execute the script:
   Run the script using Python:

```bash
python your_script_name.py
```
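
Because the script applies `nest_asyncio`, it can also be driven from an environment that already runs an event loop, such as a Jupyter notebook. A minimal cell, assuming the script has been saved as `sonar_agent_example.py` (a hypothetical file name), might look like this:

```python
# Hypothetical notebook usage; "sonar_agent_example" is just an example module
# name, so use whatever you named the script.
import asyncio

from sonar_agent_example import main  # the script's async entry point

# nest_asyncio.apply() inside the script makes asyncio.run() safe to call
# even though Jupyter already has a running event loop.
asyncio.run(main())
```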
## Conclusion

This example provides a hands-on guide for integrating Perplexity’s Sonar API with the OpenAI SDK. The approach involves creating a custom asynchronous client, setting up a model with our API, and using an agent to process queries with additional function tools. Customize this code as needed to fit your application’s requirements.

## References

• [Perplexity Sonar API Documentation](https://docs.perplexity.ai/home)
• [OpenAI Agents SDK Documentation](https://github.com/openai/openai-agents-python/blob/main/examples/model_providers/custom_example_agent.py)
Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
# Import necessary standard libraries
import asyncio  # For running asynchronous code
import os  # To access environment variables

# Import AsyncOpenAI for creating an async client
from openai import AsyncOpenAI

# Import custom classes and functions from the agents package.
# These handle agent creation, model interfacing, running agents, and more.
from agents import Agent, OpenAIChatCompletionsModel, Runner, function_tool, set_tracing_disabled

# Retrieve configuration from environment variables or use defaults
BASE_URL = os.getenv("EXAMPLE_BASE_URL") or "https://api.perplexity.ai"
API_KEY = os.getenv("EXAMPLE_API_KEY")
MODEL_NAME = os.getenv("EXAMPLE_MODEL_NAME") or "sonar-pro"

# Validate that all required configuration variables are set
if not BASE_URL or not API_KEY or not MODEL_NAME:
    raise ValueError(
        "Please set EXAMPLE_BASE_URL, EXAMPLE_API_KEY, EXAMPLE_MODEL_NAME via env var or code."
    )

"""
This example illustrates how to use a custom provider with a specific agent:
1. We create an asynchronous OpenAI client configured to interact with the Perplexity Sonar API.
2. We define a custom model using this client.
3. We set up an Agent with our custom model and attach function tools.
Note: Tracing is disabled in this example. If you have an OpenAI platform API key,
you can enable tracing by setting the environment variable OPENAI_API_KEY or using set_tracing_export_api_key().
"""

# Initialize the custom OpenAI async client with the specified BASE_URL and API_KEY.
client = AsyncOpenAI(base_url=BASE_URL, api_key=API_KEY)

# Disable tracing to avoid using a platform tracing key; adjust as needed.
set_tracing_disabled(disabled=True)

# (Alternate approach example, commented out)
# PROVIDER = OpenAIProvider(openai_client=client)
# agent = Agent(..., model="some-custom-model")
# Runner.run(agent, ..., run_config=RunConfig(model_provider=PROVIDER))


# Define a function tool that the agent can call.
# The decorator registers this function as a tool in the agents framework.
@function_tool
def get_weather(city: str):
    """
    Simulate fetching weather data for a given city.

    Args:
        city (str): The name of the city to retrieve weather for.

    Returns:
        str: A message with weather information.
    """
    print(f"[debug] getting weather for {city}")
    return f"The weather in {city} is sunny."


# Import nest_asyncio to support nested event loops (helpful in interactive environments like Jupyter)
import nest_asyncio

# Apply the nest_asyncio patch to enable running asyncio.run() even if an event loop is already running.
nest_asyncio.apply()


async def main():
    """
    Main asynchronous function to set up and run the agent.

    This function creates an Agent with a custom model and function tools,
    then runs a query to get the weather in Tokyo.
    """
    # Create an Agent instance with:
    # - A name ("Assistant")
    # - Custom instructions ("Be precise and concise.")
    # - A model built from OpenAIChatCompletionsModel using our client and model name.
    # - A list of tools; here, only get_weather is provided.
    agent = Agent(
        name="Assistant",
        instructions="Be precise and concise.",
        model=OpenAIChatCompletionsModel(model=MODEL_NAME, openai_client=client),
        tools=[get_weather],
    )

    # Execute the agent with the sample query.
    result = await Runner.run(agent, "What's the weather in Tokyo?")

    # Print the final output from the agent.
    print(result.final_output)


# Standard boilerplate to run the async main() function.
if __name__ == "__main__":
    asyncio.run(main())
