Description
In order to get token counts and costs for autogen-agentchat apps, we ask users to also add instrumentation for the underlying LLM client library, e.g., openinference-instrumentation-openai. There is an issue, however: when streaming, the OpenAI library only returns a usage payload containing token counts if the caller passes a parameter requesting it, and without that payload our instrumentation cannot report token counts. The autogen-agentchat LLM wrapper does not pass this parameter, so users who run autogen-agentchat LLMs with streaming cannot get token and cost info.
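For context, here is a minimal sketch with the raw OpenAI client (assuming a recent openai release that supports `stream_options`) showing the opt-in parameter in question; without it, no chunk in the stream carries a usage payload:

```python
# Minimal sketch with the raw openai client (not autogen); assumes OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello."}],
    stream=True,
    # Without this option the stream never carries a `usage` payload,
    # so token-count instrumentation has nothing to record.
    stream_options={"include_usage": True},
)
for chunk in stream:
    if chunk.usage is not None:  # only the final chunk carries usage stats
        print(chunk.usage)
```

The autogen-agentchat reproduction below never sets this option, even though `model_client_stream=True` puts the client into streaming mode: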
```python
import asyncio
import os

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from phoenix.otel import register

tracer_provider = register(project_name="autogen-test", auto_instrument=True)

model_client = OpenAIChatCompletionClient(model="gpt-4o-mini", api_key=os.getenv("OPENAI_API_KEY"))


async def get_weather(city: str) -> str:
    """Get the weather for a given city."""
    return f"The weather in {city} is 73 degrees and Sunny."


agent = AssistantAgent(
    name="weather_agent",
    model_client=model_client,
    tools=[get_weather],
    system_message="You are a helpful assistant.",
    reflect_on_tool_use=True,
    model_client_stream=True,  # Enable streaming tokens from the model client.
)


# Run the agent and stream the messages to the console.
async def main() -> None:
    await Console(agent.run_stream(task="What is the weather in New York?"))
    # Close the connection to the model client.
    await model_client.close()


asyncio.run(main())
```

Possible solutions include making a PR into autogen-agentchat to update its invocation of the OpenAI client, or simply documenting the issue and a workaround.
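For the PR route, the change is conceptually a one-parameter addition to the wrapper's streaming call. A hedged sketch, not the actual autogen-ext source (the function and argument names below are illustrative):

```python
# Hypothetical sketch only: the real OpenAIChatCompletionClient streaming code
# is structured differently, but the key change is adding stream_options so
# the final streamed chunk includes token usage.
from openai import AsyncOpenAI


async def create_stream_with_usage(model: str, messages: list[dict]):
    client = AsyncOpenAI()  # assumes OPENAI_API_KEY is set
    return await client.chat.completions.create(
        model=model,
        messages=messages,
        stream=True,
        stream_options={"include_usage": True},  # the parameter the wrapper does not pass today
    )
```

With that in place, openinference-instrumentation-openai would see the usage payload on the final chunk and could record token counts and costs as it does for non-streaming calls.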