
Commit 7989e0d

Merge branch 'openai:main' into max_turn_exception_result
2 parents 387b5eb + 6e078bf commit 7989e0d

File tree

8 files changed: +117 −29 lines changed

docs/ja/mcp.md

Lines changed: 5 additions & 4 deletions

@@ -12,12 +12,13 @@ The Agents SDK supports MCP, which lets you use a wide range of MCP

 ## MCP servers

-Currently, the MCP spec defines two kinds of servers, based on the transport mechanism they use.
+Currently, the MCP spec defines three kinds of servers, based on the transport mechanism they use.

-1. **stdio** servers: run as a subprocess of your application; think of them as running locally.
+1. **stdio** servers: run as a subprocess of your application; think of them as running locally.
 2. **HTTP over SSE** servers: run remotely and are connected to via a URL.
+3. **Streamable HTTP** servers: run remotely using the Streamable HTTP transport defined in the MCP spec.

-You can connect to these servers using the [`MCPServerStdio`][agents.mcp.server.MCPServerStdio] and [`MCPServerSse`][agents.mcp.server.MCPServerSse] classes.
+You can connect to these servers using the [`MCPServerStdio`][agents.mcp.server.MCPServerStdio], [`MCPServerSse`][agents.mcp.server.MCPServerSse], and [`MCPServerStreamableHttp`][agents.mcp.server.MCPServerStreamableHttp] classes.

 For example, here is how you would use the [official MCP filesystem server](https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem).

@@ -46,7 +47,7 @@ agent=Agent(

 ## Caching

-Every time an agent runs, it calls `list_tools()` on the MCP server. This adds latency, especially when the server is remote. To cache the tool list automatically, pass `cache_tools_list=True` to both [`MCPServerStdio`][agents.mcp.server.MCPServerStdio] and [`MCPServerSse`][agents.mcp.server.MCPServerSse]. Do this only if you are certain the tool list will not change.
+Every time an agent runs, it calls `list_tools()` on the MCP server. This adds latency, especially when the server is remote. To cache the tool list automatically, pass `cache_tools_list=True` to each of [`MCPServerStdio`][agents.mcp.server.MCPServerStdio], [`MCPServerSse`][agents.mcp.server.MCPServerSse], and [`MCPServerStreamableHttp`][agents.mcp.server.MCPServerStreamableHttp]. Do this only if you are certain the tool list will not change.

 If you want to invalidate the cache, call `invalidate_tools_cache()` on the server.

docs/ja/tools.md

Lines changed: 4 additions & 0 deletions

@@ -17,6 +17,10 @@ OpenAI offers a few built-in tools when using the [`OpenAIResponsesModel`][agents.models.openai_responses.OpenAIRespons

 - [`WebSearchTool`][agents.tool.WebSearchTool] lets an agent search the web.
 - [`FileSearchTool`][agents.tool.FileSearchTool] retrieves information from OpenAI Vector Stores.
 - [`ComputerTool`][agents.tool.ComputerTool] automates computer-use tasks.
+- [`CodeInterpreterTool`][agents.tool.CodeInterpreterTool] executes code in a sandboxed environment.
+- [`HostedMCPTool`][agents.tool.HostedMCPTool] exposes a remote MCP server's tools directly to the model.
+- [`ImageGenerationTool`][agents.tool.ImageGenerationTool] generates images from a prompt.
+- [`LocalShellTool`][agents.tool.LocalShellTool] runs shell commands on the local machine.

 ```python
 from agents import Agent, FileSearchTool, Runner, WebSearchTool

docs/mcp.md

Lines changed: 3 additions & 2 deletions

@@ -12,8 +12,9 @@ Currently, the MCP spec defines two kinds of servers, based on the transport mec

 1. **stdio** servers run as a subprocess of your application. You can think of them as running "locally".
 2. **HTTP over SSE** servers run remotely. You connect to them via a URL.
+3. **Streamable HTTP** servers run remotely using the Streamable HTTP transport defined in the MCP spec.

-You can use the [`MCPServerStdio`][agents.mcp.server.MCPServerStdio] and [`MCPServerSse`][agents.mcp.server.MCPServerSse] classes to connect to these servers.
+You can use the [`MCPServerStdio`][agents.mcp.server.MCPServerStdio], [`MCPServerSse`][agents.mcp.server.MCPServerSse], and [`MCPServerStreamableHttp`][agents.mcp.server.MCPServerStreamableHttp] classes to connect to these servers.

 For example, this is how you'd use the [official MCP filesystem server](https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem).

@@ -42,7 +43,7 @@ agent=Agent(

 ## Caching

-Every time an Agent runs, it calls `list_tools()` on the MCP server. This can be a latency hit, especially if the server is a remote server. To automatically cache the list of tools, you can pass `cache_tools_list=True` to both [`MCPServerStdio`][agents.mcp.server.MCPServerStdio] and [`MCPServerSse`][agents.mcp.server.MCPServerSse]. You should only do this if you're certain the tool list will not change.
+Every time an Agent runs, it calls `list_tools()` on the MCP server. This can be a latency hit, especially if the server is a remote server. To automatically cache the list of tools, you can pass `cache_tools_list=True` to [`MCPServerStdio`][agents.mcp.server.MCPServerStdio], [`MCPServerSse`][agents.mcp.server.MCPServerSse], and [`MCPServerStreamableHttp`][agents.mcp.server.MCPServerStreamableHttp]. You should only do this if you're certain the tool list will not change.

 If you want to invalidate the cache, you can call `invalidate_tools_cache()` on the servers.
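The caching behavior described in this hunk — fetch the tool list once, reuse it until `invalidate_tools_cache()` is called — can be sketched as a minimal stand-in. The `CachingToolServer` class and its `fetch` callable below are illustrative, not the SDK's actual implementation; only the `cache_tools_list`, `list_tools`, and `invalidate_tools_cache` names come from the docs.

```python
class CachingToolServer:
    """Illustrative sketch of cache_tools_list: fetch the tool list once,
    then reuse it until invalidate_tools_cache() is called."""

    def __init__(self, fetch, cache_tools_list=False):
        self._fetch = fetch  # callable that actually lists tools (e.g. over the wire)
        self._cache_enabled = cache_tools_list
        self._cached = None

    def list_tools(self):
        if self._cache_enabled and self._cached is not None:
            return self._cached  # served from cache: no remote round-trip
        tools = self._fetch()
        if self._cache_enabled:
            self._cached = tools
        return tools

    def invalidate_tools_cache(self):
        self._cached = None


calls = []

def fetch():
    calls.append(1)  # count how often the "remote" fetch actually runs
    return ["read_file", "write_file"]

server = CachingToolServer(fetch, cache_tools_list=True)
server.list_tools()
server.list_tools()
print(len(calls))  # 1: the second call was served from cache
server.invalidate_tools_cache()
server.list_tools()
print(len(calls))  # 2: invalidation forced a fresh fetch
```

This is why the docs warn to enable caching only when the tool list is stable: a stale cache silently hides newly added tools until invalidated.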

docs/tools.md

Lines changed: 4 additions & 0 deletions

@@ -13,6 +13,10 @@ OpenAI offers a few built-in tools when using the [`OpenAIResponsesModel`][agent

 - The [`WebSearchTool`][agents.tool.WebSearchTool] lets an agent search the web.
 - The [`FileSearchTool`][agents.tool.FileSearchTool] allows retrieving information from your OpenAI Vector Stores.
 - The [`ComputerTool`][agents.tool.ComputerTool] allows automating computer use tasks.
+- The [`CodeInterpreterTool`][agents.tool.CodeInterpreterTool] lets the LLM execute code in a sandboxed environment.
+- The [`HostedMCPTool`][agents.tool.HostedMCPTool] exposes a remote MCP server's tools to the model.
+- The [`ImageGenerationTool`][agents.tool.ImageGenerationTool] generates images from a prompt.
+- The [`LocalShellTool`][agents.tool.LocalShellTool] runs shell commands on your machine.

 ```python
 from agents import Agent, FileSearchTool, Runner, WebSearchTool

src/agents/extensions/visualization.py

Lines changed: 35 additions & 18 deletions

@@ -1,4 +1,4 @@
-from typing import Optional
+from __future__ import annotations

 import graphviz  # type: ignore

@@ -31,7 +31,9 @@ def get_main_graph(agent: Agent) -> str:
     return "".join(parts)


-def get_all_nodes(agent: Agent, parent: Optional[Agent] = None) -> str:
+def get_all_nodes(
+    agent: Agent, parent: Agent | None = None, visited: set[str] | None = None
+) -> str:
     """
     Recursively generates the nodes for the given agent and its handoffs in DOT format.

@@ -41,17 +43,23 @@ def get_all_nodes(agent: Agent, parent: Optional[Agent] = None) -> str:
     Returns:
         str: The DOT format string representing the nodes.
     """
+    if visited is None:
+        visited = set()
+    if agent.name in visited:
+        return ""
+    visited.add(agent.name)
+
     parts = []

     # Start and end the graph
-    parts.append(
-        '"__start__" [label="__start__", shape=ellipse, style=filled, '
-        "fillcolor=lightblue, width=0.5, height=0.3];"
-        '"__end__" [label="__end__", shape=ellipse, style=filled, '
-        "fillcolor=lightblue, width=0.5, height=0.3];"
-    )
-    # Ensure parent agent node is colored
     if not parent:
+        parts.append(
+            '"__start__" [label="__start__", shape=ellipse, style=filled, '
+            "fillcolor=lightblue, width=0.5, height=0.3];"
+            '"__end__" [label="__end__", shape=ellipse, style=filled, '
+            "fillcolor=lightblue, width=0.5, height=0.3];"
+        )
+        # Ensure parent agent node is colored
         parts.append(
             f'"{agent.name}" [label="{agent.name}", shape=box, style=filled, '
             "fillcolor=lightyellow, width=1.5, height=0.8];"

@@ -71,17 +79,20 @@ def get_all_nodes(agent: Agent, parent: Optional[Agent] = None) -> str:
             f"fillcolor=lightyellow, width=1.5, height=0.8];"
         )
         if isinstance(handoff, Agent):
-            parts.append(
-                f'"{handoff.name}" [label="{handoff.name}", '
-                f"shape=box, style=filled, style=rounded, "
-                f"fillcolor=lightyellow, width=1.5, height=0.8];"
-            )
-            parts.append(get_all_nodes(handoff))
+            if handoff.name not in visited:
+                parts.append(
+                    f'"{handoff.name}" [label="{handoff.name}", '
+                    f"shape=box, style=filled, style=rounded, "
+                    f"fillcolor=lightyellow, width=1.5, height=0.8];"
+                )
+            parts.append(get_all_nodes(handoff, agent, visited))

     return "".join(parts)


-def get_all_edges(agent: Agent, parent: Optional[Agent] = None) -> str:
+def get_all_edges(
+    agent: Agent, parent: Agent | None = None, visited: set[str] | None = None
+) -> str:
     """
     Recursively generates the edges for the given agent and its handoffs in DOT format.

@@ -92,6 +103,12 @@ def get_all_edges(agent: Agent, parent: Optional[Agent] = None) -> str:
     Returns:
         str: The DOT format string representing the edges.
     """
+    if visited is None:
+        visited = set()
+    if agent.name in visited:
+        return ""
+    visited.add(agent.name)
+
     parts = []

     if not parent:

@@ -109,15 +126,15 @@ def get_all_edges(agent: Agent, parent: Optional[Agent] = None) -> str:
         if isinstance(handoff, Agent):
             parts.append(f"""
     "{agent.name}" -> "{handoff.name}";""")
-            parts.append(get_all_edges(handoff, agent))
+            parts.append(get_all_edges(handoff, agent, visited))

         if not agent.handoffs and not isinstance(agent, Tool):  # type: ignore
             parts.append(f'"{agent.name}" -> "__end__";')

     return "".join(parts)


-def draw_graph(agent: Agent, filename: Optional[str] = None) -> graphviz.Source:
+def draw_graph(agent: Agent, filename: str | None = None) -> graphviz.Source:
     """
     Draws the graph for the given agent and optionally saves it as a PNG file.
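The visited-set pattern this change introduces — thread a set of seen names through the recursion so mutually-referencing agents (A hands off to B, B hands off to A) terminate instead of recursing forever — can be sketched independently of the SDK. The `collect_nodes` function and dict-based "agents" below are illustrative stand-ins, not the SDK's API.

```python
def collect_nodes(agent, visited=None):
    """Walk an agent's handoff graph, emitting each agent name once."""
    # Track names already emitted so cycles (A -> B -> A) terminate
    # instead of overflowing the stack.
    if visited is None:
        visited = set()
    if agent["name"] in visited:
        return []
    visited.add(agent["name"])
    nodes = [agent["name"]]
    for handoff in agent["handoffs"]:
        # Pass the SAME visited set down, so state is shared across the recursion.
        nodes.extend(collect_nodes(handoff, visited))
    return nodes


# Build a two-node cycle: A -> B and B -> A.
a = {"name": "A", "handoffs": []}
b = {"name": "B", "handoffs": [a]}
a["handoffs"].append(b)

print(collect_nodes(a))  # ['A', 'B']
```

The key design point mirrored from the diff is passing `visited` through every recursive call; a fresh set per call would defeat the cycle check.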

src/agents/models/openai_chatcompletions.py

Lines changed: 17 additions & 5 deletions

@@ -71,12 +71,22 @@ async def get_response(
             stream=False,
         )

+        first_choice = response.choices[0]
+        message = first_choice.message
+
         if _debug.DONT_LOG_MODEL_DATA:
             logger.debug("Received model response")
         else:
-            logger.debug(
-                f"LLM resp:\n{json.dumps(response.choices[0].message.model_dump(), indent=2)}\n"
-            )
+            if message is not None:
+                logger.debug(
+                    "LLM resp:\n%s\n",
+                    json.dumps(message.model_dump(), indent=2),
+                )
+            else:
+                logger.debug(
+                    "LLM resp had no message. finish_reason: %s",
+                    first_choice.finish_reason,
+                )

         usage = (
             Usage(

@@ -101,13 +111,15 @@ async def get_response(
             else Usage()
         )
         if tracing.include_data():
-            span_generation.span_data.output = [response.choices[0].message.model_dump()]
+            span_generation.span_data.output = (
+                [message.model_dump()] if message is not None else []
+            )
             span_generation.span_data.usage = {
                 "input_tokens": usage.input_tokens,
                 "output_tokens": usage.output_tokens,
            }

-        items = Converter.message_to_output_items(response.choices[0].message)
+        items = Converter.message_to_output_items(message) if message is not None else []

         return ModelResponse(
             output=items,
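The guard this change adds — treating a choice with no `message` as an empty output instead of raising `AttributeError` on `None` — can be sketched with plain dataclasses. The `Choice` dataclass and `to_output_items` helper below are illustrative stand-ins, not the real OpenAI response types.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Choice:
    finish_reason: str
    message: Optional[dict]  # can be None, e.g. when a content filter suppressed output


def to_output_items(choice: Choice) -> list:
    # Mirror the fix: fall back to an empty output list instead of
    # dereferencing choice.message when the model returned no message.
    if choice.message is None:
        return []
    return [choice.message]


print(to_output_items(Choice("content_filter", None)))  # []
print(to_output_items(Choice("stop", {"role": "assistant", "content": "hi"})))
```

Checking `message is not None` once, up front, keeps every downstream consumer (logging, tracing, output conversion) on the same defensive path, which is exactly how the diff restructures `get_response`.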

tests/test_openai_chatcompletions.py

Lines changed: 34 additions & 0 deletions

@@ -191,6 +191,40 @@ async def patched_fetch_response(self, *args, **kwargs):
     assert fn_call_item.arguments == "{'x':1}"


+@pytest.mark.allow_call_model_methods
+@pytest.mark.asyncio
+async def test_get_response_with_no_message(monkeypatch) -> None:
+    """If the model returns no message, get_response should return an empty output."""
+    msg = ChatCompletionMessage(role="assistant", content="ignored")
+    choice = Choice(index=0, finish_reason="content_filter", message=msg)
+    choice.message = None  # type: ignore[assignment]
+    chat = ChatCompletion(
+        id="resp-id",
+        created=0,
+        model="fake",
+        object="chat.completion",
+        choices=[choice],
+        usage=None,
+    )
+
+    async def patched_fetch_response(self, *args, **kwargs):
+        return chat
+
+    monkeypatch.setattr(OpenAIChatCompletionsModel, "_fetch_response", patched_fetch_response)
+    model = OpenAIProvider(use_responses=False).get_model("gpt-4")
+    resp: ModelResponse = await model.get_response(
+        system_instructions=None,
+        input="",
+        model_settings=ModelSettings(),
+        tools=[],
+        output_schema=None,
+        handoffs=[],
+        tracing=ModelTracing.DISABLED,
+        previous_response_id=None,
+    )
+    assert resp.output == []
+
+
 @pytest.mark.asyncio
 async def test_fetch_response_non_stream(monkeypatch) -> None:
     """

tests/test_visualization.py

Lines changed: 15 additions & 0 deletions

@@ -134,3 +134,18 @@ def test_draw_graph(mock_agent):
         '"Handoff1" [label="Handoff1", shape=box, style=filled, style=rounded, '
         "fillcolor=lightyellow, width=1.5, height=0.8];" in graph.source
     )
+
+
+def test_cycle_detection():
+    agent_a = Agent(name="A")
+    agent_b = Agent(name="B")
+    agent_a.handoffs.append(agent_b)
+    agent_b.handoffs.append(agent_a)
+
+    nodes = get_all_nodes(agent_a)
+    edges = get_all_edges(agent_a)
+
+    assert nodes.count('"A" [label="A"') == 1
+    assert nodes.count('"B" [label="B"') == 1
+    assert '"A" -> "B"' in edges
+    assert '"B" -> "A"' in edges
