Description
FoundryChatClient sample uses async with but client is not an async context manager
Summary
Running the "built-in chat clients" sample with client_name="foundry_chat" fails with:
TypeError: FoundryChatClient object does not support the asynchronous context manager protocol (missed __aexit__ method)
The sample wraps the Foundry client in async with client:, but the installed FoundryChatClient implementation does not implement the async context manager protocol (__aenter__ / __aexit__), so execution fails.
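For reference, Python's async context manager protocol requires both dunder methods; `async with` checks for them up front. A minimal sketch (the class names here are hypothetical stand-ins, not agent_framework types) showing why a missing `__aexit__` fails:

```python
import asyncio


class GoodClient:
    """Minimal async context manager: both dunder methods are defined."""

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        return None  # returning a falsy value does not suppress exceptions


class BadClient:
    """Only __aenter__ is defined, like the failing FoundryChatClient."""

    async def __aenter__(self):
        return self


async def demo() -> tuple[bool, bool]:
    async with GoodClient():
        entered_ok = True
    try:
        async with BadClient():  # missing __aexit__
            pass
        raised = False
    # TypeError on Python 3.11+, AttributeError on older versions
    except (TypeError, AttributeError):
        raised = True
    return entered_ok, raised


print(asyncio.run(demo()))  # → (True, True)
```

On Python 3.12+ the TypeError message even names the missing method, which matches the traceback in this report.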
Code Sample
# Working solution
import asyncio
import os
from random import randint
from typing import Annotated, Any, Literal

from agent_framework import Agent, Message, SupportsChatGetResponse, tool
from agent_framework.foundry import FoundryChatClient
from agent_framework.openai import OpenAIChatClient, OpenAIChatCompletionClient
from azure.identity import AzureCliCredential
from dotenv import load_dotenv
from pydantic import Field

# Load environment variables from .env file
load_dotenv()

"""
Built-in Chat Clients Example

This sample demonstrates how to run the same prompt flow against different built-in
chat clients using a single `get_client` factory.

Select one of these client names:
- openai_chat
- openai_chat_completion
- anthropic
- ollama
- bedrock
- azure_openai_chat
- azure_openai_chat_completion
- foundry_chat
"""

ClientName = Literal[
    "openai_chat",
    "openai_chat_completion",
    "anthropic",
    "ollama",
    "bedrock",
    "azure_openai_chat",
    "azure_openai_chat_completion",
    "foundry_chat",
]


# NOTE: approval_mode="never_require" is for sample brevity.
@tool(approval_mode="never_require")
def get_weather(
    location: Annotated[str, Field(description="The location to get the weather for.")],
) -> str:
    """Get the weather for a given location."""
    conditions = ["sunny", "cloudy", "rainy", "stormy"]
    return f"The weather in {location} is {conditions[randint(0, 3)]} with a high of {randint(10, 30)}°C."


def get_client(client_name: ClientName) -> SupportsChatGetResponse[Any]:
    """Create a built-in chat client from a name."""
    from agent_framework.amazon import BedrockChatClient
    from agent_framework.anthropic import AnthropicClient
    from agent_framework.ollama import OllamaChatClient

    if client_name == "openai_chat":
        return OpenAIChatClient()
    if client_name == "openai_chat_completion":
        return OpenAIChatCompletionClient()
    if client_name == "anthropic":
        return AnthropicClient()
    if client_name == "ollama":
        return OllamaChatClient()
    if client_name == "bedrock":
        return BedrockChatClient()
    if client_name == "azure_openai_chat":
        return OpenAIChatClient(credential=AzureCliCredential())
    if client_name == "azure_openai_chat_completion":
        return OpenAIChatCompletionClient(credential=AzureCliCredential())
    if client_name == "foundry_chat":
        return FoundryChatClient(
            project_endpoint=os.environ["FOUNDRY_PROJECT_ENDPOINT"],
            model=os.environ["FOUNDRY_MODEL"],
            credential=AzureCliCredential(),
        )
    raise ValueError(f"Unsupported client name: {client_name}")


async def main(client_name: ClientName = "openai_chat") -> None:
    """Run a basic prompt using a selected built-in client."""
    client = get_client(client_name)
    message = Message("user", contents=["What's the weather in Amsterdam and in Paris?"])
    stream = os.getenv("STREAM", "false").lower() == "true"

    print(f"Client: {client_name}")
    print(f"User: {message.text}")

    if isinstance(client, FoundryChatClient):
        agent = Agent(
            client=client,
            instructions="You are a helpful assistant.",
            tools=get_weather,
        )
        if stream:
            print("Assistant: ", end="")
            async for chunk in agent.run(message.text, stream=True):
                if chunk.text:
                    print(chunk.text, end="")
            print("")
        else:
            print(f"Assistant: {await agent.run(message.text)}")
        return

    if stream:
        response_stream = client.get_response([message], stream=True, options={"tools": get_weather})
        print("Assistant: ", end="")
        async for chunk in response_stream:
            if chunk.text:
                print(chunk.text, end="")
        print("")
    else:
        print(f"Assistant: {await client.get_response([message], stream=False, options={'tools': get_weather})}")


if __name__ == "__main__":
    asyncio.run(main("foundry_chat"))
Error Messages / Stack Traces
Client: foundry_chat
User: What's the weather in Amsterdam and in Paris?
Traceback (most recent call last):
  File "/Users/damian/Projects/personal/agent-framework-test/build_in_chat_clients.py", line 123, in <module>
    asyncio.run(main("foundry_chat"))
    ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.14/3.14.3_1/Frameworks/Python.framework/Versions/3.14/lib/python3.14/asyncio/runners.py", line 204, in run
    return runner.run(main)
    ~~~~~~~~~~^^^^^^
  File "/opt/homebrew/Cellar/python@3.14/3.14.3_1/Frameworks/Python.framework/Versions/3.14/lib/python3.14/asyncio/runners.py", line 127, in run
    return self._loop.run_until_complete(task)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/opt/homebrew/Cellar/python@3.14/3.14.3_1/Frameworks/Python.framework/Versions/3.14/lib/python3.14/asyncio/base_events.py", line 719, in run_until_complete
    return future.result()
    ~~~~~~~~~~~~~^^
  File "/Users/damian/Projects/personal/agent-framework-test/build_in_chat_clients.py", line 97, in main
    async with client:
    ^^^^^^
TypeError: 'agent_framework_foundry._chat_client.FoundryChatClient' object does not support the asynchronous context manager protocol (missed __aexit__ method)
Package Versions
agent-framework-core Version: 1.0.1; agent-framework-foundry Version: 1.0.1
Python Version
Python: 3.14.3 (Homebrew)
Additional Context
The official Foundry sample 02-agents/built_in_chat_clients.py in this repo uses FoundryChatClient without async with client:; it wraps the client in an Agent and calls await agent.run(...). This suggests that FoundryChatClient may not be intended to be used as an async context manager, making the built-in clients sample inconsistent.
Workaround
Remove the async with client: block for Foundry clients and call await client.get_response(...) directly, or follow the pattern used in built_in_chat_clients.py (wrap the client in an Agent and call await agent.run(...)).
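A more generic variant of the first workaround (a sketch, not the framework's API) is to enter the client only when it actually implements the protocol, so one code path handles both kinds of clients. `maybe_enter`, `DummyManaged`, and `DummyPlain` below are hypothetical stand-ins, not agent_framework types:

```python
import asyncio
from contextlib import AsyncExitStack
from typing import Any


async def maybe_enter(stack: AsyncExitStack, client: Any) -> Any:
    """Enter `client` via the stack only if it supports `async with`."""
    cls = type(client)
    if hasattr(cls, "__aenter__") and hasattr(cls, "__aexit__"):
        return await stack.enter_async_context(client)
    return client  # e.g. FoundryChatClient: use directly, no async with


class DummyManaged:
    """Stand-in for a client that implements the async CM protocol."""

    entered = False

    async def __aenter__(self):
        self.entered = True
        return self

    async def __aexit__(self, *exc):
        return None


class DummyPlain:
    """Stand-in for a client without the protocol (like FoundryChatClient)."""

    entered = False


async def demo() -> tuple[bool, bool]:
    async with AsyncExitStack() as stack:
        managed = await maybe_enter(stack, DummyManaged())
        plain = await maybe_enter(stack, DummyPlain())
        return managed.entered, plain.entered


print(asyncio.run(demo()))  # → (True, False)
```

With this guard, the sample's Foundry special-casing could be limited to the Agent wrapping, while cleanup still happens for clients that do implement the protocol.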
Environment
OS: macOS (Apple Silicon)
Azure Functions Core Tools: 4.9.0 (not required to reproduce this specific error)
Auth: AzureCliCredential (az login)