
Conversation

@howieleung
Member

This is a POC to be extended to all tests for tools.

  1. Some samples, such as this AI Search sample, accept user input. These prompts can now be skipped by supplying the input through an environment variable (a sketch of the overall pattern follows this list).
  2. Use try/catch/finally, with the finally block handling teardown.
  3. Assert the output at the end.
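
A minimal sketch of the intended pattern, for illustration only. AI_SEARCH_USER_INPUT is the variable this PR introduces, but the prompt text and the create/run/delete helpers below are placeholders, not the sample's real SDK calls:

import asyncio
import os


async def create_agent():
    # Placeholder for the sample's real agent setup (illustrative, not the SDK call).
    return {"name": "sample-agent"}


async def run_agent(agent, user_input: str) -> str:
    # Placeholder for the sample's real run; just echoes the input here.
    return f"{agent['name']} processed: {user_input}"


async def delete_agent(agent) -> None:
    # Placeholder teardown.
    print(f"Deleted {agent['name']}")


async def main() -> None:
    # 1. Read the query from an env var when set, so automated runs skip the prompt.
    user_input = os.environ.get("AI_SEARCH_USER_INPUT") or input("Enter your query: ")

    agent = None
    output = None
    try:
        # 2. Do the work inside try so teardown always runs, even on failure.
        agent = await create_agent()
        output = await run_agent(agent, user_input)
    finally:
        if agent:
            await delete_agent(agent)

    # 3. Assert on the output so a silent failure still fails the run.
    assert output, "Expected a non-empty response from the agent"


if __name__ == "__main__":
    asyncio.run(main())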

Copilot AI review requested due to automatic review settings November 21, 2025 21:39
@howieleung howieleung marked this pull request as draft November 21, 2025 21:41
Copilot finished reviewing on behalf of howieleung November 21, 2025 21:42
Contributor

Copilot AI left a comment


Pull request overview

This PR introduces improvements to make samples more testable by adding environment variable support for user input, implementing proper try-catch-finally cleanup patterns, and adding assertions to validate outputs. The PR also reorganizes MCP sample files by moving sample_mcp_tool_async.py to a new mcp_client directory with supporting assets.

Key changes:

  • Added optional environment variable AI_SEARCH_USER_INPUT to skip interactive prompts in the AI Search sample
  • Implemented try-finally pattern for reliable agent cleanup in sample_agent_ai_search.py
  • Added output assertion to validate sample execution success
  • Relocated MCP tool sample to dedicated mcp_client directory with improved documentation

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 5 comments.

File | Description
sdk/ai/azure-ai-projects/samples/mcp_client/sample_mcp_tool_async.py | New location for MCP sample with enhanced documentation and workflow description
sdk/ai/azure-ai-projects/samples/mcp_client/assets/product_info.md | Asset file for file search demonstration in MCP sample
sdk/ai/azure-ai-projects/samples/agents/tools/sample_agent_ai_search.py | Refactored with main() function, environment variable support, try-finally cleanup, and output assertions
sdk/ai/azure-ai-projects/samples/agents/sample_mcp_tool_async_.py | Removed old sample location (appears to be deleted/moved)
sdk/ai/azure-ai-projects/CHANGELOG.md | Updated to document the improved MCP client sample

Comment on lines +121 to +127
vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
    vector_store_id=vector_store.id,
    file=open(
        os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
        "rb",
    ),
)

Copilot AI Nov 21, 2025


The file handle opened on line 123 is never closed. While the async with context manager will clean up the session, the file opened with open() should be properly closed to prevent resource leaks.

Consider using a context manager:

with open(
    os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
    "rb",
) as f:
    vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
        vector_store_id=vector_store.id,
        file=f,
    )

Comment on lines +116 to +137
# Create a vector store
vector_store = await openai_client.vector_stores.create(
    name="sample_vector_store",
)

vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
    vector_store_id=vector_store.id,
    file=open(
        os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
        "rb",
    ),
)

print(f"\n\nUploaded file, file ID: {vector_store_file.id} to vector store ID: {vector_store.id}")

# Call the file_search tool
file_search_result = await session.call_tool(
    name="file_search",
    arguments={"queries": ["What feature does Smart Eyewear offer?"]},
    meta={"vector_store_ids": [vector_store.id]},
)
print(f"\n\nFile Search Output: {file_search_result.content}")

Copilot AI Nov 21, 2025


The sample creates a vector store but never cleans it up. This can lead to resource accumulation over time. Consider adding cleanup in a try-finally block to ensure the vector store is deleted even if an error occurs, similar to how other samples handle agent cleanup.

Example pattern:

vector_store = None
try:
    vector_store = await openai_client.vector_stores.create(...)
    # ... use vector store ...
finally:
    if vector_store:
        await openai_client.vector_stores.delete(vector_store_id=vector_store.id)
Suggested change

# Create a vector store and ensure cleanup
vector_store = None
try:
    vector_store = await openai_client.vector_stores.create(
        name="sample_vector_store",
    )
    vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
        vector_store_id=vector_store.id,
        file=open(
            os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
            "rb",
        ),
    )
    print(f"\n\nUploaded file, file ID: {vector_store_file.id} to vector store ID: {vector_store.id}")

    # Call the file_search tool
    file_search_result = await session.call_tool(
        name="file_search",
        arguments={"queries": ["What feature does Smart Eyewear offer?"]},
        meta={"vector_store_ids": [vector_store.id]},
    )
    print(f"\n\nFile Search Output: {file_search_result.content}")
finally:
    if vector_store:
        await openai_client.vector_stores.delete(vector_store_id=vector_store.id)

Comment on lines +71 to +139

async with (
    DefaultAzureCredential() as credential,
    AIProjectClient(endpoint=endpoint, credential=credential) as project_client,
    project_client.get_openai_client() as openai_client,
    streamablehttp_client(
        url=f"{endpoint}/mcp_tools?api-version=2025-05-15-preview",
        headers={"Authorization": f"Bearer {(await credential.get_token('https://ai.azure.com')).token}"},
    ) as (read_stream, write_stream, _),
    ClientSession(read_stream, write_stream) as session,
):

    # Initialize the connection
    await session.initialize()
    # List available tools
    tools = await session.list_tools()
    print(f"Available tools: {[tool.name for tool in tools.tools]}")

    # For each tool, print its details
    for tool in tools.tools:
        print(f"\n\nTool Name: {tool.name}, Input Schema: {tool.inputSchema}")

    # Run the code interpreter tool
    code_interpreter_result = await session.call_tool(
        name="code_interpreter",
        arguments={"code": "print('Hello from Microsoft Foundry MCP Code Interpreter tool!')"},
    )
    print(f"\n\nCode Interpreter Output: {code_interpreter_result.content}")

    # Run the image_generation tool
    image_generation_result = await session.call_tool(
        name="image_generation",
        arguments={"prompt": "Draw a cute puppy riding a skateboard"},
        meta={"imagegen_model_deployment_name": os.getenv("IMAGE_GEN_DEPLOYMENT_NAME", "")},
    )

    # Save the image generation output to a file
    if image_generation_result.content and isinstance(image_generation_result.content[0], ImageContent):
        filename = "puppy.png"
        file_path = os.path.abspath(filename)
        print(f"\nImage saved to: {file_path}")

        with open(file_path, "wb") as f:
            f.write(base64.b64decode(image_generation_result.content[0].data))

    # Create a vector store
    vector_store = await openai_client.vector_stores.create(
        name="sample_vector_store",
    )

    vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
        vector_store_id=vector_store.id,
        file=open(
            os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
            "rb",
        ),
    )

    print(f"\n\nUploaded file, file ID: {vector_store_file.id} to vector store ID: {vector_store.id}")

    # Call the file_search tool
    file_search_result = await session.call_tool(
        name="file_search",
        arguments={"queries": ["What feature does Smart Eyewear offer?"]},
        meta={"vector_store_ids": [vector_store.id]},
    )
    print(f"\n\nFile Search Output: {file_search_result.content}")



Copilot AI Nov 21, 2025


[nitpick] The sample creates a file (puppy.png) but doesn't clean it up afterwards. For a sample that may be run multiple times, consider cleaning up generated artifacts.

You could add cleanup after the sample completes:

try:
    # ... existing code ...
finally:
    # Clean up generated files
    if os.path.exists("puppy.png"):
        os.remove("puppy.png")
Suggested change

try:
    async with (
        DefaultAzureCredential() as credential,
        AIProjectClient(endpoint=endpoint, credential=credential) as project_client,
        project_client.get_openai_client() as openai_client,
        streamablehttp_client(
            url=f"{endpoint}/mcp_tools?api-version=2025-05-15-preview",
            headers={"Authorization": f"Bearer {(await credential.get_token('https://ai.azure.com')).token}"},
        ) as (read_stream, write_stream, _),
        ClientSession(read_stream, write_stream) as session,
    ):
        # Initialize the connection
        await session.initialize()
        # List available tools
        tools = await session.list_tools()
        print(f"Available tools: {[tool.name for tool in tools.tools]}")

        # For each tool, print its details
        for tool in tools.tools:
            print(f"\n\nTool Name: {tool.name}, Input Schema: {tool.inputSchema}")

        # Run the code interpreter tool
        code_interpreter_result = await session.call_tool(
            name="code_interpreter",
            arguments={"code": "print('Hello from Microsoft Foundry MCP Code Interpreter tool!')"},
        )
        print(f"\n\nCode Interpreter Output: {code_interpreter_result.content}")

        # Run the image_generation tool
        image_generation_result = await session.call_tool(
            name="image_generation",
            arguments={"prompt": "Draw a cute puppy riding a skateboard"},
            meta={"imagegen_model_deployment_name": os.getenv("IMAGE_GEN_DEPLOYMENT_NAME", "")},
        )

        # Save the image generation output to a file
        if image_generation_result.content and isinstance(image_generation_result.content[0], ImageContent):
            filename = "puppy.png"
            file_path = os.path.abspath(filename)
            print(f"\nImage saved to: {file_path}")

            with open(file_path, "wb") as f:
                f.write(base64.b64decode(image_generation_result.content[0].data))

        # Create a vector store
        vector_store = await openai_client.vector_stores.create(
            name="sample_vector_store",
        )

        vector_store_file = await openai_client.vector_stores.files.upload_and_poll(
            vector_store_id=vector_store.id,
            file=open(
                os.path.abspath(os.path.join(os.path.dirname(__file__), "./assets/product_info.md")),
                "rb",
            ),
        )

        print(f"\n\nUploaded file, file ID: {vector_store_file.id} to vector store ID: {vector_store.id}")

        # Call the file_search tool
        file_search_result = await session.call_tool(
            name="file_search",
            arguments={"queries": ["What feature does Smart Eyewear offer?"]},
            meta={"vector_store_ids": [vector_store.id]},
        )
        print(f"\n\nFile Search Output: {file_search_result.content}")
finally:
    # Clean up generated files
    puppy_file = "puppy.png"
    if os.path.exists(puppy_file):
        os.remove(puppy_file)

extra_body={"agent": {"name": agent.name, "type": "agent_reference"}},
)

output = None

Copilot AI Nov 21, 2025


The output variable is set to None on line 102 but then only conditionally assigned within the event loop. If the loop completes without encountering a response.completed event, output remains None and the assertion on line 138 fails.

This initialization should be removed, since output is already initialized to None at the module level (line 45), and the assertion message should be made more descriptive about what went wrong.
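
For context, here is a self-contained sketch of the shape being suggested; the event names and attributes are assumptions based on this comment, not the sample's actual streaming API:

from types import SimpleNamespace

# Simulated events standing in for the sample's streamed response events
# (assumption: the real sample reacts to a "response.completed" event).
events = [
    SimpleNamespace(type="response.created"),
    SimpleNamespace(type="response.completed", output_text="Hello from the agent"),
]

output = None  # initialize once, at the scope the final assertion can see
for event in events:
    if event.type == "response.completed":
        output = event.output_text

# A descriptive assertion message makes the failure mode obvious when no
# completed event arrives and output stays None.
assert output is not None, "No response.completed event was received; the sample produced no output"
print(output)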

Comment on lines +127 to +128
print(f"Error occurred: {e}")
raise e

Copilot AI Nov 21, 2025


The exception is caught and printed but then re-raised, making the error message on line 127 redundant since the exception will propagate anyway with its original message. Consider either:

  1. Removing the print statement and just re-raising the exception
  2. Logging the error properly instead of printing
  3. Not re-raising if you want to handle it gracefully

Example:

except Exception as e:
    # Let the exception propagate with its original traceback
    raise
Suggested change

raise
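
If option 2 were preferred instead, a minimal sketch using the standard logging module could look like this (the logger name and the failing placeholder function are illustrative):

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("sample_agent_ai_search")  # logger name is illustrative


def run_sample() -> None:
    # Placeholder standing in for the sample's real work.
    raise RuntimeError("something went wrong")


try:
    run_sample()
except Exception:
    # logging.exception records the message plus the full traceback at ERROR level;
    # the bare raise then propagates the original exception unchanged.
    logger.exception("Sample run failed")
    raise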

Before running the sample:
pip install "azure-ai-projects>=2.0.0b1" azure-identity python-dotenv mcp
Member


azure-identity is no longer needed in this line after my PR goes in.

)
)
# [END tool_declaration]

Member


Do you need to refresh the code snippets in the package README.md?
