Commit ed5bb56

Enhance documentation on MCP tools
1 parent 3355dad commit ed5bb56

File tree

1 file changed: +2 −0 lines


docs/ai/microsoft-extensions-ai.md

Lines changed: 2 additions & 0 deletions
@@ -100,6 +100,8 @@ The preceding code:
 - Calls `GetStreamingResponseAsync` on the client, passing a prompt and a list of tools that includes a function created with <xref:Microsoft.Extensions.AI.AIFunctionFactory.Create*>.
 - Iterates over the response, printing each update to the console.
 
+You can also use Model Context Protocol (MCP) tools with your `IChatClient`. For more details, see the [build a minimal MCP client](./quickstarts/build-mcp-client.md) article.
+
 #### Cache responses
 
 If you're familiar with [Caching in .NET](../core/extensions/caching.md), it's good to know that <xref:Microsoft.Extensions.AI> provides other such delegating `IChatClient` implementations. The <xref:Microsoft.Extensions.AI.DistributedCachingChatClient> is an `IChatClient` that layers caching around another arbitrary `IChatClient` instance. When a novel chat history is submitted to the `DistributedCachingChatClient`, it forwards it to the underlying client and then caches the response before sending it back to the consumer. The next time the same history is submitted, such that a cached response can be found in the cache, the `DistributedCachingChatClient` returns the cached response rather than forwarding the request along the pipeline.
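The lines this commit adds point to MCP tools, and the surrounding context describes passing a tool created with `AIFunctionFactory.Create` to `GetStreamingResponseAsync`. The following is a minimal sketch (not part of the commit) of how those pieces might fit together, assuming the `Microsoft.Extensions.AI` and `ModelContextProtocol` packages; the `innerClient` parameter, the `GetWeather` function, and the MCP server name, command, and arguments are placeholders.

```csharp
using System.ComponentModel;
using Microsoft.Extensions.AI;
using ModelContextProtocol.Client;

static class StreamingWithToolsSample
{
    // Placeholder local .NET method exposed to the model as a tool.
    [Description("Gets the current weather.")]
    static string GetWeather() => Random.Shared.NextDouble() > 0.5 ? "It's sunny" : "It's raining";

    // `innerClient` stands in for any concrete IChatClient (OpenAI, Azure, Ollama, and so on).
    public static async Task RunAsync(IChatClient innerClient)
    {
        // Enable automatic invocation of tools the model requests.
        IChatClient client = new ChatClientBuilder(innerClient)
            .UseFunctionInvocation()
            .Build();

        // Connect to an MCP server over stdio; the command and arguments are placeholders.
        IMcpClient mcpClient = await McpClientFactory.CreateAsync(
            new StdioClientTransport(new StdioClientTransportOptions
            {
                Name = "SampleServer",
                Command = "dotnet",
                Arguments = ["run", "--project", "path/to/your/mcp-server"],
            }));
        IList<McpClientTool> mcpTools = await mcpClient.ListToolsAsync();

        // McpClientTool derives from AIFunction, so MCP tools and local functions share one list.
        var options = new ChatOptions
        {
            Tools = [AIFunctionFactory.Create(GetWeather), .. mcpTools],
        };

        // Stream the response, printing each update to the console.
        await foreach (var update in client.GetStreamingResponseAsync(
            "Should I bring an umbrella today?", options))
        {
            Console.Write(update);
        }
    }
}
```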

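For the "Cache responses" paragraph in the diff context above, here is a similarly hedged sketch of layering `DistributedCachingChatClient` around another client via `ChatClientBuilder.UseDistributedCache`, using an in-memory `IDistributedCache` purely for illustration; `innerClient` and the prompt are placeholders, not part of the commit.

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;

static class CachedChatSample
{
    public static async Task RunAsync(IChatClient innerClient)
    {
        // An in-memory IDistributedCache stands in for a real distributed cache (for example, Redis).
        IDistributedCache cache = new MemoryDistributedCache(
            Options.Create(new MemoryDistributedCacheOptions()));

        // Layers a DistributedCachingChatClient around innerClient.
        IChatClient client = new ChatClientBuilder(innerClient)
            .UseDistributedCache(cache)
            .Build();

        // The first call is forwarded to innerClient and its response is cached;
        // submitting the same chat history again is answered from the cache.
        Console.WriteLine(await client.GetResponseAsync("What color is the sky?"));
        Console.WriteLine(await client.GetResponseAsync("What color is the sky?"));
    }
}
```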