Commit 7d844d5

docs: add MCP integration documentation to README
1 parent: d108739

File tree

1 file changed: +42 −0 lines


README.md

Lines changed: 42 additions & 0 deletions
@@ -21,6 +21,7 @@ A modern C++20 library providing a unified interface for Large Language Model AP
 - **📝 Flexible input**: Support for both simple prompts and structured context
 - **🎯 Type-safe models**: Strongly typed Model enum for compile-time safety
 - **📊 Performance benchmarks**: Comprehensive model comparison and cost analysis
+- **🔌 MCP Integration**: Model Context Protocol support for external tool integration

 ## Quick Start

@@ -366,6 +367,7 @@ target_link_libraries(my_target PRIVATE llmcpp)
 - **`LLMRequest`/`LLMResponse`**: Unified request/response types with flexible input mapping
 - **`ClientManager`**: Smart pointer-based client management
 - **`LLMRequestConfig`**: Configuration for models and parameters
+- **`OpenAIMcpUtils`**: Model Context Protocol utilities for external tool integration

 ### Request Structure

@@ -390,6 +392,7 @@ This design allows for:
 - Function calling and tool usage
 - Error handling and usage tracking
 - Flexible input mapping (prompt → instructions, context → input)
+- Model Context Protocol (MCP) integration

 ## Building

@@ -645,6 +648,45 @@ auto stringArray = JsonSchemaBuilder::arrayOf(JsonSchemaBuilder::string());
 auto statusEnum = JsonSchemaBuilder::stringEnum({"active", "inactive", "pending"});
 ```

+### Model Context Protocol (MCP) Integration
+
+llmcpp includes utilities for integrating external tools via the Model Context Protocol:
+
+```cpp
+#include <openai/OpenAIMcpUtils.h>
+
+// Convert MCP tools to OpenAI tool definitions
+std::vector<json> mcpTools = getMcpToolsFromServer();
+std::vector<OpenAI::ToolDefinition> openaiTools = OpenAI::convertMcpToolsToOpenAI(mcpTools);
+
+// Add MCP tools to your request
+LLMRequestConfig config;
+config.model = "gpt-4o-mini";
+config.tools = openaiTools;
+
+LLMRequest request(config, "Your prompt here");
+auto response = client.sendRequest(request);
+
+// Handle tool calls
+if (response.hasToolCalls()) {
+    for (const auto& toolCall : response.getToolCalls()) {
+        // Execute MCP tool and get result
+        json toolResult = executeMcpTool(toolCall.name, toolCall.arguments);
+
+        // Send result back to continue conversation
+        // ...
+    }
+}
+```
+
+**MCP Features:**
+- **Tool Discovery**: Automatically convert MCP tool definitions to OpenAI format
+- **Type Mapping**: Seamless conversion between MCP and OpenAI schemas
+- **Tool Execution**: Easy integration with MCP servers
+- **Error Handling**: Robust error handling for MCP operations
+
+For more details on MCP integration, see the [MCP integration tests](tests/integration/test_mcp_integration.cpp).
+
 ## Testing

 ### Unit Tests
