
Commit 46f6e0a

Merge branch 'main' into main-sse-disconnect
2 parents 7de20f5 + 8b58386
2 files changed: +2 -1 lines changed


README.md (1 addition, 1 deletion)

````diff
@@ -619,7 +619,7 @@ server = Server("example-server", lifespan=server_lifespan)
 # Access lifespan context in handlers
 @server.call_tool()
 async def query_db(name: str, arguments: dict) -> list:
-    ctx = server.request_context
+    ctx = server.get_context()
     db = ctx.lifespan_context["db"]
     return await db.query(arguments["query"])
 ```
````
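The README change above swaps an attribute read (`server.request_context`) for a `get_context()` call inside a handler that pulls a database handle out of the lifespan context. The general pattern can be sketched in a self-contained way; note that every class below (`Server`, `Context`, `Database`, `server_lifespan`) is an illustrative stand-in written for this sketch, not the MCP SDK's actual types or API:

```python
# Minimal sketch of the lifespan-context pattern from the diff above.
# All names here are illustrative stand-ins, NOT the real MCP SDK API.
import asyncio
from contextlib import asynccontextmanager


class Database:
    """Stand-in for a real database connection."""

    async def query(self, sql: str) -> list:
        return [f"result for {sql!r}"]


class Context:
    def __init__(self, lifespan_context: dict):
        self.lifespan_context = lifespan_context


class Server:
    def __init__(self, name: str, lifespan):
        self.name = name
        self._lifespan = lifespan
        self._ctx = None

    def get_context(self) -> Context:
        # Mirrors the change in the diff: handlers call get_context()
        # rather than reading a request_context attribute.
        return self._ctx

    async def run(self, handler, arguments: dict):
        # Enter the lifespan once, expose its yielded dict to handlers.
        async with self._lifespan(self) as lifespan_context:
            self._ctx = Context(lifespan_context)
            return await handler("query_db", arguments)


@asynccontextmanager
async def server_lifespan(server: Server):
    db = Database()       # acquire shared resources on startup
    try:
        yield {"db": db}  # becomes ctx.lifespan_context in handlers
    finally:
        pass              # release resources on shutdown


server = Server("example-server", lifespan=server_lifespan)


async def query_db(name: str, arguments: dict) -> list:
    ctx = server.get_context()
    db = ctx.lifespan_context["db"]
    return await db.query(arguments["query"])


print(asyncio.run(server.run(query_db, {"query": "SELECT 1"})))
```

The point of the pattern is that startup/shutdown resources (here, the fake `Database`) are created once in the lifespan and reached from every handler via the context object, rather than via module globals.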

examples/clients/simple-chatbot/README.MD (1 addition, 0 deletions)

````diff
@@ -25,6 +25,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
 ```plaintext
 LLM_API_KEY=your_api_key_here
 ```
+**Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.
 
 3. **Configure servers:**
````
0 commit comments
