1 parent 420ea0a commit f832d99
src/mcp/server/session.py
@@ -240,7 +240,8 @@ async def create_message(
             max_tokens: Maximum number of tokens to generate.
             system_prompt: Optional system prompt.
             include_context: Optional context inclusion setting.
-                Requires client to have sampling.context capability.
+                Should only be set to "thisServer" or "allServers"
+                if the client has sampling.context capability.
             temperature: Optional sampling temperature.
             stop_sequences: Optional stop sequences.
             metadata: Optional metadata to pass through to the LLM provider.
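The updated docstring encodes a capability gate: a server should only pass "thisServer" or "allServers" for include_context when the connected client advertised the sampling.context capability. A minimal sketch of that rule, using a hypothetical helper and a plain dict standing in for the client's declared capabilities (choose_include_context is illustrative, not part of the SDK):

```python
# Hypothetical helper: pick an include_context value that respects the
# documented rule. `client_capabilities` stands in for the capabilities
# the client declared during initialization.
def choose_include_context(client_capabilities: dict, preferred: str = "thisServer") -> str:
    """Return `preferred` only if the client declared sampling.context;
    otherwise fall back to "none"."""
    sampling = client_capabilities.get("sampling") or {}
    if "context" in sampling:
        return preferred  # safe to request "thisServer" / "allServers"
    return "none"  # client cannot supply context; do not request it
```

The fallback to "none" keeps create_message usable with clients that support sampling but not context inclusion.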