
Commit 94bc438

point documentation changes to "draft"
1 parent e16629e commit 94bc438

2 files changed: +14 -14 lines changed


docs/specification/2024-11-05/client/sampling.md

Lines changed: 1 addition & 11 deletions
@@ -8,7 +8,7 @@ weight: 40
 **Protocol Revision**: {{< param protocolRevision >}}
 {{< /callout >}}

-The Model Context Protocol (MCP) provides a standardized way for servers to request LLM sampling ("completions" or "generations") from language models via clients. This flow allows clients to maintain control over model access, selection, and permissions while enabling servers to leverage AI capabilities&mdash;with no server API keys necessary. Servers can request text, audio, or image-based interactions and optionally include context from MCP servers in their prompts.
+The Model Context Protocol (MCP) provides a standardized way for servers to request LLM sampling ("completions" or "generations") from language models via clients. This flow allows clients to maintain control over model access, selection, and permissions while enabling servers to leverage AI capabilities&mdash;with no server API keys necessary. Servers can request text or image-based interactions and optionally include context from MCP servers in their prompts.


 ## User Interaction Model

@@ -142,16 +142,6 @@ Sampling messages can contain:
 }
 ```

-#### Audio Content
-```json
-{
-  "type": "audio",
-  "data": "base64-encoded-audio-data",
-  "mimeType": "audio/wav"
-}
-```
-
-
 ### Model Preferences

 Model selection in MCP requires careful abstraction since servers and clients may use different AI providers with distinct model offerings. A server cannot simply request a specific model by name since the client may not have access to that exact model or may prefer to use a different provider's equivalent model.
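For reference, the model preferences object that the paragraph above introduces is expressed as optional name hints plus 0–1 priority weights. The sketch below is illustrative only (field values are made up) and is not part of this commit's diff:

```json
{
  "hints": [{ "name": "claude-3-sonnet" }],
  "costPriority": 0.3,
  "speedPriority": 0.8,
  "intelligencePriority": 0.5
}
```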

docs/specification/draft/client/sampling.md

Lines changed: 13 additions & 3 deletions
@@ -5,10 +5,10 @@ weight: 40
 ---

 {{< callout type="info" >}}
-**Protocol Revision**: draft
+**Protocol Revision**: {{< param protocolRevision >}}
 {{< /callout >}}

-The Model Context Protocol (MCP) provides a standardized way for servers to request LLM sampling ("completions" or "generations") from language models via clients. This flow allows clients to maintain control over model access, selection, and permissions while enabling servers to leverage AI capabilities&mdash;with no server API keys necessary. Servers can request text or image-based interactions and optionally include context from MCP servers in their prompts.
+The Model Context Protocol (MCP) provides a standardized way for servers to request LLM sampling ("completions" or "generations") from language models via clients. This flow allows clients to maintain control over model access, selection, and permissions while enabling servers to leverage AI capabilities&mdash;with no server API keys necessary. Servers can request text, audio, or image-based interactions and optionally include context from MCP servers in their prompts.


 ## User Interaction Model

@@ -27,7 +27,7 @@ Implementations are free to expose sampling through any interface pattern that s

 ## Capabilities

-Clients that support sampling **MUST** declare the `sampling` capability during [initialization]({{< ref "/specification/draft/basic/lifecycle#initialization" >}}):
+Clients that support sampling **MUST** declare the `sampling` capability during [initialization]({{< ref "/specification/2024-11-05/basic/lifecycle#initialization" >}}):

 ```json
 {
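For context, the capability declaration that this hunk leads into is a minimal JSON object; the sketch below is reproduced for readability and is not itself part of the diff:

```json
{
  "capabilities": {
    "sampling": {}
  }
}
```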
@@ -142,6 +142,16 @@ Sampling messages can contain:
 }
 ```

+#### Audio Content
+```json
+{
+  "type": "audio",
+  "data": "base64-encoded-audio-data",
+  "mimeType": "audio/wav"
+}
+```
+
+
 ### Model Preferences

 Model selection in MCP requires careful abstraction since servers and clients may use different AI providers with distinct model offerings. A server cannot simply request a specific model by name since the client may not have access to that exact model or may prefer to use a different provider's equivalent model.
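To show how the audio block added above would appear on the wire, a hypothetical `sampling/createMessage` request carrying audio content might look like the following; the surrounding request fields are an illustrative sketch and are not part of this commit:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "sampling/createMessage",
  "params": {
    "messages": [
      {
        "role": "user",
        "content": {
          "type": "audio",
          "data": "base64-encoded-audio-data",
          "mimeType": "audio/wav"
        }
      }
    ],
    "maxTokens": 100
  }
}
```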
