Hi there 👋,
Wonderful project! It serves as a great dev environment for building MCP servers.
That being said, I noticed one of the newer MCP features is missing, namely sampling.
Sampling allows an MCP server to run inference on the client. It's a kind of "bring your own LLM" in an agentic world.
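For context, on the server side this would look roughly like the sketch below (assuming FastMCP's `Context.sample` helper; the tool name and prompt are just illustrative):

```python
from fastmcp import FastMCP, Context

mcp = FastMCP("sampling-demo")

@mcp.tool()
async def summarize(text: str, ctx: Context) -> str:
    # Ask the connected client to run inference with its own LLM
    result = await ctx.sample(f"Summarize in one sentence:\n\n{text}")
    return result.text
```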
FastMCP has a fairly straightforward client integration: https://gofastmcp.com/clients/sampling
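On the client side, registering a sampling handler is along these lines (a minimal sketch adapted from the FastMCP docs linked above; the placeholder response stands in for whatever local model the client owns):

```python
from fastmcp import Client
from fastmcp.client.sampling import (
    SamplingMessage,
    SamplingParams,
    RequestContext,
)

async def sampling_handler(
    messages: list[SamplingMessage],
    params: SamplingParams,
    context: RequestContext,
) -> str:
    # An optional HIL step could go here: show the server's prompt to the
    # user and ask for approval before spending any local inference.
    prompt = "\n".join(
        m.content.text for m in messages if hasattr(m.content, "text")
    )
    # Plug in whatever LLM the client owns (Ollama, llama.cpp, an API, ...).
    # Returning a plain string is enough; FastMCP wraps it as the result.
    return f"(local LLM response to: {prompt[:80]})"

client = Client("my_mcp_server.py", sampling_handler=sampling_handler)
```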
The handler could include a human-in-the-loop (HIL) step, even though the sampled tokens are not injected into the client chat.
It might "cost you inference/money", but there is no token injection risk.
If you are curious about this feature, I'd be happy to provide a patch.
Best!
Joe
PS: Do you accept donations?