README.md: 5 additions & 5 deletions
@@ -1,6 +1,6 @@
<h1 align="center">Model Context Shell</h1>
-<p align="center"><b>Unix-style pipelines for MCP tools - compose complex tool workflows as single pipeline requests</b></p>
+<p align="center"><b>Unix-style pipelines for MCP tools - compose complex tool workflows as a single tool call</b></p>
<p align="center">
<a href="#introduction">Introduction</a> ·
@@ -114,7 +114,7 @@ The agent constructs pipelines as JSON arrays of stages. Data flows from one sta
Any tool stage can set `"for_each": true` to process items one-by-one. The preceding stage must output JSONL (one JSON object per line), and the tool is called once per line. Results are collected into an array. So "fetch a list of URLs, then fetch each one" is a single pipeline call, using a single reused connection.
-Full example - fetch users, extract their profile URLs, fetch each profile, filter for active users:
+Full example - fetch users, extract their profile URLs, fetch each profile, filter for active users:
```json
[
@@ -129,7 +129,7 @@ Full example -fetch users, extract their profile URLs, fetch each profile, filte
### Prerequisites
-- [ToolHive](https://stacklok.com/download/) (`thv`) - a runtime for managing MCP servers
+- [ToolHive](https://stacklok.com/download/) (`thv`) - a runtime for managing MCP servers
### Quick start
@@ -227,13 +227,13 @@ uv run pyright
## Specification
-For now, this project serves as a living specification - the implementation _is_ the spec. A more formal specification may be extracted later.
+For now, this project serves as a living specification - the implementation _is_ the spec. A more formal specification may be extracted later.
**Execution model.** The current execution model is a scriptable map-reduce pipeline. Stages run sequentially, with `for_each` providing the map step over tool calls. This could be extended with a more generic mini-interpreter, but it probably shouldn't grow into a full programming language. Past a certain complexity, it makes more sense for agents to write code directly, or combine written code with the shell approach. That said, built-in access to tools like `jq` and `awk` already makes the pipeline model pretty capable for most data transformation tasks.
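The sequential map-reduce model described above can be sketched as a minimal runner. The stage shape and the `call_tool` hook here are hypothetical illustrations, not the project's actual API:

```python
import json

def run_pipeline(stages, call_tool):
    """Run stages sequentially; a 'for_each' stage maps the tool call
    over each JSONL line of the previous stage's output (the map step)."""
    data = ""
    for stage in stages:
        if stage.get("for_each"):
            # One tool call per JSONL line; collect results into an array.
            items = [call_tool(stage["tool"], line)
                     for line in data.splitlines() if line.strip()]
            data = json.dumps(items)
        else:
            data = call_tool(stage["tool"], data)
    return data
```

The real implementation adds the built-in text-processing stages (`jq`, `awk`, and friends) alongside tool stages, but the control flow is this same fold over stages.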
**Pipeline schema.** The pipeline stages are defined as typed Pydantic models in [`models.py`](https://github.com/StacklokLabs/model-context-shell/blob/main/models.py). FastMCP generates a discriminated-union JSON Schema from these models, so MCP clients can validate pipelines before sending them.
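As a toy illustration of that pattern, a Pydantic discriminated union over two stage kinds. The field names here are invented for the sketch; the real definitions live in `models.py`:

```python
from typing import Annotated, Literal, Union
from pydantic import BaseModel, Field, TypeAdapter

# Hypothetical stage models - not the actual schema in models.py.
class ToolStage(BaseModel):
    kind: Literal["tool"] = "tool"
    tool: str
    for_each: bool = False

class JqStage(BaseModel):
    kind: Literal["jq"] = "jq"
    program: str

# 'kind' is the discriminator: validation dispatches to the right model,
# and the generated JSON Schema is a tagged union clients can check.
Stage = Annotated[Union[ToolStage, JqStage], Field(discriminator="kind")]
Pipeline = TypeAdapter(list[Stage])

stages = Pipeline.validate_python([
    {"kind": "tool", "tool": "fetch", "for_each": True},
    {"kind": "jq", "program": ".[] | select(.active)"},
])
```

Because the union is discriminated, a malformed stage fails validation with an error pointing at the specific stage kind, rather than a wall of "did not match any variant" messages.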
-**ToolHive and security.** The reliance on ToolHive and container isolation is a practical choice - it was the simplest way to get a working, secure system. ToolHive handles tool discovery, container management, and networking, which lets this project focus on the pipeline execution model itself. A different deployment model could be used without changing the core concept.
+**ToolHive and security.** The reliance on ToolHive and container isolation is a practical choice - it was the simplest way to get a working, secure system. ToolHive handles tool discovery, container management, and networking, which lets this project focus on the pipeline execution model itself. A different deployment model could be used without changing the core concept.