
Commit 620f797

docs: reorganize README with clearer usage options
- Restructure Hands-On Usage section into three clear options:
  - Option 1: Manual Copy-Paste (No Tooling Required)
  - Option 2: Native Slash Commands (Recommended)
  - Option 3: MCP Server (Advanced)
- Add git URL examples for running via uvx --from
- Improve organization from simplest to most automated
- Add PyPI notes for future simplified syntax

Provides a clearer progression for users choosing their preferred workflow integration method.
1 parent 21cd658 commit 620f797


README.md

Lines changed: 35 additions & 6 deletions
@@ -115,21 +115,40 @@ sequenceDiagram
 - **Status Keys:** `[ ]` not started, `[~]` in progress, `[x]` complete, mirroring the manage-tasks guidance.
 - **Proof Artifacts:** URLs, CLI commands, screenshots, or tests captured per task to demonstrate working software.
 
-## Hands-On Usage (No MCP Required)
+## Hands-On Usage
+
+The SDD workflow can be used in three ways, from simplest to most automated:
+
+### Option 1: Manual Copy-Paste (No Tooling Required)
 
 1. **Kick off a spec:** Copy or reference `prompts/generate-spec.md` inside your preferred AI chat. Provide the feature idea, answer the clarifying questions, and review the generated spec before saving it under `/tasks`.
 2. **Plan the work:** Point the assistant to the new spec and walk through `prompts/generate-task-list-from-spec.md`. Approve parent tasks first, then request the detailed subtasks and relevant files. Commit the result to `/tasks`.
 3. **Execute with discipline:** Follow `prompts/manage-tasks.md` while implementing. Update statuses as you work, attach proof artifacts, and pause for reviews at each demoable slice.
 
-### Slash Command Integration (TBD)
+### Option 2: Native Slash Commands (Recommended)
+
+Generate slash commands for your AI coding assistant and use the prompts as native commands:
+
+```bash
+# Clone and install locally
+git clone https://github.com/liatrio-labs/spec-driven-workflow-mcp.git
+cd spec-driven-workflow-mcp
+uv sync
+uv run sdd-generate-commands --yes
+
+# Or run directly from the git repo via uvx
+uvx --from git+https://github.com/liatrio-labs/spec-driven-workflow-mcp sdd-generate-commands --yes
+```
+
+This will auto-detect your configured AI assistants (Claude Code, Cursor, Windsurf, etc.) and generate command files in your home directory.
 
-Guides are coming for wiring these prompts as first-class slash commands in popular IDEs and AI tools (Windsurf, VS Code, Cursor, Claude Code, Codex, and more).
+**Note**: Once available on PyPI, you'll be able to run `uvx spec-driven-development-mcp sdd-generate-commands --yes` for a one-liner installation.
 
-See [docs/slash-command-generator.md](./docs/slash-command-generator.md) for details on generating command files for AI assistants.
+See [docs/slash-command-generator.md](./docs/slash-command-generator.md) for details.
 
-## Optional: Automate with the MCP Server
+### Option 3: MCP Server (Advanced)
 
-Prefer tighter tooling? This repository also ships an MCP server that exposes the same prompts programmatically. Treat it as an accelerator—everything above works without it.
+Run the prompts as an MCP server for programmatic access. This option is most useful for custom integrations and tools that support MCP.
 
 > Note: MCP prompt support is not uniformly supported across AI tools. See [docs/mcp-prompt-support.md](./docs/mcp-prompt-support.md) for details.
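As a quick sanity check after the generator runs, here is a minimal sketch of inspecting its output. The directory and file name below are assumptions (Claude Code reads personal slash commands from `~/.claude/commands/`); other assistants use their own locations.

```bash
# Assumed output location for Claude Code; adjust for your assistant.
ls ~/.claude/commands/
# File name is illustrative, not guaranteed by the generator.
cat ~/.claude/commands/generate-spec.md
```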
@@ -155,7 +174,11 @@ uv sync
 **STDIO (local development):**
 
 ```bash
+# From local clone
 uvx fastmcp run server.py
+
+# Or run directly from the git repo via uvx
+uvx --from git+https://github.com/liatrio-labs/spec-driven-workflow-mcp spec-driven-development-mcp
 ```
 
 **With MCP Inspector:**
@@ -167,9 +190,15 @@ uvx fastmcp dev server.py
 **HTTP Transport:**
 
 ```bash
+# Use fastmcp CLI for HTTP transport
 uvx fastmcp run server.py --transport http --port 8000
+
+# Or run directly from the git repo via uvx
+uvx --from git+https://github.com/liatrio-labs/spec-driven-workflow-mcp spec-driven-development-mcp --transport http --port 8000
 ```
 
+**Note**: Once available on PyPI, you'll be able to run `uvx spec-driven-development-mcp` for a one-liner installation with optional `--transport` and `--port` arguments. The `fastmcp run` approach remains available for development and advanced options.
+
 See [docs/operations.md](docs/operations.md) and [CONTRIBUTING.md](CONTRIBUTING.md) for advanced configuration, deployment, and contribution guidelines.
 
 ## References
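To make Option 3 concrete, here is a hedged sketch of wiring the server into an MCP client, using Claude Code's CLI as the example. The `claude mcp add` subcommand shape, its flags, and the `/mcp` URL path are assumptions based on common MCP client setups, not details taken from this repository.

```bash
# STDIO: let the client spawn the server on demand via uvx
# (command shape is an assumption; check your client's MCP documentation)
claude mcp add spec-driven-development -- \
  uvx --from git+https://github.com/liatrio-labs/spec-driven-workflow-mcp spec-driven-development-mcp

# HTTP: point the client at an already-running server from the block above
# (the /mcp path is the assumed default for the HTTP transport)
claude mcp add --transport http spec-driven-development http://localhost:8000/mcp
```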
