
ai-developer-guide-mcp 0.1.6

Install from the command line:
$ npm install @dwmkerr/ai-developer-guide-mcp@0.1.6
Install via package.json:
"@dwmkerr/ai-developer-guide-mcp": "0.1.6"


AI Developer Guide MCP Server

A simple MCP server that connects your LLM to the developer guide:

┌──────────┐  GitHub Action  ┌─────────────────────┐
│   Repo   │────────────────▶│ Developer Guide     │
└──────────┘                 │ (GH Pages)          │
                             └─────────────────────┘
                                        ▲
                                        │ JSON APIs
┌──────────┐       MCP       ┌─────────────────────┐
│   LLM    │────────────────▶│ AI Dev Guide        │
└──────────┘                 │ MCP Server          │
                             └─────────────────────┘
  • The developer guide is exposed as a set of JSON files on a server
  • In this case, via GitHub pages
  • These files can be loaded via HTTP GET, which makes them simple APIs
  • The MCP server calls these APIs to fetch the guide content (see the example request below)
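For example (an illustrative sketch only; the host and file path here are placeholders, not the real guide URLs), fetching a guide file is just an HTTP GET of a static JSON document:

# Hypothetical example: guide content is plain static JSON, so any
# HTTP client can fetch it. Substitute your deployment's actual
# base URL and file path.
curl -s "https://your-domain.com/your-guide/guide.json"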

Quickstart

Use the following MCP configuration:

"ai-developer-guide": {
  "command": "npx",
  "args": ["-y", "@dwmkerr/ai-developer-guide-mcp"]
}

Try a prompt like:

"Read the ai developer guide and tell me what the guiding principles are"

Usage

Run make to see useful commands:

$ make

build:    Build the code for distribution
help:     Show help for each of the Makefile recipes
init:     Install dependencies
lint-fix: Lint and fix
lint:     Lint code
start:    Build the code for distribution
test:     Run unit tests and output coverage to artifacts/coverage

Other commands that might be helpful are:

# Start in live-reload mode.
npm run dev

# Run tests.
npm run test

# Check connectivity.
npm run start -- check

To connect to your locally running MCP server:

"ai-developer-guide-local": {
  "command": "node",
  "args": [
    "/Users/Dave_Kerr/repos/github/dwmkerr/ai-developer-guide/mcp/ai-developer-guide-mcp/dist/cli.js",
    "start"
  ],
  "cwd": "/Users/Dave_Kerr/repos/github/dwmkerr/ai-developer-guide/mcp/ai-developer-guide-mcp"
}

MCP Inspector

You can quickly test the MCP server using the MCP Inspector:

# Run the latest release of the MCP server:
npx @modelcontextprotocol/inspector -- npx @dwmkerr/ai-developer-guide-mcp
# open: http://127.0.0.1:6274

# Or run your locally built version:
npm run build
npx @modelcontextprotocol/inspector node ./dist/cli.js

MCP Inspector Screenshot

Logging

The server logs all activity to stderr (standard error) following MCP conventions.
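
For example, if you run the server directly you can capture its logs separately from the MCP protocol traffic on stdout (a minimal sketch; choose whatever log path suits you):

# Protocol messages flow over stdout; diagnostic logs go to stderr,
# so stderr can be redirected to a file without disturbing the MCP stream.
npx -y @dwmkerr/ai-developer-guide-mcp 2> mcp-server.log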

Available Tools

When connected to an LLM via MCP, the following tools are available:

  • fetch_main_guide - Get the core AI Developer Guide content
  • fetch_deep_dive - Get specialized guides (Python, Shell Scripts, Make, PostgreSQL, etc.)
  • list_available_guides - List all available deep dive topics
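
As a rough sketch of what a tool call looks like on the wire (the MCP Inspector above is the easier way to experiment; the category and topic argument names are assumed from the usage examples below), you can pipe raw JSON-RPC messages to the server over stdio:

# Illustrative smoke test only: send the MCP handshake, then call
# fetch_deep_dive, keeping stdin open briefly so the responses can
# be written to stdout.
(
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"fetch_deep_dive","arguments":{"category":"languages","topic":"python"}}}'
  sleep 3
) | npx -y @dwmkerr/ai-developer-guide-mcp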

Example LLM Interactions

Once connected to your LLM (like Claude in Cursor), you can ask questions like these:

Getting Started

"What are the main principles in the AI Developer Guide?"

The LLM will use fetch_main_guide to get the core development principles and Plan/Implement/Review approach.

Language-Specific Guidance

"Show me Python best practices for AI-assisted development"

"What are the shell scripting guidelines from the developer guide?"

The LLM will use fetch_deep_dive with the category languages and a topic such as python or shell-scripts.

Tool and Pattern Guidance

"How should I structure my Makefiles according to the guide?"

"What CI/CD practices does the guide recommend?"

The LLM will fetch guides for patterns/make or others/cicd.

Discovery and Exploration

"What deep dive guides are available?"

"List all the specialized guides you have access to"

The LLM will use list_available_guides to show all categories and topics.

Practical Scenarios

"I'm setting up a new Python project with PostgreSQL. What guidance does the developer guide provide?"

The LLM will fetch multiple guides (languages/python and platforms/postgresql) to give comprehensive advice.

"Help me review this shell script using the developer guide principles"

The LLM will get the main guide for review principles, then the shell scripts deep dive for specific best practices.

Configuration

You can point the server to your own AI Developer Guide deployment:

# Set the base url via env var:
export AI_DEVELOPER_GUIDE_URL="https://your-domain.com/your-guide"
ai-developer-guide-mcp start

# or via a CLI parameter:
ai-developer-guide-mcp start --base-url "https://your-domain.com/your-guide"

You can configure your MCP server using these parameters:

{
  "mcpServers": {
    "ai-developer-guide": {
      "command": "ai-developer-guide-mcp",
      "args": ["start", "--base-url", "https://your-domain.com/your-guide"]
    }
  }
}

API Requirements

The API structure should match the AI Developer Guide API format.
