# Document mcp-k6 and configuring AI assistants for k6 #2163
Status: Open. oleiade wants to merge 5 commits into `main` from `feat/mcp-k6`.
Changes from all commits (5):

- `d517c12`: docs: document mcp-k6 and configuring AI assistants
- `0c3eff7`: Apply suggestions from code review
- `99d6029`: refactor: apply pull request suggestions
- `3d16f7b`: docs: document installing mcp-k6 on debian and rpm
- `ff7837b`: fix: apply pull request suggestions
**`docs/sources/k6/next/set-up/configure-ai-assistant/_index.md`** (140 additions)

---
title: Configure your AI assistant
description: Connect mcp-k6 to your AI assistant or editor to get help writing, validating, and running k6 scripts.
weight: 110
---

# Configure your AI assistant

`mcp-k6` is an experimental [Model Context Protocol](https://modelcontextprotocol.io/) (MCP) server for k6.
Once connected to your AI assistant or MCP-compatible editor, it helps you write better k6 scripts faster and run them with confidence.

## What your assistant can do for you

With `mcp-k6`, your AI assistant can:

- **Write accurate scripts:** Create up-to-date scripts by referring to embedded k6 documentation and TypeScript definitions to reduce API hallucinations.
- **Validate scripts:** Catch syntax errors, missing imports, and missing `export default function` declarations before execution.
- **Run tests locally:** Execute scripts and review results without leaving your editor.
- **Generate scripts:** Create tests from requirements using guided prompts that follow k6 best practices.
- **Convert browser tests:** Transform Playwright tests into k6 browser scripts while preserving test logic.
- **Automate provisioning:** Discover Terraform resources in your project to automate Grafana Cloud k6 setup.

## Install mcp-k6

Choose one of the following installation methods.

### Docker (recommended)

Pull the image and verify it runs:

```sh
docker pull grafana/mcp-k6:latest
docker run --rm grafana/mcp-k6 --version
```

### Homebrew (macOS)

{{< admonition type="note" >}}
If you run `mcp-k6` natively, you must also have k6 installed and available in your PATH.
{{< /admonition >}}

```sh
brew tap grafana/grafana
brew install mcp-k6
mcp-k6 --version
```

### Linux packages (deb/rpm)

Install `mcp-k6` from the `.deb` or `.rpm` packages published on the [mcp-k6 GitHub releases](https://github.com/grafana/mcp-k6/releases).

1. Open the releases page and select a version.
1. Download the package that matches your Linux distribution and CPU architecture.

You can check your CPU architecture with:

```sh
uname -m
```

Use the following mapping to pick the right asset:

| `uname -m` | Debian/Ubuntu asset | Fedora/RHEL asset |
| --- | --- | --- |
| `x86_64` | `amd64` (`.deb`) | `x86_64` (`.rpm`) |
| `aarch64` | `arm64` (`.deb`) | `aarch64` (`.rpm`) |

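If you script your downloads, the mapping above can be expressed as a small shell helper. This is a sketch for illustration only; the `deb_arch` and `rpm_arch` function names are not part of any k6 tooling:

```sh
# Map the output of `uname -m` to the architecture names used in release assets.
deb_arch() {
  case "$1" in
    x86_64) echo amd64 ;;
    aarch64) echo arm64 ;;
    *) echo "unsupported architecture: $1" >&2; return 1 ;;
  esac
}

rpm_arch() {
  case "$1" in
    x86_64 | aarch64) echo "$1" ;;
    *) echo "unsupported architecture: $1" >&2; return 1 ;;
  esac
}

# Print the .deb architecture for the current machine.
deb_arch "$(uname -m)"
```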
#### Debian/Ubuntu (`.deb`)

If you have the GitHub CLI (`gh`) installed, you can download a specific release asset from the terminal:

```sh
MCP_K6_VERSION="vX.Y.Z"

# For amd64/x86_64:
gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*amd64*.deb"

# For arm64/aarch64:
# gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*arm64*.deb"

sudo apt install ./mcp-k6_*.deb
mcp-k6 --version
```

If you downloaded the `.deb` in your browser, run `apt` from the directory where you saved it:

```sh
sudo apt install ./mcp-k6_*.deb
mcp-k6 --version
```

#### Fedora/RHEL (`.rpm`)

```sh
MCP_K6_VERSION="vX.Y.Z"

# For x86_64:
gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*x86_64*.rpm"

# For aarch64:
# gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*aarch64*.rpm"

sudo dnf install ./mcp-k6-*.rpm
mcp-k6 --version
```

If your distro uses `yum` instead of `dnf`, run:

```sh
sudo yum install ./mcp-k6-*.rpm
```

### Build from source

Clone the repository and install with `make`:

```sh
git clone https://github.com/grafana/mcp-k6
cd mcp-k6
make install
```

## Troubleshooting

If your AI assistant cannot connect to the server:

- **Check the logs:** Most editors (like Cursor or VS Code) have an "MCP Output" or "Logs" tab. Check there for "command not found" errors.
- **Verify PATH:** If running natively, run `which k6` in your terminal to ensure k6 is globally accessible.
- **Docker permissions:** Ensure the Docker daemon is running and that your user has permission to execute `docker run`.
- **Use MCP Inspector:** Use the [MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) to debug the connection independently of your editor.

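A quick way to run the PATH checks from a terminal is a small helper like the following. This is an illustrative sketch; the `check_cmd` helper name is not part of k6 or mcp-k6:

```sh
# Report whether a command is available on PATH.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: not found on PATH"
  fi
}

# For a native setup, both k6 and mcp-k6 should be found.
check_cmd k6
check_cmd mcp-k6

# For a Docker setup, docker should be found (and the daemon running).
check_cmd docker
```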
### Configure your editor

After you install `mcp-k6`, refer to [Configure MCP clients](./configure-mcp-clients/) to register the server with your editor and establish a connection.

## Next steps

- Learn about available tools, prompts, and resources: [Tools, prompts, and resources](./tools-prompts-resources/)

**`docs/sources/k6/next/set-up/configure-ai-assistant/configure-mcp-clients.md`** (166 additions)

---
title: Configure MCP clients
description: Configure VS Code, Cursor, Claude Code, Codex, and other MCP clients to launch mcp-k6 over stdio.
weight: 100
---

# Configure MCP clients

`mcp-k6` communicates over **stdio** (stdin/stdout). Your MCP client registers `mcp-k6` (or `docker run ...`) as a subprocess to establish a connection.

## Prerequisites

- If you run `mcp-k6` **natively**, ensure `mcp-k6` and `k6` are available on your `PATH`.
- If you run `mcp-k6` **in Docker**, ensure Docker is installed and running.

## VS Code

VS Code supports MCP servers through the GitHub Copilot extension. To use `mcp-k6` tools, you must use **Copilot Edits** (agent mode), which allows the assistant to call k6 commands and read test results.

1. Open your user or workspace settings JSON (`settings.json`).
2. Add the MCP server configuration.

### Docker

```json
{
  "mcp": {
    "servers": {
      "k6": {
        "command": "docker",
        "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
      }
    }
  }
}
```

### Native

```json
{
  "mcp": {
    "servers": {
      "k6": {
        "command": "mcp-k6"
      }
    }
  }
}
```

## Cursor

Cursor reads MCP server definitions from your configuration. Add an entry to register `mcp-k6` as a local MCP server using the stdio transport.

Create or update your global configuration file (`~/.cursor/mcp.json`) or your project-specific file (`.cursor/mcp.json`):

### Docker

```json
{
  "mcpServers": {
    "k6": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
    }
  }
}
```

### Native

```json
{
  "mcpServers": {
    "k6": {
      "command": "mcp-k6"
    }
  }
}
```

Restart Cursor or reload MCP servers, then verify the connection by invoking the `info` tool from chat.

## Claude Code (CLI)

Add `mcp-k6` to Claude Code using the `claude mcp add` command.

### Docker

```sh
claude mcp add --scope=user --transport=stdio k6 docker run --rm -i grafana/mcp-k6:latest
```

### Native

```sh
claude mcp add --scope=user --transport=stdio k6 mcp-k6
```

Use `--scope=local` to add the configuration to the current project instead of globally.

Reload the workspace after adding the server.

## Codex

Codex CLI supports MCP servers over stdio.

1. Locate your Codex configuration file. If you are unsure of the location, run `codex help config` to find the file path.
1. Add the MCP server configuration under the `mcpServers` key.

### Docker

```json
{
  "mcpServers": {
    "k6": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
    }
  }
}
```

### Native

```json
{
  "mcpServers": {
    "k6": {
      "command": "mcp-k6"
    }
  }
}
```

Restart Codex or reload its configuration to activate the server.

## Other MCP clients

If your MCP client is not in the previous list, you can use `mcp-k6` with any client that supports stdio-based MCP servers.

### How MCP works

The Model Context Protocol (MCP) is a standard way for AI assistants to communicate with external tools. At its core, an MCP server is a program that:

1. Reads JSON-RPC messages from **stdin**.
1. Writes JSON-RPC responses to **stdout**.
1. Advertises available tools, resources, and prompts that the AI can use.

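You can see this wire format by constructing the first two messages a client typically sends. The exact `protocolVersion` and capability fields shown here are illustrative and depend on the MCP revision your client targets:

```sh
# The MCP handshake request, followed by a request to list the server's tools.
# Each message is a single line of JSON-RPC.
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"example","version":"0.1.0"}}}'
LIST='{"jsonrpc":"2.0","id":2,"method":"tools/list"}'

# With mcp-k6 on your PATH, you could pipe the messages in and read the
# JSON-RPC responses on stdout:
# printf '%s\n' "$INIT" "$LIST" | mcp-k6

# Print the newline-delimited messages as the server would receive them.
printf '%s\n' "$INIT" "$LIST"
```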
When you configure an MCP client, you define the command for the client to launch and communicate with over stdio: for native installs, the `mcp-k6` binary; for Docker, a `docker run` invocation. The client then sends requests to list tools or execute functions, and the server returns the results.

### What you need to configure

Most MCP clients require the following information:

| Field | Description |
|-------|-------------|
| **name** | An identifier for the server (for example, `k6`). |
| **command** | The program to run. For native installs, this is `mcp-k6`. For Docker, this is `docker`. |
| **args** | Command-line arguments. For Docker, include `run`, `--rm`, `-i`, and the image name. |
| **env** | Optional environment variables to pass to the server. |
| **transport** | Usually `stdio` (some clients assume this by default). |

Consult your client's documentation for the exact configuration file location and format. Look for sections about "MCP servers", "tool servers", or "stdio servers".
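Putting those fields together, a generic configuration might look like the following. Field names vary by client; this sketch mirrors the `mcpServers` shape used in the examples above:

```json
{
  "mcpServers": {
    "k6": {
      "transport": "stdio",
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"],
      "env": {}
    }
  }
}
```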