6 changes: 6 additions & 0 deletions docs/sources/k6/next/reference/integrations.md
@@ -13,6 +13,12 @@ Codeless tools to speed up the test creation.

- [Test Builder](https://grafana.com/docs/k6/<K6_VERSION>/using-k6/test-authoring/test-builder) - Inspired by the Postman API Builder. Codeless UI tool to generate a k6 test quickly.

## AI assistants (MCP)

MCP (Model Context Protocol) integrations help you author and run k6 scripts from MCP-compatible editors and assistants.

- To validate and run scripts, browse documentation, and generate tests using an AI assistant, refer to [Configure your AI assistant](https://grafana.com/docs/k6/<K6_VERSION>/set-up/configure-ai-assistant/).

## IDE extensions

Code k6 scripts in your IDE of choice. Empower your development workflow with IDE extensions.
140 changes: 140 additions & 0 deletions docs/sources/k6/next/set-up/configure-ai-assistant/_index.md
@@ -0,0 +1,140 @@
---
title: Configure your AI assistant
description: Connect mcp-k6 to your AI assistant or editor to get help writing, validating, and running k6 scripts.
weight: 110
---

# Configure your AI assistant

`mcp-k6` is an experimental [Model Context Protocol](https://modelcontextprotocol.io/) (MCP) server for k6.
Once connected to your AI assistant or MCP-compatible editor, it helps you write better k6 scripts faster and run them with confidence.

## What your assistant can do for you

With `mcp-k6`, your AI assistant can:

- **Write accurate scripts:** Create up-to-date scripts by referring to embedded k6 documentation and TypeScript definitions, which reduces API hallucinations.
- **Validate scripts:** Catch syntax errors, missing imports, and a missing `export default function` declaration before execution (see the minimal script after this list).
- **Run tests locally:** Execute scripts and review results without leaving your editor.
- **Generate scripts:** Create tests from requirements using guided prompts that follow k6 best practices.
- **Convert browser tests:** Transform Playwright tests into k6 browser scripts while preserving test logic.
- **Automate provisioning:** Discover Terraform resources in your project to automate Grafana Cloud k6 setup.
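
For reference, here is a minimal valid k6 script of the kind those checks expect: imports from `k6/*` modules and a default exported function as the entry point. The target URL is only a placeholder.

```javascript
import http from 'k6/http';
import { check, sleep } from 'k6';

// A tiny load profile: one virtual user for ten seconds.
export const options = { vus: 1, duration: '10s' };

// k6 requires a default exported function as the test entry point.
export default function () {
  const res = http.get('https://test.k6.io'); // placeholder target
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```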

## Install mcp-k6

Choose one of the following installation methods.

### Docker (recommended)

Pull the image and verify that it runs:

```sh
docker pull grafana/mcp-k6:latest
docker run --rm grafana/mcp-k6 --version
```

### Homebrew (macOS)

{{< admonition type="note" >}}
If you run `mcp-k6` natively, you must also have k6 installed and available in your `PATH`.
{{< /admonition >}}

```sh
brew tap grafana/grafana
brew install mcp-k6
mcp-k6 --version
```
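
If k6 isn't installed yet, you can typically install it with Homebrew as well and confirm both binaries resolve on your `PATH`:

```sh
brew install k6
k6 version
```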

### Linux packages (deb/rpm)

Install `mcp-k6` from the `.deb` or `.rpm` packages published on the [mcp-k6 GitHub releases](https://github.com/grafana/mcp-k6/releases).

1. Open the releases page and select a version.
1. Download the package that matches your Linux distribution and CPU architecture.

You can check your CPU architecture with:

```sh
uname -m
```

Use the following mapping to pick the right asset:

| `uname -m` | Debian/Ubuntu asset | Fedora/RHEL asset |
| --- | --- | --- |
| `x86_64` | `amd64` (`.deb`) | `x86_64` (`.rpm`) |
| `aarch64` | `arm64` (`.deb`) | `aarch64` (`.rpm`) |

#### Debian/Ubuntu (`.deb`)

If you have the GitHub CLI (`gh`) installed, you can download a specific release asset from the terminal:

```sh
MCP_K6_VERSION="vX.Y.Z"

# For amd64/x86_64:
gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*amd64*.deb"

# For arm64/aarch64:
# gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*arm64*.deb"

sudo apt install ./mcp-k6_*.deb
mcp-k6 --version
```

If you downloaded the `.deb` in your browser, run `apt` from the directory where you saved it:

```sh
sudo apt install ./mcp-k6_*.deb
mcp-k6 --version
```

#### Fedora/RHEL (`.rpm`)

With the GitHub CLI installed, download and install the matching `.rpm` asset:

```sh
MCP_K6_VERSION="vX.Y.Z"

# For x86_64:
gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*x86_64*.rpm"

# For aarch64:
# gh release download "$MCP_K6_VERSION" --repo grafana/mcp-k6 --pattern "*linux*aarch64*.rpm"

sudo dnf install ./mcp-k6-*.rpm
mcp-k6 --version
```

If your distribution uses `yum` instead of `dnf`, run:

```sh
sudo yum install ./mcp-k6-*.rpm
```

### Build from source

Clone and install with `make`:

```sh
git clone https://github.com/grafana/mcp-k6
cd mcp-k6
make install
```
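
Assuming `make install` puts the binary somewhere on your `PATH` (for Go projects this is commonly `$HOME/go/bin`), verify the result:

```sh
mcp-k6 --version
```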

## Troubleshooting

If your AI assistant cannot connect to the server:

- **Check the logs:** Most editors (such as Cursor or VS Code) have an "MCP Output" or "Logs" panel. Check there for "command not found" errors.
- **Verify your PATH:** If you run the server natively, run `which k6` in your terminal to ensure k6 resolves on your `PATH`.
- **Docker permissions:** Ensure the Docker daemon is running and that your user has permission to execute `docker run`.
- **Use MCP Inspector:** Use the [MCP Inspector](https://modelcontextprotocol.io/docs/tools/inspector) to debug the connection independently of your editor, as shown in the example after this list.
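
A minimal sketch of that, assuming Node.js is available (the Inspector launches the server command you pass it and drives it over stdio):

```sh
# Inspect the natively installed server
npx @modelcontextprotocol/inspector mcp-k6

# Or inspect the Docker image instead
npx @modelcontextprotocol/inspector docker run --rm -i grafana/mcp-k6:latest
```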

## Configure your editor

After you install `mcp-k6`, refer to [Configure MCP clients](./configure-mcp-clients/) to register the server with your editor and establish a connection.

## Next steps

- Learn about available tools, prompts, and resources: [Tools, prompts, and resources](./tools-prompts-resources/)
@@ -0,0 +1,166 @@
---
title: Configure MCP clients
description: Configure VS Code, Cursor, Claude Code, Codex, and other MCP clients to launch mcp-k6 over stdio.
weight: 100
---

# Configure MCP clients

`mcp-k6` communicates over **stdio** (stdin/stdout). Your MCP client launches `mcp-k6` (or `docker run ...`) as a subprocess and exchanges messages with it over that channel.

## Prerequisites

- If you run `mcp-k6` **natively**, ensure `mcp-k6` and `k6` are available on your `PATH`.
- If you run `mcp-k6` **in Docker**, ensure Docker is installed and running.
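
A quick way to confirm those prerequisites from a terminal (use the lines that match your setup):

```sh
# Native install: both binaries must resolve on your PATH
which mcp-k6
k6 version

# Docker install: the daemon must be reachable
docker info
```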

## VS Code

VS Code supports MCP servers through the GitHub Copilot extension. To use `mcp-k6` tools, you must use **Copilot Edits** (agent mode), which allows the assistant to call k6 commands and read test results.

1. Open your user or workspace settings JSON (`settings.json`).
2. Add the MCP server configuration.

### Docker

```json
{
  "mcp": {
    "servers": {
      "k6": {
        "command": "docker",
        "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
      }
    }
  }
}
```

### Native

```json
{
  "mcp": {
    "servers": {
      "k6": {
        "command": "mcp-k6"
      }
    }
  }
}
```

## Cursor

Cursor reads MCP server definitions from its configuration files. Add an entry to register `mcp-k6` as a local MCP server using the stdio transport.

Create or update your global configuration file (`~/.cursor/mcp.json`) or your project-specific file (`.cursor/mcp.json`):

### Docker

```json
{
  "mcpServers": {
    "k6": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
    }
  }
}
```

### Native

```json
{
  "mcpServers": {
    "k6": {
      "command": "mcp-k6"
    }
  }
}
```

Restart Cursor or reload MCP servers, then verify the connection by invoking the `info` tool from chat.

## Claude Code (CLI)

Add `mcp-k6` to Claude Code using the `claude mcp add` command.

### Docker

```sh
claude mcp add --scope=user --transport=stdio k6 docker run --rm -i grafana/mcp-k6:latest
```

### Native

```sh
claude mcp add --scope=user --transport=stdio k6 mcp-k6
```

Use `--scope=local` to add the configuration to the current project instead of globally.

Reload the workspace after adding the server.
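
To confirm that the server is registered, you can list configured servers (assuming a reasonably recent Claude Code release):

```sh
claude mcp list
```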

## Codex

Codex CLI supports MCP servers over stdio.

1. Locate your Codex configuration file.
   If you are unsure of the location, run `codex help config` to find the file path.
1. Add the MCP server configuration under the `mcpServers` key.

### Docker

```json
{
  "mcpServers": {
    "k6": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"]
    }
  }
}
```

### Native

```json
{
  "mcpServers": {
    "k6": {
      "command": "mcp-k6"
    }
  }
}
```

Restart Codex or reload its configuration to activate the server.

## Other MCP clients

If your MCP client isn't listed above, you can still use `mcp-k6` with any client that supports stdio-based MCP servers.

### How MCP works

The Model Context Protocol (MCP) is a standard way for AI assistants to communicate with external tools. At its core, an MCP server is a program that:

1. Reads JSON-RPC messages from **stdin**.
1. Writes JSON-RPC responses to **stdout**.
1. Advertises available tools, resources, and prompts that the AI can use.

When you configure an MCP client, you give it a command to launch, in this case the MCP server binary, and the client communicates with that process over stdio. The client sends requests to list tools or execute them, and the server returns the results.
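
A rough sketch of that exchange, assuming the JSON-RPC 2.0 framing defined by the MCP specification (the tool name and schema here are illustrative, not the server's actual output). The client asks for the tool catalog:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

The server replies with the tools it advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "validate_script",
        "description": "Check a k6 script for syntax and import errors",
        "inputSchema": { "type": "object", "properties": { "script": { "type": "string" } } }
      }
    ]
  }
}
```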

### What you need to configure

Most MCP clients require the following information:

| Field | Description |
|-------|-------------|
| **name** | An identifier for the server (for example, `k6`). |
| **command** | The program to run. For native installs, this is `mcp-k6`. For Docker, this is `docker`. |
| **args** | Command-line arguments. For Docker, include `run`, `--rm`, `-i`, and the image name. |
| **env** | Optional environment variables to pass to the server. |
| **transport** | Usually `stdio` (some clients assume this by default). |
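
Putting those fields together, a generic stdio configuration in the common `mcpServers` shape (your client's exact key names may differ) looks like this:

```json
{
  "mcpServers": {
    "k6": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "grafana/mcp-k6:latest"],
      "env": {}
    }
  }
}
```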

Consult your client's documentation for the exact configuration file location and format. Look for sections about "MCP servers", "tool servers", or "stdio servers".