
Commit 0b82ac4

Sketch out MCP
1 parent 75eeed1 commit 0b82ac4

8 files changed: +15574 -263 lines changed

README.md

Lines changed: 41 additions & 10 deletions
@@ -80,21 +80,52 @@ steps:
       cat "${{ steps.inference.outputs.response-file }}"
 ```
 
+### GitHub MCP Integration (Model Context Protocol)
+
+This action now supports integration with the GitHub-hosted Model Context
+Protocol (MCP) server, which provides access to GitHub tools like repository
+management, issue tracking, and pull request operations.
+
+```yaml
+steps:
+  - name: AI Inference with GitHub Tools
+    id: inference
+    uses: actions/ai-inference@v1
+    with:
+      prompt: 'List my open pull requests and create a summary'
+      enable-mcp: true
+      mcp-server-url: 'https://github-mcp-server.fly.dev/mcp' # Optional, this is the default
+```
+
+When MCP is enabled, the AI model will have access to GitHub tools and can
+perform actions like:
+
+- Listing and managing repositories
+- Creating, reading, and updating issues
+- Managing pull requests
+- Searching code and repositories
+- And more GitHub operations
+
+**Note:** MCP integration requires appropriate GitHub permissions for the
+operations the AI will perform.
+
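The note above does not pin down which permissions are needed; that depends on the tools the model actually invokes. Purely as a hedged sketch, assuming the MCP tools operate with the workflow's `GITHUB_TOKEN`, a job that lets the model read pull requests and file issues might declare scopes like these (the exact set this action requires is not specified in this commit):

```yaml
on: workflow_dispatch

jobs:
  ai-summary:
    runs-on: ubuntu-latest
    permissions:
      models: read          # GitHub Models inference
      contents: read        # read repository contents
      issues: write         # create and update issues
      pull-requests: read   # list pull requests for the summary
    steps:
      - name: AI Inference with GitHub Tools
        id: inference
        uses: actions/ai-inference@v1
        with:
          prompt: 'List my open pull requests and create a summary'
          enable-mcp: true
```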
 ## Inputs
 
 Various inputs are defined in [`action.yml`](action.yml) to let you configure
 the action:
 
-| Name | Description | Default |
-| ---- | ----------- | ------- |
-| `token` | Token to use for inference. Typically the GITHUB_TOKEN secret | `github.token` |
-| `prompt` | The prompt to send to the model | N/A |
-| `prompt-file` | Path to a file containing the prompt. If both `prompt` and `prompt-file` are provided, `prompt-file` takes precedence | `""` |
-| `system-prompt` | The system prompt to send to the model | `"You are a helpful assistant"` |
-| `system-prompt-file` | Path to a file containing the system prompt. If both `system-prompt` and `system-prompt-file` are provided, `system-prompt-file` takes precedence | `""` |
-| `model` | The model to use for inference. Must be available in the [GitHub Models](https://github.com/marketplace?type=models) catalog | `gpt-4o` |
-| `endpoint` | The endpoint to use for inference. If you're running this as part of an org, you should probably use the org-specific Models endpoint | `https://models.github.ai/inference` |
-| `max-tokens` | The max number of tokens to generate | 200 |
+| Name | Description | Default |
+| ---- | ----------- | ------- |
+| `token` | Token to use for inference. Typically the GITHUB_TOKEN secret | `github.token` |
+| `prompt` | The prompt to send to the model | N/A |
+| `prompt-file` | Path to a file containing the prompt. If both `prompt` and `prompt-file` are provided, `prompt-file` takes precedence | `""` |
+| `system-prompt` | The system prompt to send to the model | `"You are a helpful assistant"` |
+| `system-prompt-file` | Path to a file containing the system prompt. If both `system-prompt` and `system-prompt-file` are provided, `system-prompt-file` takes precedence | `""` |
+| `model` | The model to use for inference. Must be available in the [GitHub Models](https://github.com/marketplace?type=models) catalog | `gpt-4o` |
+| `endpoint` | The endpoint to use for inference. If you're running this as part of an org, you should probably use the org-specific Models endpoint | `https://models.github.ai/inference` |
+| `max-tokens` | The max number of tokens to generate | 200 |
+| `enable-mcp` | Enable Model Context Protocol integration with GitHub tools | `false` |
+| `mcp-server-url` | URL of the MCP server to connect to for GitHub tools | `https://github-mcp-server.fly.dev/mcp` |
 
 ## Outputs
 
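The inputs table above reads more concretely next to a worked step. A minimal sketch using the file-based inputs together with an explicit model and token budget; the prompt file paths are illustrative and not part of this commit:

```yaml
steps:
  - name: AI Inference from prompt files
    id: inference
    uses: actions/ai-inference@v1
    with:
      prompt-file: .github/prompts/summary.md       # illustrative path; takes precedence over `prompt`
      system-prompt-file: .github/prompts/system.md # illustrative path; takes precedence over `system-prompt`
      model: gpt-4o                                  # must be in the GitHub Models catalog
      max-tokens: 500                                # default is 200

  - name: Print the response
    run: cat "${{ steps.inference.outputs.response-file }}"
```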
action.yml

Lines changed: 8 additions & 0 deletions
@@ -41,6 +41,14 @@ inputs:
     description: The token to use
     required: false
     default: ${{ github.token }}
+  enable-mcp:
+    description: Enable Model Context Protocol integration with GitHub tools
+    required: false
+    default: 'false'
+  mcp-server-url:
+    description: URL of the MCP server to connect to
+    required: false
+    default: 'https://github-mcp-server.fly.dev/mcp'
 
 # Define your outputs here.
 outputs:
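The two new `action.yml` inputs mirror the README table: `enable-mcp` defaults to the string `'false'`, so callers must flip it explicitly, and `mcp-server-url` only needs to be supplied when targeting a non-default server. A short sketch of both cases (the alternative URL is purely illustrative):

```yaml
# Default MCP server: only the flag needs to be set
- uses: actions/ai-inference@v1
  with:
    prompt: 'Summarize the open issues in this repository'
    enable-mcp: true          # omit or set to false to keep MCP disabled

# Custom MCP server: override the default from action.yml
- uses: actions/ai-inference@v1
  with:
    prompt: 'Summarize the open issues in this repository'
    enable-mcp: true
    mcp-server-url: 'https://mcp.example.com/mcp'   # illustrative URL, not a real endpoint
```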

0 commit comments
