Commit af2e177

restructure
1 parent 9e33d5e commit af2e177

1 file changed: +37 -36 lines

src/content/docs/agents/examples/build-mcp-server.mdx

@@ -13,44 +13,11 @@ import { Aside } from '@astrojs/starlight/components';
 
 [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) is an open standard that allows AI agents and assistants (like [Claude Desktop](https://claude.ai/download) or [Cursor](https://www.cursor.com/)) to interact with services directly. If you want users to access your service through an AI assistant, you can spin up an MCP server for your application.
 
-### Why use Cloudflare Workers for MCP?
+### Building an MCP Server on Workers
 
 With Cloudflare Workers and the [workers-mcp](https://github.com/cloudflare/workers-mcp/) package, you can turn any API or service into an MCP server with minimal setup. Just define your API methods as TypeScript functions, and workers-mcp takes care of tool discovery, protocol handling, and request routing. Once deployed, MCP clients like Claude can connect and interact with your service automatically.
 
-#### Example: Exposing a Weather API as an MCP server
-
-Here's a Cloudflare Worker that fetches weather data from an external API and exposes it as an MCP tool that Claude can call directly:
-
-```ts
-import { WorkerEntrypoint } from 'cloudflare:workers';
-import { ProxyToSelf } from 'workers-mcp';
-
-export default class WeatherWorker extends WorkerEntrypoint<Env> {
-  /**
-   * Get current weather for a location
-   * @param location {string} City name or zip code
-   * @return {object} Weather information
-   */
-  async getWeather(location: string) {
-    // Connect to a weather API
-    const response = await fetch(`https://api.weather.example/v1/${location}`);
-    const data = await response.json();
-    return {
-      temperature: data.temp,
-      conditions: data.conditions,
-      forecast: data.forecast
-    };
-  }
-
-  /**
-   * @ignore
-   */
-  async fetch(request: Request): Promise<Response> {
-    // ProxyToSelf handles MCP protocol compliance
-    return new ProxyToSelf(this).fetch(request);
-  }
-}
-```
+Below, we walk through an example of building an MCP server that fetches weather data from an external API and exposes it as an MCP tool that Claude can call directly.
 
 **How it works:**
 * **TypeScript methods as MCP tools:** Each public method in your class is exposed as an MCP tool that agents can call. In this example, `getWeather` is the tool that fetches data from an external weather API.
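To make the tool mapping concrete, here is a minimal sketch of the pattern (the `ExampleWorker` class and its `sayHello` and `add` methods are illustrative, not part of the committed example; `Env` is the project's generated bindings type, as in the weather example): every public, JSDoc-annotated method is discovered as its own MCP tool, while the `fetch` handler carries `@ignore` and simply proxies MCP traffic.

```ts
import { WorkerEntrypoint } from 'cloudflare:workers';
import { ProxyToSelf } from 'workers-mcp';

export default class ExampleWorker extends WorkerEntrypoint<Env> {
  /**
   * Greet a user by name
   * @param name {string} The name of the person to greet
   * @return {string} A friendly greeting
   */
  async sayHello(name: string) {
    // One public, JSDoc-annotated method = one MCP tool
    return `Hello from your Worker, ${name}!`;
  }

  /**
   * Add two numbers together
   * @param a {number} The first number
   * @param b {number} The second number
   * @return {number} The sum of a and b
   */
  async add(a: number, b: number) {
    // A second public method becomes a second, independent tool
    return a + b;
  }

  /**
   * @ignore
   */
  async fetch(request: Request): Promise<Response> {
    // ProxyToSelf handles MCP protocol compliance
    return new ProxyToSelf(this).fetch(request);
  }
}
```

An MCP client such as Claude would then see two separate tools, `sayHello` and `add`, each described by the parameter and return annotations in its JSDoc comment.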
@@ -138,11 +105,45 @@ This converts your Cloudflare Worker into an MCP server, enabling interactions w
 
 **Note:** Every public method that is annotated with JSDoc becomes an MCP tool that is discoverable by AI assistants.
 
+### Add data fetching to the MCP server
+
+When building an MCP server, you will often need to connect to a resource or an external API to take an action. To do this, use the `fetch` method to make direct API calls, as in the example below, which fetches the current weather:
+
+```ts
+import { WorkerEntrypoint } from 'cloudflare:workers';
+import { ProxyToSelf } from 'workers-mcp';
+
+export default class WeatherWorker extends WorkerEntrypoint<Env> {
+  /**
+   * Get current weather for a location
+   * @param location {string} City name or zip code
+   * @return {object} Weather information
+   */
+  async getWeather(location: string) {
+    // Connect to a weather API
+    const response = await fetch(`https://api.weather.example/v1/${location}`);
+    const data = await response.json();
+    return {
+      temperature: data.temp,
+      conditions: data.conditions,
+      forecast: data.forecast
+    };
+  }
+
+  /**
+   * @ignore
+   */
+  async fetch(request: Request): Promise<Response> {
+    // ProxyToSelf handles MCP protocol compliance
+    return new ProxyToSelf(this).fetch(request);
+  }
+}
+```
 
 ### Deploy the MCP server
 Update your wrangler.toml with the appropriate configuration, then deploy your Worker:
 ```bash
-wrangler deploy
+npx wrangler deploy
 ```
 
 Your MCP server is now deployed globally, and all of your public class methods are exposed as MCP tools that AI assistants can interact with.
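If the upstream API requires credentials, you would typically keep them out of the Worker code and read them from a binding instead. The sketch below shows one way the data-fetching example above might be hardened; the `WEATHER_API_KEY` secret (set with `npx wrangler secret put WEATHER_API_KEY`), the bearer-token header, and the response fields are all assumptions for illustration, not part of this commit.

```ts
import { WorkerEntrypoint } from 'cloudflare:workers';
import { ProxyToSelf } from 'workers-mcp';

// Hypothetical bindings for this sketch; a real project would generate Env with `wrangler types`
interface Env {
  WEATHER_API_KEY: string;
}

export default class WeatherWorker extends WorkerEntrypoint<Env> {
  /**
   * Get current weather for a location
   * @param location {string} City name or zip code
   * @return {object} Weather information or an error description
   */
  async getWeather(location: string) {
    // Same placeholder endpoint as above; substitute your weather provider
    const response = await fetch(
      `https://api.weather.example/v1/${encodeURIComponent(location)}`,
      { headers: { Authorization: `Bearer ${this.env.WEATHER_API_KEY}` } }
    );
    if (!response.ok) {
      // Return a structured error so the MCP client can report what went wrong
      return { error: `Weather API responded with status ${response.status}` };
    }
    const data = (await response.json()) as {
      temp: number;
      conditions: string;
      forecast: string;
    };
    return {
      temperature: data.temp,
      conditions: data.conditions,
      forecast: data.forecast
    };
  }

  /**
   * @ignore
   */
  async fetch(request: Request): Promise<Response> {
    // ProxyToSelf handles MCP protocol compliance
    return new ProxyToSelf(this).fetch(request);
  }
}
```

Returning a structured object on failure, rather than throwing, keeps the tool's result easy for an assistant to interpret; whether you prefer that or an exception is a design choice.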
