diff --git a/README.md b/README.md
index 8fae02ec..9d464b02 100644
--- a/README.md
+++ b/README.md
@@ -6,10 +6,12 @@
 Implementation of an MCP server for all [Apify Actors](https://apify.com/store).
 This server enables interaction with one or more Apify Actors that can be defined in the MCP Server configuration.
 
-The server can be used in several ways:
+The server can be used in two ways:
 
 - **🇦 [MCP Server Actor](https://apify.com/apify/actors-mcp-server)** – HTTP server accessible via Server-Sent Events (SSE).
 - **⾕ MCP Server Stdio** – Local server available via standard input/output (stdio).
-- **💬 [Tester MCP Client](https://apify.com/jiri.spilka/tester-mcp-client)** – Chat-like UI for interacting with the MCP server.
+
+
+You can also interact with the MCP server through a chat-like UI with the 💬 [Tester MCP Client](https://apify.com/jiri.spilka/tester-mcp-client).
 
 # 🎯 What does Apify MCP server do?
 
@@ -25,7 +27,7 @@ For example it can:
 To interact with the Apify MCP server, you can use MCP clients such as:
 
 - [Claude Desktop](https://claude.ai/download) (only Stdio support)
-- [LibreChat](https://www.librechat.ai/) (stdio and SSE support (yeah without Authorization header))
+- [LibreChat](https://www.librechat.ai/) (stdio and SSE support, yet without Authorization headers)
 - [Apify Tester MCP Client](https://apify.com/jiri.spilka/tester-mcp-client) (SSE support with Authorization headers)
 - other clients at [https://modelcontextprotocol.io/clients](https://modelcontextprotocol.io/clients)
 - more clients at [https://glama.ai/mcp/clients](https://glama.ai/mcp/clients)
@@ -43,8 +45,8 @@
 The following image shows how the Apify MCP server interacts with the Apify platform
 
 ![Actors-MCP-server](https://raw.githubusercontent.com/apify/actors-mcp-server/refs/heads/master/docs/actors-mcp-server.png)
 
-In the future, we plan to load Actors dynamically and provide Apify's dataset and key-value store as resources.
-See the [Roadmap](#-roadmap-march-2025) for more details.
+With the MCP Tester client, you can load Actors dynamically, but this is not yet supported by other MCP clients.
+We also plan to add more features; see the [Roadmap](#-roadmap-march-2025) for more details.
 
 # 🔄 What is the Model Context Protocol?
 
@@ -129,9 +131,9 @@ https://actors-mcp-server.apify.actor?token=
 It is also possible to start the MCP server with a different set of Actors.
 To do this, create a [task](https://docs.apify.com/platform/actors/running/tasks) and specify the list of Actors you want to use.
-Then, run task in Standby mode with the selected Actors using your Apify API token.
+Then, run the task in Standby mode with the selected Actors.
 
 ```shell
-https://actors-mcp-server-task.apify.actor?token=
+https://USERNAME--actors-mcp-server-task.apify.actor?token=
 ```
 
 You can find a list of all available Actors in the [Apify Store](https://apify.com/store).
 
@@ -141,9 +143,8 @@ You can find a list of all available Actors in the [Apify Store](https://apify.c
 
 Once the server is running, you can interact with Server-Sent Events (SSE) to send messages to the server and receive responses.
 The easiest way is to use [Tester MCP Client](https://apify.com/jiri.spilka/tester-mcp-client) on Apify.
-Other clients do not support SSE yet, but this will likely change.
-Please verify if MCP clients such as [Superinference.ai](https://superinterface.ai/) or [LibreChat](https://www.librechat.ai/) support SSE with custom headers.
-([Claude Desktop](https://claude.ai/download) does not support SSE transport yet, see [Claude Desktop Configuration](#claude-desktop) section for more details).
+Most MCP clients do not support SSE yet (as of March 2025), but this will likely change.
+[Claude Desktop](https://claude.ai/download) does not support SSE yet, but you can use it with Stdio transport; see [MCP Server at a local host](#-mcp-server-at-a-local-host) for more details.
 In the client settings you need to provide server configuration:
 
 ```json
diff --git a/src/main.ts b/src/main.ts
index 194aaa82..59e02fa8 100644
--- a/src/main.ts
+++ b/src/main.ts
@@ -33,6 +33,29 @@ let transport: SSEServerTransport;
 const HELP_MESSAGE = `Connect to the server with GET request to ${HOST}/sse?token=YOUR-APIFY-TOKEN`
     + ` and then send POST requests to ${HOST}/message?token=YOUR-APIFY-TOKEN`;
 
+const actorRun = Actor.isAtHome() ? {
+    id: process.env.ACTOR_RUN_ID,
+    actId: process.env.ACTOR_ID,
+    userId: process.env.APIFY_USER_ID,
+    startedAt: process.env.ACTOR_STARTED_AT,
+    finishedAt: null,
+    status: 'RUNNING',
+    meta: {
+        origin: process.env.APIFY_META_ORIGIN,
+    },
+    options: {
+        build: process.env.ACTOR_BUILD_NUMBER,
+        memoryMbytes: process.env.ACTOR_MEMORY_MBYTES,
+    },
+    buildId: process.env.ACTOR_BUILD_ID,
+    defaultKeyValueStoreId: process.env.ACTOR_DEFAULT_KEY_VALUE_STORE_ID,
+    defaultDatasetId: process.env.ACTOR_DEFAULT_DATASET_ID,
+    defaultRequestQueueId: process.env.ACTOR_DEFAULT_REQUEST_QUEUE_ID,
+    buildNumber: process.env.ACTOR_BUILD_NUMBER,
+    containerUrl: process.env.ACTOR_WEB_SERVER_URL,
+    standbyUrl: process.env.ACTOR_STANDBY_URL,
+} : {};
+
 /**
  * Process input parameters and update tools
  * If URL contains query parameter actors, add tools from actors, otherwise add tools from default actors
@@ -63,7 +86,7 @@ app.route(Routes.ROOT)
     try {
         log.info(`Received GET message at: ${Routes.ROOT}`);
         await processParamsAndUpdateTools(req.url);
-        res.status(200).json({ message: `Actor is using Model Context Protocol. ${HELP_MESSAGE}` }).end();
+        res.status(200).json({ message: `Actor is using Model Context Protocol. ${HELP_MESSAGE}`, data: actorRun }).end();
     } catch (error) {
         log.error(`Error in GET ${Routes.ROOT} ${error}`);
        res.status(500).json({ message: 'Internal Server Error' }).end();
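
The `main.ts` hunk above attaches run metadata to the `GET /` response as `data`, built from platform environment variables only when the Actor runs on Apify. As a rough standalone illustration of that env-to-metadata mapping (using a hypothetical `buildActorRun` helper and a trimmed `ActorRunInfo` type, not the Apify SDK's actual API), the behavior can be sketched as:

```typescript
// Sketch only: the real code inlines this logic and guards it with Actor.isAtHome().
interface ActorRunInfo {
    id?: string;
    actId?: string;
    status?: string;
    options?: { build?: string; memoryMbytes?: string };
}

function buildActorRun(
    isAtHome: boolean,
    env: Record<string, string | undefined>,
): ActorRunInfo {
    // Outside the Apify platform, the endpoint reports an empty object.
    if (!isAtHome) return {};
    return {
        id: env.ACTOR_RUN_ID,
        actId: env.ACTOR_ID,
        status: 'RUNNING',
        options: { build: env.ACTOR_BUILD_NUMBER, memoryMbytes: env.ACTOR_MEMORY_MBYTES },
    };
}

console.log(buildActorRun(false, {}));
console.log(buildActorRun(true, { ACTOR_RUN_ID: 'runId123', ACTOR_ID: 'actId456' }));
```

Returning `{}` off-platform keeps the response shape stable for local clients while exposing run details (run id, build, memory) to platform deployments.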