An MCP (Model Context Protocol) server that provides tools for enterprise-scale AI assistance across indexed repositories and documentation. This server enables AI agents to efficiently search and retrieve code context for development tasks.
- Search: Natural language queries across indexed repos/docs, tailored to a provided technology stack
- Clone the repository:

  ```shell
  git clone https://github.com/bitovi/enterprise-ai-mcp.git
  cd enterprise-ai-mcp
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Build the server:

  ```shell
  npm run build
  ```

- Run the server:

  ```shell
  npm run start
  ```
This server requires a webhook endpoint for search. Create a .env file (see .env.example) and set:
```
WEBHOOK_URL=https://your-webhook-endpoint
```
To use this MCP server with an MCP client (e.g., VS Code + Cline), add a server entry in your client’s MCP settings file (path varies by client and OS):
```json
{
  "mcpServers": {
    "enterpriseCode": {
      "command": "node",
      "args": ["/path/to/enterprise-ai-mcp/build/index.js"]
    }
  }
}
```

The server will be loaded automatically by the MCP client.
The repository includes four prompting files that guide AI interactions and development. Each file contains a statement the agent must read back to you, so you can confirm the agent actually used the file.
- `AGENTS.md`
- `.github/copilot-instructions.md`
- `.github/prompts/sampleSearch.prompt.md`
- `.github/instructions/sample.instructions.md`
Purpose: Given a natural-language query and technology stack, return the best matches across all indexed repos/docs. Results are provided by the configured webhook.
Inputs:
- `Message` (string, required): A detailed natural-language description of the user's query, task, or intent.
- `Stack` (string, required): Comma-separated list of relevant technologies (e.g., `TypeScript, React, Node.js`).
Returns: JSON payload returned by the webhook (passed through as text).
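For illustration, a client-side tool call might build its arguments like this (field casing follows the parameter list above; the helper function is hypothetical and the exact wire format depends on your MCP client):

```typescript
// Hypothetical shape of the search tool's arguments, based on the
// parameter list above (Message and Stack, both required strings).
interface SearchArgs {
  Message: string;
  Stack: string;
}

function buildSearchArgs(message: string, stack: string[]): SearchArgs {
  return {
    Message: message,
    // Stack is sent as a comma-separated list of technologies.
    Stack: stack.join(", "),
  };
}

const args = buildSearchArgs(
  "How do we paginate API responses in the orders service?",
  ["TypeScript", "Node.js"]
);
console.log(JSON.stringify(args));
```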
- Source code: `src/index.ts`
- Build output: `build/index.js`
- The server forwards tool requests to `WEBHOOK_URL` and returns the JSON response.
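A minimal sketch of the forwarding step described above (function and payload names are illustrative, not the server's actual code in `src/index.ts`):

```typescript
// Illustrative sketch: build the POST request the server might send to
// WEBHOOK_URL when forwarding a tool call's arguments.
function buildForwardRequest(
  webhookUrl: string,
  toolArgs: Record<string, string>
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: webhookUrl,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(toolArgs),
    },
  };
}

// The webhook's JSON response would then be passed back to the MCP client as text:
//   const res = await fetch(req.url, req.init);
//   return await res.text();
const req = buildForwardRequest("https://your-webhook-endpoint", {
  Message: "find auth middleware",
  Stack: "TypeScript, Node.js",
});
console.log(req.init.body);
```

Passing the webhook's response through unmodified keeps the server a thin proxy: all search logic stays behind the configured endpoint.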