This project is a demonstration MCP (Model Context Protocol) server that interacts with the Paychex Developer API documentation. It allows integration with various LLMs (large language models) to process queries against the Paychex documentation.
- Connect to Paychex Developer API documentation
- Support for multiple LLM providers (OpenAI, Anthropic, etc.)
- Environment-based configuration for secure key management
- RESTful API endpoints for querying Paychex data with LLMs
- Clone this repository
- Install dependencies: `npm install`
- Create a `.env` file based on `.env.example`: `cp .env.example .env`
- Edit the `.env` file with your API keys
This project uses environment variables for configuration. Create a `.env` file in the root directory with the following variables:
    # Server configuration
    PORT=3000

    # Paychex API keys
    PAYCHEX_API_KEY=your_api_key_here
    PAYCHEX_CLIENT_ID=your_client_id_here
    PAYCHEX_CLIENT_SECRET=your_client_secret_here

    # LLM configuration
    LLM_TYPE=openai # Options: openai, anthropic, azure, etc.
    LLM_API_KEY=your_llm_api_key_here
    OPENAI_API_KEY=your_openai_key_here
    ANTHROPIC_API_KEY=your_anthropic_key_here
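A small sketch of how these variables could be read into one config object. The `getConfig()` helper and the default values are assumptions for illustration, not part of the project; variable names match the `.env` example above.

```javascript
// Sketch: centralize environment configuration (names match .env above).
// getConfig() is a hypothetical helper; it accepts an env map so the
// defaults are easy to test without touching process.env.
function getConfig(env = process.env) {
  return {
    port: parseInt(env.PORT || "3000", 10), // falls back to the documented default
    paychex: {
      apiKey: env.PAYCHEX_API_KEY,
      clientId: env.PAYCHEX_CLIENT_ID,
      clientSecret: env.PAYCHEX_CLIENT_SECRET,
    },
    llm: {
      type: env.LLM_TYPE || "openai",
      apiKey: env.LLM_API_KEY,
    },
  };
}

const config = getConfig({ PORT: "8080", LLM_TYPE: "anthropic" });
console.log(config.port, config.llm.type); // 8080 anthropic
```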
- Development: `npm run dev`
- Production: `npm start`
`GET /api/paychex/data?endpoint=optional_endpoint_path`
Returns raw data from the Paychex API documentation.
`POST /api/paychex/query`

Body:

    {
      "query": "What are the available Paychex APIs?",
      "endpoint": "optional_endpoint_path"
    }

Processes the query against Paychex documentation using the configured LLM.
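A sketch of calling the query endpoint from Node 18+ (which ships a global `fetch`). The base URL and the `buildQueryRequest` helper are assumptions for illustration:

```javascript
// Sketch: build a request for POST /api/paychex/query.
// buildQueryRequest and the default base URL are assumptions.
function buildQueryRequest(query, endpoint, baseUrl = "http://localhost:3000") {
  const body = { query };
  if (endpoint) body.endpoint = endpoint; // "endpoint" is optional, per the docs above
  return {
    url: `${baseUrl}/api/paychex/query`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    },
  };
}

// Usage (requires the server to be running):
// const { url, options } = buildQueryRequest("What are the available Paychex APIs?");
// const res = await fetch(url, options);
// console.log(await res.json());
```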
To add support for a new LLM provider:
- Add a new service class in `services/llm.js`
- Update the `getLLMService()` function to handle the new provider
- Add the necessary environment variables to `.env.example`
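The steps above follow a simple dispatch pattern, sketched here. The `CohereService` class and its `query` method are hypothetical examples of a new provider; match whatever shape the existing classes in `services/llm.js` actually use:

```javascript
// Sketch of the provider-dispatch pattern described above.
// CohereService is a hypothetical new provider; its API call is stubbed.
class CohereService {
  constructor(apiKey) {
    this.apiKey = apiKey;
  }
  async query(prompt) {
    // A real implementation would call the provider's API here.
    return `cohere response for: ${prompt}`;
  }
}

function getLLMService(type, apiKey) {
  switch (type) {
    case "cohere":
      return new CohereService(apiKey);
    // ...existing cases: openai, anthropic, azure, etc.
    default:
      throw new Error(`Unsupported LLM type: ${type}`);
  }
}
```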
This is a demonstration project only. It is not officially affiliated with or endorsed by Paychex.