A Model Context Protocol (MCP) server that provides seamless integration with OpenAI's GPT-5 API. Access the most advanced language model directly through your favorite AI development tools.
- Direct GPT-5 Integration: Access OpenAI's most advanced model through MCP
- Dual Tool Support:
  - gpt5_generate: Simple text generation with prompts
  - gpt5_messages: Structured conversation handling
- Flexible Parameters: Control temperature, max tokens, reasoning effort, and more
- Usage Tracking: Built-in token usage reporting
- TypeScript: Fully typed for better development experience
- Node.js (v18 or higher)
- OpenAI API key with GPT-5 access
Option 1: Install from NPM (Recommended)
npm install -g @dannyboy2042/gpt5-mcp-server
Option 2: Build from Source
git clone https://github.com/danielbowne/gpt5-mcp.git
cd gpt5-mcp/servers/gpt5-server
npm install
npm run build
Set your OpenAI API key:
export OPENAI_API_KEY=your-api-key-here
Add to your claude_desktop_config.json:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"gpt5-server": {
"command": "npx",
"args": ["@dannyboy2042/gpt5-mcp-server"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
}
Alternative: Using local build
{
"mcpServers": {
"gpt5-server": {
"command": "node",
"args": ["/path/to/gpt5-mcp/servers/gpt5-server/build/index.js"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
}
Add to your Cursor settings:
{
"mcpServers": {
"gpt5-server": {
"command": "npx",
"args": ["@dannyboy2042/gpt5-mcp-server"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
}
Add to your Windsurf configuration file:
macOS: ~/Library/Application Support/Windsurf/config.json
Windows: %APPDATA%\Windsurf\config.json
{
"mcpServers": {
"gpt5-server": {
"command": "npx",
"args": ["@dannyboy2042/gpt5-mcp-server"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
}
Add to your Continue configuration:
{
"models": [
{
"model": "AUTODETECT",
"provider": "mcp",
"apiKey": "",
"mcpServers": [
{
"command": "npx",
"args": ["@dannyboy2042/gpt5-mcp-server"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
]
}
]
}
Local Server Connection (Recommended)
claude mcp add gpt5-server -- npx -y @dannyboy2042/gpt5-mcp-server
With Environment Variable
claude mcp add gpt5-server -e OPENAI_API_KEY=your-api-key-here -- npx -y @dannyboy2042/gpt5-mcp-server
Legacy Method
claude mcp add gpt5-server -e OPENAI_API_KEY=your-api-key-here -- npx @dannyboy2042/gpt5-mcp-server
For any MCP-compatible client, use:
{
"command": "npx",
"args": ["@dannyboy2042/gpt5-mcp-server"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
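If you are wiring up a custom client, the same command spec can be handed to the MCP TypeScript SDK's stdio transport. The following is a minimal sketch, assuming the @modelcontextprotocol/sdk client package (import paths and the client/transport options may differ slightly between SDK versions):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server with the same command/args/env as the JSON config above.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@dannyboy2042/gpt5-mcp-server"],
  env: { OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "" },
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Both tools should appear here: gpt5_generate and gpt5_messages.
console.log(await client.listTools());
```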
gpt5_generate: Generate text using a simple input prompt.
Parameters:
- input (required): The text prompt for GPT-5
- model (optional): GPT-5 model variant (default: "gpt-5")
- instructions (optional): System instructions for the model
- reasoning_effort (optional): Reasoning level ("low", "medium", "high")
- max_tokens (optional): Maximum tokens to generate
- temperature (optional): Randomness level (0-2)
- top_p (optional): Top-p sampling parameter (0-1)
Example:
{
"input": "Explain quantum computing in simple terms",
"reasoning_effort": "high",
"max_tokens": 500
}
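As a hypothetical illustration, an MCP client could pass those same arguments to the tool via callTool. The sketch below assumes a connected client from the @modelcontextprotocol/sdk, as in the earlier example:

```typescript
// Hypothetical call; `client` is an already-connected MCP client instance.
const result = await client.callTool({
  name: "gpt5_generate",
  arguments: {
    input: "Explain quantum computing in simple terms",
    reasoning_effort: "high",
    max_tokens: 500,
  },
});

// The tool's response content, including any usage reporting the server returns.
console.log(result.content);
```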
gpt5_messages: Generate text using structured conversation messages.
Parameters:
- messages (required): Array of conversation messages
  - role: "user", "developer", or "assistant"
  - content: Message text
- model (optional): GPT-5 model variant (default: "gpt-5")
- instructions (optional): System instructions
- reasoning_effort (optional): Reasoning level
- max_tokens (optional): Maximum tokens
- temperature (optional): Randomness (0-2)
- top_p (optional): Top-p sampling (0-1)
Example:
{
"messages": [
{"role": "user", "content": "What is the capital of France?"},
{"role": "assistant", "content": "The capital of France is Paris."},
{"role": "user", "content": "What about Germany?"}
],
"instructions": "Be concise and informative",
"reasoning_effort": "medium"
}
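The same pattern applies to gpt5_messages; a hedged sketch using the conversation above (again assuming a connected MCP SDK client):

```typescript
// Hypothetical call; the argument shape mirrors the JSON example above.
const reply = await client.callTool({
  name: "gpt5_messages",
  arguments: {
    messages: [
      { role: "user", content: "What is the capital of France?" },
      { role: "assistant", content: "The capital of France is Paris." },
      { role: "user", content: "What about Germany?" },
    ],
    instructions: "Be concise and informative",
    reasoning_effort: "medium",
  },
});
console.log(reply.content);
```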
Use GPT-5 to explain the theory of relativity
Ask GPT-5 to review this code and suggest improvements
Have GPT-5 write a short story about time travel
Use GPT-5 with high reasoning to solve this complex algorithm problem
Create a .env file in the servers directory:
# Required
OPENAI_API_KEY=your-openai-api-key-here
# Optional (for development)
DEBUG=true
# Install dependencies
npm install
# Build the server
npm run build
# Run in development mode
npm run dev
gpt5-mcp/
├── servers/
│ └── gpt5-server/
│ ├── src/
│ │ ├── index.ts # Main server
│ │ └── utils.ts # API utilities
│ ├── build/ # Compiled output
│ ├── package.json
│ └── tsconfig.json
└── README.md
- Verify your OpenAI API key is valid
- Check that you have GPT-5 access on your account
- Ensure Node.js v18+ is installed
# Clean rebuild
rm -rf build/
npm install
npm run build
- Check your API key has sufficient credits
- Verify GPT-5 model access
- Review rate limits on your OpenAI account
- API keys are never hardcoded
- Environment variables for sensitive data
- Secure HTTPS communication with OpenAI
- Error messages don't expose sensitive information
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
MIT License - see LICENSE file for details
- Built with Model Context Protocol
- Powered by OpenAI GPT-5
- TypeScript SDK by @modelcontextprotocol
- Issues: GitHub Issues
- Discussions: GitHub Discussions
💡 Inspired by: All About AI - Thanks for the awesome content that sparked this project!
"The future is already here — it's just not evenly distributed." - William Gibson
⭐ Star this repo if you find it useful!