
OpenRouter MCP Server for Claude Code

This MCP server bridges Claude Code to OpenRouter, allowing you to query other AI models (GPT-5.2-Codex, Gemini, DeepSeek, Llama, etc.) without leaving your Claude session.

Prerequisites

Before installing, you need:

| Requirement | Description | Get it at |
| --- | --- | --- |
| Claude Code CLI | Requires a Max/Pro subscription OR an Anthropic API key | claude.ai/code |
| Node.js 18+ | JavaScript runtime | nodejs.org |
| OpenRouter account | With credits for API usage | openrouter.ai |
| OpenRouter API key | For authenticating requests | openrouter.ai/keys |

Quick Start

```shell
# Clone and setup
git clone <this-repo>
cd Claude_GPT_MCP
./setup.sh
```

The setup script will:

  1. Check Node.js version
  2. Install dependencies
  3. Build the TypeScript
  4. Prompt for your OpenRouter API key
  5. Show the Claude Code configuration
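The Node check in step 1 can be sketched in shell (a sketch only; the actual logic in setup.sh may differ — the version string is stubbed here so the snippet is self-contained):

```shell
# Parse the major version from `node -v`-style output and compare it
# to the minimum. A real script would run `node -v` instead of stubbing.
version="v20.11.1"            # stand-in for: version=$(node -v)
major=${version#v}            # strip the leading "v" -> 20.11.1
major=${major%%.*}            # keep only the major component -> 20
if [ "$major" -ge 18 ]; then
  echo "Node.js $major OK"    # prints "Node.js 20 OK"
else
  echo "Node.js 18+ required" >&2
fi
```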

Key Management

```shell
./setup.sh              # Fresh install (build + prompt for key)
./setup.sh --set-key    # Add or change API key
./setup.sh --show-key   # Show current key (masked: sk-or-v1...xxxx)
./setup.sh --remove-key # Remove stored API key
./setup.sh --help       # Show help
```
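The masked display from `--show-key` can be reproduced with bash parameter expansion (a sketch using a fake key; how the script actually masks is an assumption):

```shell
key="sk-or-v1-0123456789abcdef"   # fake key for illustration only
masked="${key:0:8}...${key: -4}"  # first 8 chars, ellipsis, last 4
echo "$masked"                    # prints "sk-or-v1...cdef"
```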

Claude Code Configuration

Add to `~/.claude.json`:

```json
{
  "mcpServers": {
    "openrouter": {
      "command": "node",
      "args": ["/path/to/Claude_GPT_MCP/dist/index.js"]
    }
  }
}
```
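If you'd rather script the edit, a jq merge can add the entry without disturbing the rest of the file (a sketch assuming jq is installed; back up the file before modifying it):

```shell
# Merge the openrouter server entry into ~/.claude.json, creating the
# file as an empty object if it does not exist yet.
cfg="$HOME/.claude.json"
[ -f "$cfg" ] || echo '{}' > "$cfg"
tmp=$(mktemp)
jq '.mcpServers.openrouter = {
      command: "node",
      args: ["/path/to/Claude_GPT_MCP/dist/index.js"]
    }' "$cfg" > "$tmp" && mv "$tmp" "$cfg"
```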

Then restart Claude Code: `claude --mcp-debug`

Available MCP Tools

Once configured, Claude has access to these tools:

ask_model

Query any OpenRouter model:

"Ask GPT-5.2-Codex to review this function for edge cases"
"Get Gemini's opinion on this architecture"
"Ask DeepSeek for an alternative implementation"
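Under the hood the server is presumably posting to OpenRouter's chat completions endpoint; building such a request by hand looks roughly like this (a sketch — the model ID and prompt are examples, and the curl call is left commented out because it needs a live API key):

```shell
# Construct the JSON payload with jq so quoting stays safe.
payload=$(jq -n \
  --arg model "deepseek/deepseek-chat" \
  --arg prompt "Suggest an alternative implementation" \
  '{model: $model, messages: [{role: "user", content: $prompt}]}')

# curl -s https://openrouter.ai/api/v1/chat/completions \
#   -H "Authorization: Bearer $OPENROUTER_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$payload"
echo "$payload"
```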

list_models

List available models with pricing:

"What models are available on OpenRouter?"
"List models that match 'llama'"
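The name filter likely runs over OpenRouter's public model listing (GET `https://openrouter.ai/api/v1/models`); the filtering step can be sketched with jq over a stubbed response in that endpoint's shape:

```shell
# Stubbed response in the shape of OpenRouter's /api/v1/models payload.
sample='{"data":[{"id":"meta-llama/llama-3-70b-instruct"},{"id":"openai/gpt-4-turbo"}]}'

# Keep only IDs matching the query string.
echo "$sample" | jq -r '.data[].id | select(test("llama"))'
# prints "meta-llama/llama-3-70b-instruct"
```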

set_default_model

Set your preferred default model:

"Set my default model to GPT-5.2-Codex"

get_config

View current configuration:

"Show my OpenRouter config"

add_shortcut

Create custom model shortcuts:

"Add a shortcut 'fast' for openai/gpt-4o-mini"

add_favorite / remove_favorite

Manage your favorites list:

"Add claude-3-opus to my favorites"

Slash Commands

If you copy `.claude/commands/` to your project, you get:

| Command | Description |
| --- | --- |
| `/models` | List available OpenRouter models |
| `/models gpt` | Filter models by name |
| `/model` | Show current config |
| `/model GPT-5.2-Codex` | Set default model |

Built-in Shortcuts

| Shortcut | OpenRouter Model ID |
| --- | --- |
| GPT-5.2-Codex | openai/gpt-5.2-codex |
| gpt-4-turbo | openai/gpt-4-turbo |
| claude-3-opus | anthropic/claude-3-opus |
| claude-3-sonnet | anthropic/claude-3-sonnet |
| gemini-pro | google/gemini-pro |
| deepseek-chat | deepseek/deepseek-chat |
| llama-3-70b | meta-llama/llama-3-70b-instruct |

You can also use any full OpenRouter model ID directly.
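Shortcut resolution can be pictured as a lookup table with pass-through for anything unrecognized (a sketch only; the server implements this in TypeScript, and the unrecognized ID below is just an illustrative string):

```shell
# Map a shortcut to its full OpenRouter model ID; unknown inputs are
# assumed to already be full IDs and pass through unchanged.
resolve_model() {
  case "$1" in
    gpt-4-turbo)   echo "openai/gpt-4-turbo" ;;
    claude-3-opus) echo "anthropic/claude-3-opus" ;;
    gemini-pro)    echo "google/gemini-pro" ;;
    *)             echo "$1" ;;
  esac
}

resolve_model gpt-4-turbo          # prints "openai/gpt-4-turbo"
resolve_model some-vendor/some-id  # prints "some-vendor/some-id"
```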

Configuration Files

| File | Purpose |
| --- | --- |
| `~/.config/openrouter-mcp/config.json` | User preferences (default model, favorites, shortcuts) |
| `~/.bashrc` or `~/.zshrc` | API key environment variable |
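The exact schema of config.json is defined by the server code; given the features described above, a plausible shape might look like this (an assumption for illustration, not the file's documented format):

```json
{
  "defaultModel": "openai/gpt-5.2-codex",
  "favorites": ["anthropic/claude-3-opus"],
  "shortcuts": {
    "fast": "openai/gpt-4o-mini"
  }
}
```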

Example Workflow

You: "I'm not sure if this caching strategy is optimal. Ask GPT-5.2-Codex for a second opinion."

Claude: [Uses ask_model tool with your code as context]

Claude: "GPT-5.2-Codex suggests using an LRU cache instead of TTL-based expiration because..."

Troubleshooting

"OPENROUTER_API_KEY not set"

```shell
./setup.sh --show-key  # Check if key is set
./setup.sh --set-key   # Set/update the key
source ~/.bashrc       # Reload shell config
```

MCP server not loading

```shell
claude --mcp-debug  # See detailed MCP loading logs
```

Model not found

Use `list_models` to see available models, or check openrouter.ai/models.

Rate limits

OpenRouter has rate limits per model. If you hit them, wait or try a different model.