Razer AIKit V2 - Unified Local LLM Platform with MCP Bridge Integration for Visionary Tool Server

VisionaryArchitects/razer-aikit-v2

πŸš€ Visionary Tool Server

Production-grade MCP server exposing 331 enterprise tools across 64 modules for AI-powered automation



✨ Key Features

  • πŸ”§ 331 Production-Ready Tools - Comprehensive integrations across development, communication, AI, cloud, and productivity platforms
  • ⚑ High-Performance SSE Transport - Built on FastMCP 2.14.2 + Starlette 0.50.0 with Server-Sent Events for real-time streaming
  • πŸ€– Multi-Client Support - Compatible with Claude Desktop, ChatGPT, Gemini, AnythingLLM, Qwen CLI, and Codex CLI
  • πŸ” Enterprise Security - Environment-based credential management with OAuth 2.0 support and secure API key handling
  • πŸ’ͺ Optimized for Power Users - Razer Blade 16 RTX 4090 configuration with 96GB RAM for demanding AI workloads

🎯 Quick Start

Prerequisites

  • Python 3.14+ (tested on 3.14.2)
  • Windows 11 (primary), macOS, or Linux
  • Node.js 18+ (for some client integrations)
  • Git for cloning repositories
  • 16GB+ RAM recommended (64GB+ for optimal performance)

Installation

# 1. Clone the repository
git clone https://github.com/yourusername/visionary-tool-server.git
cd visionary-tool-server

# 2. Create virtual environment
python -m venv venv

# Windows
venv\Scripts\activate

# macOS/Linux
source venv/bin/activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Copy environment template
cp .env.example .env

# 5. Configure your API keys (see Configuration section)
notepad .env  # Windows
nano .env     # macOS/Linux

Configuration

Create a .env file in the project root with your API keys:

# AI/LLM Services
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=AI...
GROQ_API_KEY=gsk_...
OPENROUTER_API_KEY=sk-or-...
DEEPSEEK_API_KEY=sk-...
MISTRAL_API_KEY=...
COHERE_API_KEY=...
TOGETHER_API_KEY=...
FIREWORKS_API_KEY=...
PERPLEXITY_API_KEY=...
REPLICATE_API_TOKEN=r8_...
FAL_API_KEY=...
STABILITY_AI_KEY=sk-...
ELEVENLABS_API_KEY=...
HUGGINGFACE_API_KEY=hf_...

# Local LLM (Razer AIKit)
RAZER_AIKIT_BASE_URL=http://localhost:1234/v1
RAZER_AIKIT_API_KEY=lm-studio

# Additional AI Providers
MINIMAX_API_KEY=...
MINIMAX_GROUP_ID=...
GROK_API_KEY=xai-...
NEMOTRON_API_KEY=nvapi-...

# Cloud Services
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1
AZURE_OPENAI_API_KEY=...
AZURE_OPENAI_ENDPOINT=https://...
CLOUDFLARE_API_TOKEN=...
VERCEL_TOKEN=...
RAILWAY_API_KEY=...

# Development Tools
GITHUB_TOKEN=ghp_...
GITLAB_TOKEN=glpat-...
JIRA_API_TOKEN=...
LINEAR_API_KEY=lin_...
CLICKUP_API_KEY=...
ASANA_ACCESS_TOKEN=...
MONDAY_API_KEY=...

# Communication
DISCORD_BOT_TOKEN=...
SLACK_BOT_TOKEN=xoxb-...
TWILIO_ACCOUNT_SID=AC...
TWILIO_AUTH_TOKEN=...
SENDGRID_API_KEY=SG...
RESEND_API_KEY=re_...

# Databases & Storage
SUPABASE_URL=https://...
SUPABASE_KEY=eyJ...
FIREBASE_PROJECT_ID=...
PINECONE_API_KEY=...
QDRANT_URL=http://localhost:6333
REDIS_URL=redis://localhost:6379

# Monitoring & Observability
DATADOG_API_KEY=...
SENTRY_DSN=https://...
GRAFANA_API_KEY=...
PROMETHEUS_URL=http://localhost:9090

# Productivity
NOTION_TOKEN=secret_...
OBSIDIAN_VAULT_PATH=C:\Users\...\ObsidianVault
AIRTABLE_API_KEY=...
ZENDESK_API_TOKEN=...

# Automation
N8N_API_URL=http://localhost:5678
N8N_API_KEY=...
WEBHOOK_SECRET=...

# Search & Data
BRAVE_API_KEY=BSA...
EXA_API_KEY=...
FIRECRAWL_API_KEY=...
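The `WEBHOOK_SECRET` above is the kind of value typically used to authenticate inbound webhook payloads. A minimal stdlib sketch of signature verification (the HMAC-SHA256 hex-digest scheme is an assumption modeled on common conventions such as GitHub's `X-Hub-Signature-256`, not this server's documented contract):

```python
import hashlib
import hmac

def verify_webhook_signature(secret: str, payload: bytes, signature: str) -> bool:
    """Return True if `signature` is a valid HMAC-SHA256 hex digest of `payload`.

    Assumes the sender computes hex(HMAC-SHA256(secret, body)) -- a common
    convention, not necessarily this server's exact scheme.
    """
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information during comparison
    return hmac.compare_digest(expected, signature)

# Example: a sender and receiver sharing the same secret agree on the digest
secret = "test-secret"
body = b'{"event": "push"}'
sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
```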

Running the Server

# Start the MCP server on port 8082
python -m visionary_tool_server

# Or with custom port
python -m visionary_tool_server --port 8090

# With debug logging
python -m visionary_tool_server --debug

# With ngrok tunnel for remote access
ngrok http 8082

Verify Installation

# Check health endpoint
curl http://localhost:8082/health

# Expected response:
# {"status": "healthy", "version": "4.0.0", "tools": 331, "modules": 64}

# Test SSE endpoint
curl http://localhost:8082/sse
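If you want to script the checks above, the `/health` payload can be validated with the stdlib alone. A sketch against the response shape shown, which may change between versions:

```python
import json
from urllib.request import urlopen

def is_healthy(payload: dict, min_tools: int = 1) -> bool:
    """Validate a /health response payload against the shape shown above."""
    return payload.get("status") == "healthy" and payload.get("tools", 0) >= min_tools

def check_server(url: str = "http://localhost:8082/health") -> bool:
    """Fetch and validate the health endpoint; returns False on any failure."""
    try:
        with urlopen(url, timeout=5) as resp:
            return is_healthy(json.load(resp))
    except OSError:
        return False
```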

πŸ—οΈ Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                          MCP Clients                                 β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”           β”‚
β”‚  β”‚  Claude  β”‚  β”‚ ChatGPT  β”‚  β”‚  Gemini  β”‚  β”‚AnythingLLMβ”‚   ...     β”‚
β”‚  β”‚ Desktop  β”‚  β”‚Desktop/iOSβ”‚  β”‚  Mobile  β”‚  β”‚  Server  β”‚           β”‚
β”‚  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜           β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
        β”‚             β”‚             β”‚             β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
        β”‚      SSE Transport (Port 8082)                 β”‚
        β”‚  http://localhost:8082/sse                     β”‚
        β”‚  https://visionary-tool-server.ngrok.io/sse    β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
        β”‚         FastMCP Server v2.14.2                 β”‚
        β”‚    Starlette 0.50.0 + uvicorn 0.40.0          β”‚
        β”‚         Python 3.14.2 Runtime                  β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
        β”‚      Tool Module Registry (64 modules)         β”‚
        β”‚                                                β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Development (50 tools)                  β”‚ β”‚
        β”‚  β”‚  GitHub β€’ GitLab β€’ Jira β€’ Linear         β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  AI/LLM (88 tools)                       β”‚ β”‚
        β”‚  β”‚  OpenAI β€’ Anthropic β€’ Gemini β€’ Local    β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Communication (46 tools)                β”‚ β”‚
        β”‚  β”‚  Discord β€’ Slack β€’ Twilio β€’ SendGrid    β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Cloud/Infra (42 tools)                  β”‚ β”‚
        β”‚  β”‚  AWS β€’ Vercel β€’ Railway β€’ Cloudflare    β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Productivity (40 tools)                 β”‚ β”‚
        β”‚  β”‚  Notion β€’ Obsidian β€’ ClickUp β€’ Airtable β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Data/Vector DBs (29 tools)              β”‚ β”‚
        β”‚  β”‚  Pinecone β€’ Qdrant β€’ Redis β€’ Supabase   β”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
        β”‚  β”‚  Monitoring (36 tools)                   β”‚ β”‚
        β”‚  β”‚  Datadog β€’ Sentry β€’ Grafana β€’ Prometheusβ”‚ β”‚
        β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚
        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
        β”‚           External APIs                        β”‚
        β”‚  REST β€’ GraphQL β€’ WebSockets β€’ gRPC           β”‚
        β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

        Hardware: Razer Blade 16 | RTX 4090 | 96GB RAM
        OS: Windows 11 Pro | Transport: SSE | Port: 8082
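The SSE transport in the diagram is just a stream of newline-delimited UTF-8 frames: `event:` and `data:` fields, with a blank line terminating each event. A simplified parser sketch for such a stream (the full SSE spec has a few more rules, e.g. around leading spaces and `id:`/`retry:` fields):

```python
def parse_sse(stream: str) -> list[dict]:
    """Parse a Server-Sent Events text stream into a list of events.

    Each event is a dict with optional "event" and "data" keys; events
    are separated by blank lines.
    """
    events, current = [], {}
    for line in stream.splitlines():
        if not line.strip():  # a blank line ends the current event
            if current:
                events.append(current)
                current = {}
        elif line.startswith("event:"):
            current["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            # multi-line data fields are joined with newlines
            prev = current.get("data")
            chunk = line[len("data:"):].strip()
            current["data"] = chunk if prev is None else prev + "\n" + chunk
    if current:
        events.append(current)
    return events
```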

πŸ“Š Tool Inventory

Complete Tool Catalog (331 tools across 64 modules)

Development & Version Control (72 tools)

| Module | Tools | Description |
|---|---|---|
| GitHub | 37 | Repository management, PR workflows, issues, actions, releases, gists |
| GitLab | 26 | Project operations, merge requests, CI/CD pipelines, container registry |
| Shell | 9 | System command execution, process management, environment control |

AI & Large Language Models (88 tools)

| Module | Tools | Description |
|---|---|---|
| Razer AIKit (Local) | 20 | Local LLM inference, model management, embeddings, fine-tuning |
| MiniMax M2.1 | 8 | Chinese multimodal AI, text generation, image understanding |
| OpenAI | 7 | GPT-4/3.5 completions, embeddings, DALL-E, Whisper, moderation |
| Anthropic | 3 | Claude 3.5 Sonnet/Opus, prompt caching, vision |
| Gemini | 4 | Google Gemini Pro/Ultra, multimodal reasoning, Gemma |
| Groq | 3 | Ultra-fast LLM inference, Mixtral, LLaMA3 |
| OpenRouter | 3 | Multi-provider routing, cost optimization, fallback |
| DeepSeek | 3 | Chinese open-source models, code generation |
| Mistral | 3 | Mistral Large/Medium, function calling, embeddings |
| Cohere | 4 | Command R+, embeddings, rerank, classification |
| Together AI | 3 | Open-source model hosting, fine-tuning, inference |
| Fireworks | 2 | Fast inference, function calling, streaming |
| Perplexity | 3 | Search-augmented generation, citations, real-time data |
| Grok | 3 | xAI's Grok models, real-time X/Twitter integration |
| Nemotron | 4 | NVIDIA's instruction-tuned models, reasoning |
| AWS Bedrock | 4 | Claude/Titan/Jurassic on AWS, managed inference |
| Azure OpenAI | 2 | GPT-4/3.5 on Azure, enterprise compliance |
| NVIDIA | 2 | NIM microservices, optimized inference |
| HuggingFace | 5 | Model hub, inference API, datasets, spaces |
| Replicate | 3 | Image/video models, Stable Diffusion, Flux |
| Stability AI | 2 | Stable Diffusion 3, image-to-image, upscaling |
| FAL | 3 | Low-latency AI inference, real-time generation, video |
| ElevenLabs | 3 | Text-to-speech, voice cloning, multilingual |

Communication & Collaboration (46 tools)

| Module | Tools | Description |
|---|---|---|
| Discord | 21 | Bot commands, message management, server admin, webhooks |
| Slack | 5 | Message posting, channel management, user operations |
| Twilio | 4 | SMS/MMS, voice calls, WhatsApp, phone numbers |
| SendGrid | 3 | Transactional email, templates, analytics |
| Resend | 2 | Developer-first email API, React templates |
| Zendesk | 4 | Ticket management, customer support, knowledge base |
| Webhooks | 2 | HTTP callbacks, event notifications, integrations |
| OAuth | 2 | OAuth 2.0 flows, token management, authorization |
| n8n | 3 | Workflow automation, node creation, execution |

Cloud Infrastructure & Deployment (42 tools)

| Module | Tools | Description |
|---|---|---|
| Cloudflare | 6 | CDN, DNS, Workers, KV storage, R2, DDoS protection |
| Vercel | 6 | Deployment, domains, environment vars, analytics |
| Railway | 5 | Container deployment, databases, services, logs |
| AWS Bedrock | 4 | Managed AI services (see AI section) |
| Firebase | 4 | Authentication, Firestore, Cloud Functions, hosting |
| Supabase | 4 | PostgreSQL database, Auth, Storage, Edge Functions |
| Azure OpenAI | 2 | Azure-hosted AI services (see AI section) |
| Datadog | 4 | Infrastructure monitoring, APM, logs, metrics |
| Sentry | 4 | Error tracking, performance monitoring, releases |
| Grafana | 3 | Dashboards, alerting, visualization, data sources |
| Prometheus | 3 | Metrics collection, PromQL queries, alerting |

Project Management & Productivity (40 tools)

| Module | Tools | Description |
|---|---|---|
| Obsidian | 8 | Vault management, note CRUD, tags, search, linking |
| Notion | 6 | Database queries, page creation, blocks, workspaces |
| ClickUp | 5 | Tasks, lists, spaces, time tracking, goals |
| Jira | 4 | Issue tracking, sprints, boards, JQL queries |
| Linear | 4 | Issues, projects, cycles, roadmaps, integrations |
| Asana | 4 | Tasks, projects, teams, custom fields, portfolios |
| Monday | 4 | Boards, items, columns, automations, dashboards |
| Airtable | 4 | Records, bases, tables, views, formulas |
| Stripe | 5 | Payments, subscriptions, customers, invoices |

Vector Databases & Data Stores (29 tools)

| Module | Tools | Description |
|---|---|---|
| Pinecone | 5 | Vector search, index management, upsert, query, metadata |
| Qdrant | 3 | Vector similarity search, collections, filtering |
| Redis | 3 | Key-value store, caching, pub/sub, streams |
| Supabase | 4 | PostgreSQL with pgvector extension (see Cloud) |
| Firebase | 4 | Firestore document database (see Cloud) |
| Airtable | 4 | Spreadsheet-database hybrid (see Productivity) |
| Exa | 4 | Semantic search API, web crawling, entity extraction |
| Firecrawl | 4 | Web scraping, crawling, data extraction, monitoring |

Search & Discovery (11 tools)

| Module | Tools | Description |
|---|---|---|
| Brave Search | 3 | Privacy-focused search, Web Search API, goggles |
| Exa | 4 | Neural search, semantic similarity, web discovery |
| Firecrawl | 4 | Intelligent web crawling, scraping, monitoring |

Orchestration & Automation (10 tools)

| Module | Tools | Description |
|---|---|---|
| Agent Orchestrator | 5 | Multi-agent coordination, task delegation, state management |
| Orchestrator | 4 | Workflow orchestration, dependency management, scheduling |
| Self-test | 1 | Server health check, tool validation, diagnostics |

πŸ”Œ Client Connection Guide

Claude Desktop

Configuration: claude_desktop_config.json (Windows location: %APPDATA%\Claude\)

{
  "mcpServers": {
    "visionary-tools": {
      "command": "http",
      "args": [],
      "env": {},
      "transport": {
        "type": "sse",
        "url": "http://localhost:8082/sse"
      }
    }
  }
}

Steps:

  1. Ensure server is running on port 8082
  2. Update Claude Desktop config
  3. Restart Claude Desktop
  4. Verify tools appear in the tools panel (331 tools)
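Editing `claude_desktop_config.json` by hand works, but a small script can merge the entry without clobbering servers you already have registered. A sketch (the entry mirrors the JSON above; reading and writing the actual file is left out):

```python
# SSE server entry as shown in the Claude Desktop config above
SERVER_ENTRY = {
    "command": "http",
    "args": [],
    "env": {},
    "transport": {"type": "sse", "url": "http://localhost:8082/sse"},
}

def add_server(config: dict, name: str = "visionary-tools") -> dict:
    """Return a copy of a Claude Desktop config dict with this server merged in,
    preserving any other mcpServers entries."""
    merged = dict(config)
    servers = dict(merged.get("mcpServers", {}))
    servers[name] = SERVER_ENTRY
    merged["mcpServers"] = servers
    return merged
```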

ChatGPT Desktop & Mobile

Configuration: Built-in MCP support (ChatGPT Plus required)

Desktop:

  1. Open ChatGPT Desktop β†’ Settings β†’ Features β†’ Model Context Protocol
  2. Add server: http://localhost:8082/sse
  3. Enable SSE transport
  4. Restart ChatGPT

Mobile (iOS/Android):

  1. Open ChatGPT app β†’ Settings β†’ Advanced β†’ MCP Servers
  2. Add remote server: https://visionary-tool-server.ngrok.io/sse
  3. Requires ngrok tunnel for remote access

Google Gemini (Gemini App & Studio)

Configuration: Gemini 1.5 Pro with Extensions

Gemini App:

  1. Enable Developer Mode in Settings
  2. Add MCP server under Extensions
  3. URL: http://localhost:8082/sse (desktop) or ngrok URL (mobile)

Gemini AI Studio:

  1. Project Settings β†’ Integrations β†’ MCP
  2. Add server endpoint
  3. Configure OAuth if needed

AnythingLLM

Configuration: server-settings.json

{
  "mcp": {
    "enabled": true,
    "servers": [
      {
        "name": "Visionary Tools",
        "transport": "sse",
        "url": "http://localhost:8082/sse",
        "description": "331 enterprise tools"
      }
    ]
  }
}

Steps:

  1. AnythingLLM β†’ Admin Settings β†’ Tools β†’ MCP Servers
  2. Add new server with above configuration
  3. Test connection
  4. Tools available in all workspaces

Qwen CLI

Configuration: .qwenconfig

[mcp]
enabled = true

[[mcp.servers]]
name = "visionary"
url = "http://localhost:8082/sse"
transport = "sse"
tools = 331

Usage:

# Connect to server
qwen mcp connect visionary

# List available tools
qwen mcp tools

# Use a tool
qwen "Create a GitHub issue using visionary tools"

Codex CLI

Configuration: codex.yaml

mcp_servers:
  - name: visionary-tools
    url: http://localhost:8082/sse
    transport: sse
    auto_connect: true
    categories:
      - development
      - ai
      - communication
      - cloud
      - productivity

Usage:

# Auto-connects on start
codex chat

# Explicitly use tools
codex --tools visionary-tools "Deploy to Vercel"

ngrok Remote Access (for Mobile Clients)

# Install ngrok
choco install ngrok  # Windows
brew install ngrok   # macOS

# Authenticate
ngrok config add-authtoken YOUR_TOKEN

# Start tunnel
ngrok http 8082 --domain visionary-tool-server.ngrok.io

# Use this URL in mobile clients:
# https://visionary-tool-server.ngrok.io/sse

βš™οΈ Configuration

Environment Variables

Required Variables:

# At minimum, configure these for core functionality
GITHUB_TOKEN=ghp_...           # For GitHub tools
OPENAI_API_KEY=sk-...          # For OpenAI tools
ANTHROPIC_API_KEY=sk-ant-...   # For Claude tools

Optional but Recommended:

# Local LLM (no API costs)
RAZER_AIKIT_BASE_URL=http://localhost:1234/v1
RAZER_AIKIT_API_KEY=lm-studio

# Server configuration
MCP_SERVER_PORT=8082
MCP_LOG_LEVEL=INFO
MCP_CORS_ORIGINS=*
MCP_MAX_CONNECTIONS=100

# Performance tuning
UVICORN_WORKERS=4
UVICORN_TIMEOUT=300

Configuration File (config.yaml)

server:
  host: "0.0.0.0"
  port: 8082
  debug: false
  
fastmcp:
  version: "2.14.2"
  max_tools: 500
  
modules:
  enabled:
    - github
    - gitlab
    - discord
    - razer_aikit
    - openai
    # ... all 64 modules
  
  disabled: []
  
logging:
  level: INFO
  format: json
  file: logs/visionary-tool-server.log
  
security:
  rate_limit: 100  # requests per minute
  cors_enabled: true
  require_auth: false
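The `rate_limit: 100` (requests per minute) setting above maps naturally onto a token-bucket limiter. A minimal sketch of the idea, not this server's actual implementation:

```python
import time

class TokenBucket:
    """Allow up to `rate` requests per `per` seconds, refilling continuously."""

    def __init__(self, rate: int = 100, per: float = 60.0, clock=time.monotonic):
        self.capacity = float(rate)
        self.tokens = float(rate)
        self.refill_per_sec = rate / per
        self.clock = clock
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # refill proportionally to elapsed time, capped at the bucket capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```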

πŸ› οΈ Development

Setup Development Environment

# Clone and setup
git clone https://github.com/yourusername/visionary-tool-server.git
cd visionary-tool-server

# Install dev dependencies
pip install -r requirements-dev.txt

# Install pre-commit hooks
pre-commit install

# Run tests
pytest

# Code coverage
pytest --cov=visionary_tool_server --cov-report=html

Project Structure

visionary-tool-server/
β”œβ”€β”€ visionary_tool_server/
β”‚   β”œβ”€β”€ __init__.py
β”‚   β”œβ”€β”€ __main__.py          # Entry point
β”‚   β”œβ”€β”€ server.py            # FastMCP server setup
β”‚   β”œβ”€β”€ config.py            # Configuration management
β”‚   β”œβ”€β”€ modules/             # Tool modules (64 files)
β”‚   β”‚   β”œβ”€β”€ github.py
β”‚   β”‚   β”œβ”€β”€ gitlab.py
β”‚   β”‚   β”œβ”€β”€ discord.py
β”‚   β”‚   β”œβ”€β”€ razer_aikit.py
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ utils/               # Utilities
β”‚   β”‚   β”œβ”€β”€ auth.py
β”‚   β”‚   β”œβ”€β”€ logging.py
β”‚   β”‚   └── validators.py
β”‚   └── types/               # Type definitions
β”œβ”€β”€ tests/
β”‚   β”œβ”€β”€ test_modules/
β”‚   β”œβ”€β”€ test_server.py
β”‚   └── test_integration.py
β”œβ”€β”€ docs/
β”œβ”€β”€ logs/
β”œβ”€β”€ .env.example
β”œβ”€β”€ .gitignore
β”œβ”€β”€ requirements.txt
β”œβ”€β”€ requirements-dev.txt
β”œβ”€β”€ pyproject.toml
β”œβ”€β”€ pytest.ini
└── README.md

Adding a New Module

# visionary_tool_server/modules/my_service.py
from fastmcp import FastMCP

def register_tools(mcp: FastMCP):
    """Register My Service tools with FastMCP server."""
    
    @mcp.tool()
    def my_service_action(param: str) -> dict:
        """
        Perform an action with My Service.
        
        Args:
            param: Input parameter
            
        Returns:
            Result dictionary
        """
        # Implementation
        return {"status": "success"}
    
    @mcp.tool()
    def my_service_query(query: str) -> list:
        """Query My Service data."""
        # Implementation
        return []

Register in server.py:

from visionary_tool_server.modules import my_service

# In setup function
my_service.register_tools(mcp)
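To see how `register_tools` composes without running the full server, here is a tiny stand-in for the decorator interface (illustrative only; the real `FastMCP` class comes from the `fastmcp` package and does much more):

```python
from typing import Callable

class ToolRegistry:
    """Minimal stand-in mimicking FastMCP's @mcp.tool() registration."""

    def __init__(self):
        self.tools: dict[str, Callable] = {}

    def tool(self):
        def decorator(fn):
            self.tools[fn.__name__] = fn  # registered under its function name
            return fn
        return decorator

def register_tools(mcp):
    @mcp.tool()
    def my_service_action(param: str) -> dict:
        return {"status": "success", "param": param}

mcp = ToolRegistry()
register_tools(mcp)
# mcp.tools now contains "my_service_action"
```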

Testing

# Run all tests
pytest

# Run specific test file
pytest tests/test_modules/test_github.py

# Run with coverage
pytest --cov --cov-report=html

# Run integration tests (requires API keys)
pytest tests/test_integration.py --integration

# Run performance tests
pytest tests/test_performance.py --benchmark

Linting & Formatting

# Format code
black visionary_tool_server/
isort visionary_tool_server/

# Lint
pylint visionary_tool_server/
flake8 visionary_tool_server/
mypy visionary_tool_server/

# Type checking
pyright visionary_tool_server/

# All checks (pre-commit)
pre-commit run --all-files

Building & Distribution

# Build package
python -m build

# Install locally
pip install -e .

# Create distribution (setup.py is legacy; prefer `python -m build` above)
python setup.py sdist bdist_wheel

# Upload to PyPI (maintainers only)
twine upload dist/*

πŸ” Troubleshooting

Common Issues

Server Won't Start

Symptom: Port 8082 already in use

# Windows - Find and kill process
netstat -ano | findstr :8082
taskkill /PID <PID> /F

# macOS/Linux
lsof -ti:8082 | xargs kill -9

# Use different port
python -m visionary_tool_server --port 8090
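The port conflict can also be detected from Python before launching the server. A stdlib sketch (note that `SO_REUSEADDR` means a port lingering in TIME_WAIT may still report as free):

```python
import socket

def is_port_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if we can bind host:port, i.e. no other process holds it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            sock.bind((host, port))
            return True
        except OSError:
            return False

# e.g. pick a fallback port before launching:
port = 8082 if is_port_free(8082) else 8090
```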

Client Can't Connect

Symptom: Connection refused or ERR_CONNECTION_REFUSED

  1. Verify server is running: curl http://localhost:8082/health
  2. Check firewall: Allow port 8082
  3. Verify SSE endpoint: curl http://localhost:8082/sse
  4. Check client config syntax (JSON errors)
  5. Restart client application

Tools Not Appearing

Symptom: Client connects but shows 0 tools

  1. Check server logs: tail -f logs/visionary-tool-server.log
  2. Verify module imports: python -c "from visionary_tool_server.modules import github"
  3. Check environment variables: Missing API keys disable modules
  4. Restart server with debug: python -m visionary_tool_server --debug

API Key Errors

Symptom: 401 Unauthorized or Invalid API key

  1. Verify .env file exists and is loaded
  2. Check key format: Remove extra spaces/quotes
  3. Test key directly:
    curl -H "Authorization: Bearer $OPENAI_API_KEY" https://api.openai.com/v1/models
  4. Regenerate key if compromised

Slow Performance

Symptom: Tools timeout or respond slowly

  1. Check network latency to external APIs
  2. Increase timeouts in config
  3. Monitor resource usage: Task Manager / Activity Monitor
  4. Disable unused modules to reduce memory
  5. Use local LLM (Razer AIKit) for faster inference

Memory Issues

Symptom: MemoryError or system slowdown

  1. Reduce UVICORN_WORKERS (default 4 β†’ 2)
  2. Disable heavy modules (AI models, databases)
  3. Increase system swap space
  4. Monitor with: htop (Linux) or Task Manager (Windows)
  5. Restart server periodically for long-running instances

Debug Mode

# Enable debug logging
export MCP_LOG_LEVEL=DEBUG
python -m visionary_tool_server --debug

# Log all requests/responses
export MCP_LOG_REQUESTS=true

# Verbose FastMCP output
export FASTMCP_DEBUG=1

Health Check

# Basic health
curl http://localhost:8082/health

# Detailed diagnostics
curl http://localhost:8082/diagnostics

# Test specific module
curl -X POST http://localhost:8082/test/github

🀝 Contributing

We welcome contributions! Please follow these guidelines:

Getting Started

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/my-new-tool
  3. Make your changes with tests
  4. Run tests: pytest
  5. Commit: git commit -am 'Add new tool for X'
  6. Push: git push origin feature/my-new-tool
  7. Submit a Pull Request

Contribution Guidelines

  • Code Style: Follow PEP 8, use Black formatter
  • Tests: Maintain 80%+ coverage
  • Documentation: Update README and docstrings
  • Commits: Use conventional commit format
    • feat: new features
    • fix: bug fixes
    • docs: documentation
    • test: testing
    • refactor: code refactoring

Adding New Tools

  1. Create module in visionary_tool_server/modules/
  2. Implement tools with proper type hints and docstrings
  3. Add tests in tests/test_modules/
  4. Update this README with tool count and description
  5. Add required environment variables to .env.example

Reporting Issues

  • Use GitHub Issues
  • Include Python version, OS, error logs
  • Provide steps to reproduce
  • Check existing issues first

πŸ“„ License

MIT License

Copyright (c) 2024 Visionary Tool Server Contributors

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


πŸ™ Credits & Acknowledgments

Core Technologies

  • FastMCP (2.14.2) - Modern Python MCP framework by Marvin AI
  • Starlette (0.50.0) - Lightning-fast ASGI framework
  • uvicorn (0.40.0) - ASGI server implementation
  • Pydantic - Data validation and settings management

Inspiration & Resources

Community

Special thanks to the MCP community and all contributors who helped build and test this server.

Hardware Partnership

Optimized for Razer systems:

  • Razer Blade 16 (2024)
  • NVIDIA RTX 4090 Laptop GPU
  • 96GB DDR5 RAM
  • Windows 11 Pro

πŸ“ž Support & Community


πŸ—ΊοΈ Roadmap

v4.1.0 (Q2 2024)

  • Add 50+ new tools (Notion AI, GitHub Copilot, etc.)
  • Implement tool chaining and workflows
  • Add request/response caching layer
  • Improve error handling and retries

v4.2.0 (Q3 2024)

  • GraphQL support alongside REST
  • WebSocket transport option
  • Built-in rate limiting per tool
  • Multi-tenancy support

v5.0.0 (Q4 2024)

  • 500+ tools target
  • GUI admin dashboard
  • Distributed deployment support
  • Enterprise SSO integration

Built with ❀️ for the AI automation community
Powered by FastMCP β€’ Optimized for Razer β€’ Open Source MIT

Quick Start β€’ Tools β€’ Clients β€’ Config β€’ Development
