A web-based tool designed to help users construct high-quality, well-structured, and effective prompts for Large Language Models (LLMs) through a guided, step-by-step process with model-specific optimization.
The Master Prompt Creator transforms a user's basic idea into a detailed, optimized prompt using official best practices from OpenAI, Anthropic, and Google. It employs a wizard-like interface to ask clarifying questions based on established prompt engineering principles, then generates both a structured prompt and an AI-enhanced version optimized for the user's target LLM.
- Guided Prompt Construction: Multi-step questionnaire covering Role, Directive, Context, Constraints, Output Format, Tone, Examples, Creativity Level, and Error Handling
- Model-Specific Optimization: Tailored enhancement based on target LLM (GPT-4, Claude, Gemini)
- AI-Powered Example Generation: Gemini API generates relevant input-output pairs
- Automated Prompt Enhancement: Expert-level prompt optimization using model-specific best practices
- Quality Assessment: Real-time prompt scoring (0-100%) with improvement recommendations
- Official Resource Integration: Direct links to OpenAI, Anthropic, and Google prompting guides
- Model-Specific Techniques: Applies platform-specific optimization (XML tags for Claude, system messages for GPT-4, etc.)
- Quality Indicators: Visual scoring with green/yellow/red quality badges
- Side-by-Side Comparison: Raw vs. enhanced prompts with copy functionality
- Zero Install: A single HTML file with no build step or installed packages (Tailwind CSS is pulled from a CDN)
- Responsive Design: Modern UI with Tailwind CSS
1. Initial Task: User enters the core task they want the LLM to perform.
2. Guided Questionnaire: A step-by-step wizard covering all essential prompt components:
   - Directive (specific action/command)
   - Role/Persona (expert identity)
   - Context (background information)
   - Tone & Audience (communication style)
   - Output Format (structure requirements)
   - Constraints (limitations/rules)
   - Examples (few-shot demonstrations)
   - Creativity Level (factual vs. creative approach)
   - Error Handling (uncertainty management)
   - Target LLM (optimization preference)
3. AI-Powered Assistance:
   - Generate examples automatically using the Gemini API
   - Refine text with AI enhancement at any step
4. Quality Assessment: Real-time prompt scoring with specific recommendations
5. Model-Specific Enhancement:
   - GPT-4: System messages, few-shot prompting, chain-of-thought
   - Claude: XML structure, Constitutional AI principles
   - Gemini: System instructions, structured output, multimodal support
6. Final Output: Side-by-side comparison of raw and enhanced prompts with quality indicators
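The questionnaire answers map naturally onto a structured prompt. A minimal sketch of how that assembly might work (the function and field names here are illustrative, not the app's actual code):

```javascript
// Assemble a structured prompt from questionnaire answers.
// Empty or missing sections are skipped, so partial answers
// still yield a valid prompt.
function buildPrompt(answers) {
  const sections = [
    ['Role', answers.role],
    ['Directive', answers.directive],
    ['Context', answers.context],
    ['Tone & Audience', answers.tone],
    ['Output Format', answers.format],
    ['Constraints', answers.constraints],
    ['Examples', answers.examples],
    ['Error Handling', answers.errorHandling],
  ];
  return sections
    .filter(([, value]) => value && value.trim() !== '')
    .map(([label, value]) => `## ${label}\n${value}`)
    .join('\n\n');
}
```

Keeping this step as a pure function of the answers makes the raw prompt easy to regenerate as the user moves back and forth through the wizard.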
- HTML: For the application structure.
- Tailwind CSS: For all styling (included via CDN).
- Vanilla JavaScript: For all application logic, state management, and interactivity.
- Google Gemini API: For AI-powered example generation and prompt enhancement.
- A modern web browser (e.g., Chrome, Firefox, Safari, Edge).
- A Google Gemini API key to enable the AI-powered features.
- Clone or download the `index.html` file.
- Open `index.html` directly in your web browser. No web server is required.
This application is designed to be secure and does not come with a pre-loaded API key. To use the AI-powered features ("Generate Examples" and "AI-Enhanced Master Prompt"), you must provide your own Google Gemini API key.
- Obtain a key: If you don't have one, get a Gemini API key from Google AI Studio.
- Enter the key in the app: The first time you click a feature that requires the API, a pop-up window will appear asking for your key.
- Save the key: Paste your key into the input field and click "Save Key".
Your API key is stored in your browser's localStorage and is sent only to the Gemini API when you use the AI features; it is never stored on any other server. You only need to enter it once per browser.
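The ask-once-then-cache flow can be sketched as follows (names are illustrative, not the app's actual identifiers; the storage object is injectable for testing, and in the browser it would be `window.localStorage`):

```javascript
// Return the cached Gemini API key, or ask the user once and persist it.
const KEY_NAME = 'geminiApiKey'; // hypothetical storage key

function getApiKey(storage, promptFn) {
  let key = storage.getItem(KEY_NAME);
  if (!key) {
    // First use: ask for the key and remember it for next time.
    key = promptFn('Enter your Google Gemini API key:');
    if (key) storage.setItem(KEY_NAME, key);
  }
  return key;
}
```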
- Master Prompt Generation Rules: Comprehensive rules and structure for generating high-quality prompts
- Official Best Practices Summary: Key principles from OpenAI, Anthropic, and Google documentation
- Official Resources: Direct links to all official prompting guides and examples
- Comprehensive Guidelines: Detailed prompting techniques and strategies
- Steering Rules: Active rules used by the application for prompt generation
The application automatically applies best practices from:
- OpenAI GPT-4: https://platform.openai.com/docs/guides/prompt-engineering
- Anthropic Claude: https://docs.anthropic.com/claude/docs/prompt-engineering
- Google Gemini: https://ai.google.dev/gemini-api/docs/prompting-strategies
- 80%+ Quality Score: Production-ready prompts with comprehensive components
- Model-Specific Techniques: Automatic application of platform-optimized structures
- Safety Guidelines: Built-in bias prevention and content safety measures
- Official Compliance: All techniques sourced from official documentation
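The quality score can be thought of as a weighted checklist over prompt components. A hypothetical sketch of such scoring (the weights and field names are invented for illustration; the app's actual rubric may differ):

```javascript
// Score a prompt 0-100 by summing the weights of the components it
// includes, and collect a recommendation for each missing component.
const WEIGHTS = {
  directive: 25, role: 15, context: 15, format: 15,
  constraints: 10, examples: 10, tone: 5, errorHandling: 5,
};

function scorePrompt(answers) {
  let score = 0;
  const recommendations = [];
  for (const [component, weight] of Object.entries(WEIGHTS)) {
    if (answers[component] && answers[component].trim() !== '') {
      score += weight;
    } else {
      recommendations.push(`Add a ${component} section (+${weight} points).`);
    }
  }
  return { score, recommendations };
}
```

Because the weights sum to 100, a fully answered questionnaire scores 100 and each recommendation states exactly how much it would add.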
Set Target LLM to "GPT-4" to automatically apply:
- System message structure
- Few-shot prompting with examples
- Chain-of-thought reasoning
- Temperature control guidance
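For GPT-4 this roughly corresponds to building a Chat Completions-style `messages` array: the role/persona becomes a system message and each example becomes a few-shot user/assistant pair. A sketch under those assumptions (the helper name is invented):

```javascript
// Build a chat-style message list: system message first, then each
// input/output example as a user/assistant turn, then the real task.
function toChatMessages(systemText, examples, userTask) {
  const messages = [{ role: 'system', content: systemText }];
  for (const ex of examples) {
    messages.push({ role: 'user', content: ex.input });
    messages.push({ role: 'assistant', content: ex.output });
  }
  messages.push({ role: 'user', content: userTask });
  return messages;
}
```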
Set Target LLM to "Claude" to automatically apply:
- XML tag structure (`<thinking>`, `<context>`, `<output>`)
- Constitutional AI principles
- Direct, explicit instructions
- Long context utilization
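The XML structure Anthropic recommends amounts to wrapping each prompt component in a named tag. A minimal sketch (the tag names here are illustrative choices, not a fixed schema):

```javascript
// Wrap each prompt section in an XML tag, as Anthropic's docs recommend
// for Claude; sections with no content are simply left out.
function toClaudePrompt(sections) {
  return Object.entries(sections)
    .filter(([, text]) => text)
    .map(([tag, text]) => `<${tag}>\n${text}\n</${tag}>`)
    .join('\n');
}
```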
Set Target LLM to "Gemini" to automatically apply:
- System instruction format
- Structured output formatting
- Multimodal considerations
- Safety settings integration
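Gemini's REST API (at the time of writing, the v1beta `generateContent` endpoint) accepts a `systemInstruction` field separate from the conversation `contents`, which is one way the app could target this format. Treat the exact field names below as an assumption to verify against the current API reference:

```javascript
// Build a generateContent request body with a separate system instruction.
// Field names follow the public Gemini REST API as of this writing;
// check the current docs before relying on them.
function toGeminiRequest(systemText, userPrompt, temperature) {
  return {
    systemInstruction: { parts: [{ text: systemText }] },
    contents: [{ role: 'user', parts: [{ text: userPrompt }] }],
    generationConfig: { temperature },
  };
}
```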
This project is licensed under the MIT License - see the LICENSE file for details.