
# Prompt Auto-Optimizer MCP

by sloth-wq

AI-Powered Prompt Evolution - An MCP server that automatically optimizes your AI prompts using evolutionary algorithms.


## 🎯 Purpose

Automatically evolve and optimize AI prompts to improve performance, creativity, and reliability. Uses genetic algorithms to iteratively improve prompts based on real performance data.
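As a rough illustration of the idea, one generation of such a loop can be sketched as below. Everything here is hypothetical (the `Variant` shape, the keep-half selection strategy, the helper names); the server's actual algorithm is more sophisticated.

```typescript
interface Variant {
  prompt: string;
  score: number;
}

// One evolutionary step: keep the better half of the population ("survivors"),
// then refill with (possibly mutated) copies of those survivors.
function evolveOnce(
  population: Variant[],
  mutate: (p: string) => string,
  evaluate: (p: string) => number,
  mutationRate = 0.15,
): Variant[] {
  const survivors = [...population]
    .sort((a, b) => b.score - a.score)
    .slice(0, Math.ceil(population.length / 2));
  const children = survivors.map((v) => {
    const child = Math.random() < mutationRate ? mutate(v.prompt) : v.prompt;
    return { prompt: child, score: evaluate(child) };
  });
  return [...survivors, ...children];
}
```

Repeating this step for several generations, with scores drawn from real task performance, is the essence of the evolutionary approach.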

## 🛠️ Installation

```shell
# Clone and install
git clone https://github.com/your-org/prompt-auto-optimizer-mcp.git
cd prompt-auto-optimizer-mcp
npm install
npm run build

# Start the MCP server
npm run mcp:start
```

## ⚙️ Configuration

Add to your Claude Code settings (`.claude/settings.json`):

```json
{
  "mcp": {
    "servers": {
      "prompt-optimizer": {
        "command": "npx",
        "args": ["prompt-auto-optimizer-mcp"],
        "cwd": "./path/to/prompt-auto-optimizer-mcp"
      }
    }
  }
}
```

## 🔧 Available Tools

### Core Optimization Tools

#### gepa_start_evolution

Start optimizing a prompt using evolutionary algorithms.

```typescript
{
  taskDescription: string;     // What you want to optimize for
  seedPrompt?: string;         // Starting prompt (optional)
  config?: {
    populationSize?: number;   // How many variants to test (default: 20)
    generations?: number;      // How many iterations (default: 10)
    mutationRate?: number;     // How much to change prompts (default: 0.15)
  };
}
```
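For illustration, the arguments might be assembled and defaulted like this. The interfaces below just restate the schema above; `withDefaults` is a hypothetical helper showing how the documented defaults would apply, not part of the server.

```typescript
interface EvolutionConfig {
  populationSize?: number;
  generations?: number;
  mutationRate?: number;
}

interface StartEvolutionArgs {
  taskDescription: string;
  seedPrompt?: string;
  config?: EvolutionConfig;
}

// Merge user-supplied config with the documented defaults.
function withDefaults(args: StartEvolutionArgs): Required<EvolutionConfig> {
  return {
    populationSize: args.config?.populationSize ?? 20,
    generations: args.config?.generations ?? 10,
    mutationRate: args.config?.mutationRate ?? 0.15,
  };
}

// Only `generations` is overridden; the other two fall back to defaults.
const resolved = withDefaults({
  taskDescription: "Summarize support tickets in two sentences",
  config: { generations: 5 },
});
```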

#### gepa_evaluate_prompt

Test how well a prompt performs on specific tasks.

```typescript
{
  promptId: string;        // Which prompt to test
  taskIds: string[];       // What tasks to test it on
  rolloutCount?: number;   // How many times to test (default: 5)
}
```

#### gepa_reflect

Analyze why prompts fail and get improvement suggestions.

```typescript
{
  trajectoryIds: string[];              // Which test runs to analyze
  targetPromptId: string;               // Which prompt needs improvement
  analysisDepth?: 'shallow' | 'deep';   // How detailed (default: 'deep')
}
```

#### gepa_get_pareto_frontier

Get the best prompt candidates that balance multiple goals.

```typescript
{
  minPerformance?: number;   // Minimum quality threshold
  limit?: number;            // Max results to return (default: 10)
}
```
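A Pareto frontier is the set of non-dominated candidates: those that no other candidate beats on every goal at once. A minimal sketch of that computation over two assumed axes, performance and diversity (the `Candidate` shape is hypothetical, not the server's actual data model):

```typescript
interface Candidate {
  promptId: string;
  performance: number;
  diversity: number;
}

// A candidate is dominated if some other candidate is at least as good on
// both axes and strictly better on at least one.
function paretoFrontier(
  candidates: Candidate[],
  minPerformance = 0,
  limit = 10,
): Candidate[] {
  const eligible = candidates.filter((c) => c.performance >= minPerformance);
  const frontier = eligible.filter(
    (c) =>
      !eligible.some(
        (o) =>
          o !== c &&
          o.performance >= c.performance &&
          o.diversity >= c.diversity &&
          (o.performance > c.performance || o.diversity > c.diversity),
      ),
  );
  return frontier.sort((a, b) => b.performance - a.performance).slice(0, limit);
}
```

A high-performance/low-diversity prompt and a low-performance/high-diversity prompt can both sit on the frontier; a prompt that is worse on both axes than some other prompt never does.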

#### gepa_select_optimal

Choose the best prompt for your specific use case.

```typescript
{
  taskContext?: string;         // Describe your use case
  performanceWeight?: number;   // How much to prioritize accuracy (default: 0.7)
  diversityWeight?: number;     // How much to prioritize creativity (default: 0.3)
}
```
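The two weights suggest a weighted sum over the same two axes. A sketch of that reading, assuming each candidate exposes a performance and a diversity score (the `Candidate` shape and `selectOptimal` helper are hypothetical):

```typescript
interface Candidate {
  promptId: string;
  performance: number;
  diversity: number;
}

// Pick the candidate maximizing a weighted combination of the two axes.
function selectOptimal(
  candidates: Candidate[],
  performanceWeight = 0.7,
  diversityWeight = 0.3,
): Candidate {
  const score = (c: Candidate) =>
    performanceWeight * c.performance + diversityWeight * c.diversity;
  return candidates.reduce((best, c) => (score(c) > score(best) ? c : best));
}
```

Shifting weight toward diversity changes the winner: with the defaults an accurate-but-plain prompt may win, while at `diversityWeight: 0.8` a more creative candidate can overtake it.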

#### gepa_record_trajectory

Log the results of prompt executions for analysis.

```typescript
{
  promptId: string;                  // Which prompt was used
  taskId: string;                    // What task was performed
  executionSteps: ExecutionStep[];   // What happened during execution
  result: {
    success: boolean;                // Did it work?
    score: number;                   // How well did it work?
  };
}
```

### Backup & Recovery Tools

- `gepa_create_backup` - Save current optimization state
- `gepa_restore_backup` - Restore from a previous backup
- `gepa_list_backups` - Show available backups
- `gepa_recovery_status` - Check system health
- `gepa_integrity_check` - Verify data integrity

## 📝 Basic Usage

1. **Start Evolution**: Use `gepa_start_evolution` with your task description
2. **Record Results**: Use `gepa_record_trajectory` to log how prompts perform
3. **Analyze Failures**: Use `gepa_reflect` to understand what went wrong
4. **Get Best Prompts**: Use `gepa_select_optimal` to find the best candidates
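Strung together, the four steps might look like the sketch below. Here `callTool` is a local stub standing in for a real MCP client call, and the IDs (`prompt-1`, `task-1`, `trajectory-1`) are placeholders; the actual client API and return shapes will differ.

```typescript
type ToolArgs = Record<string, unknown>;

// Stub: a real implementation would forward this to the MCP server
// (and would be asynchronous).
function callTool(name: string, args: ToolArgs): ToolArgs {
  return { tool: name, ...args };
}

function optimize(taskDescription: string): ToolArgs {
  // 1. Start Evolution
  callTool("gepa_start_evolution", { taskDescription });
  // 2. Record Results
  callTool("gepa_record_trajectory", {
    promptId: "prompt-1",
    taskId: "task-1",
    executionSteps: [],
    result: { success: true, score: 0.8 },
  });
  // 3. Analyze Failures
  callTool("gepa_reflect", {
    trajectoryIds: ["trajectory-1"],
    targetPromptId: "prompt-1",
  });
  // 4. Get Best Prompts
  return callTool("gepa_select_optimal", { taskContext: taskDescription });
}
```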

## 🔧 Environment Variables

```shell
# Optional performance tuning
GEPA_MAX_CONCURRENT_PROCESSES=3   # Parallel execution limit
GEPA_DEFAULT_POPULATION_SIZE=20   # Default prompt variants
GEPA_DEFAULT_GENERATIONS=10       # Default iterations
```
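A Node.js server might resolve these variables roughly as follows. This is a sketch: the `envInt` helper is hypothetical, and only the variable names and default values come from the list above.

```typescript
// Read an integer environment variable, falling back to a default
// when the variable is unset or not a valid number.
function envInt(name: string, fallback: number): number {
  const raw = process.env[name];
  const parsed = raw === undefined ? NaN : Number.parseInt(raw, 10);
  return Number.isNaN(parsed) ? fallback : parsed;
}

const settings = {
  maxConcurrentProcesses: envInt("GEPA_MAX_CONCURRENT_PROCESSES", 3),
  defaultPopulationSize: envInt("GEPA_DEFAULT_POPULATION_SIZE", 20),
  defaultGenerations: envInt("GEPA_DEFAULT_GENERATIONS", 10),
};
```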

Built for better AI prompts • 📚 Docs • 🐛 Issues
