# MCP Prompt Cleaner

A Model Context Protocol (MCP) server that uses AI to enhance and clean raw prompts, making them clearer, more actionable, and more effective. It supports OpenAI's API (models like GPT-4) as well as local LLMs.

This project was created as an enhancement to (and Python port of) a Prompt Cleaner originally written in TypeScript. It doubles as my "Rosetta Stone" project: a codebase I can follow closely to build a deeper understanding of Python. And, of course, it was coded with the help of Cursor.
## Features

- **AI-Powered Enhancement**: Uses large language models to improve prompt clarity and specificity
- **Concise System Prompt**: Uses a structured, efficient prompt format for consistent results
- **Context-Aware Processing**: Accepts additional context to guide the enhancement process
- **Mode-Specific Optimization**: Supports both "general" and "code" modes for different use cases
- **Quality Assessment**: Provides quality scores and detailed feedback on enhanced prompts
- **Two-Level Retry Strategy**: HTTP-level retries for network issues, content-level retries for AI output quality
- **Exponential Backoff**: Robust error handling with jitter to prevent thundering herd (a sketch of the idea follows this list)
- **MCP Integration**: Full MCP protocol compliance with stdio transport
- **Production Ready**: Comprehensive test coverage, clean code, and robust error handling
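To make the retry strategy concrete, here is a minimal sketch of exponential backoff with jitter. It illustrates the technique only; the server's actual retry logic lives in its LLM client, and names like `with_retries` are hypothetical:

```python
import asyncio
import random

async def with_retries(fn, max_attempts=3, base_delay=0.5):
    """Retry an async callable with exponential backoff plus jitter.

    Illustrative sketch only - parameter names and limits in the real
    client may differ.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return await fn()
        except Exception:
            if attempt == max_attempts:
                raise
            # Exponential backoff: 0.5s, 1s, 2s, ... plus random jitter
            # so many clients don't retry in lockstep (thundering herd).
            delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 0.25)
            await asyncio.sleep(delay)
```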
## Installation

### Using uv (recommended)
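A typical uv workflow for a `pyproject.toml`-based project looks like this (the repository URL is illustrative):

```bash
# Clone the repository (URL is illustrative)
git clone https://github.com/your-user/mcp-prompt-cleaner.git
cd mcp-prompt-cleaner

# Install dependencies declared in pyproject.toml
uv sync
```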
### Using pip
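Alternatively, a pip-based install might look like this sketch:

```bash
# Create and activate a virtual environment, then install the project
python -m venv .venv
source .venv/bin/activate
pip install -e .
```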
> **Note**: This project uses `pyproject.toml` for dependency management.
## Configuration

### Local LLM (LMStudio) - Default Setup

The server is configured by default to work with local LLMs like LMStudio. No API key is required:
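As a sketch, the defaults might be expressed as environment variables like these (names such as `LLM_BASE_URL` are assumptions, not taken from the code; check the project's pydantic-settings model for the actual names):

```env
# Hypothetical defaults - verify against the project's settings model
LLM_BASE_URL=http://localhost:1234/v1
LLM_MODEL=local-model
```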
### Cloud LLM (OpenAI, Anthropic, etc.)

For cloud-based LLMs, create a `.env` file in the project root:
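A sketch of what that `.env` might contain (key names are illustrative; the actual names come from the project's settings model):

```env
# Illustrative keys - match these to the project's settings model
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=sk-...
LLM_MODEL=gpt-4
```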
> **Note**: `.env` file support is provided by pydantic-settings - no additional dependencies required.
### LMStudio Setup

1. Download and install LMStudio
2. Start LMStudio and load a model
3. Start the local server (usually on http://localhost:1234)
4. The MCP server will automatically connect to your local LLM
## Running the Server

To run the MCP server:
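A typical invocation with uv, assuming `main.py` is the entry point (as referenced under MCP Client Configuration below):

```bash
uv run python main.py
```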
## Tool Usage

The server provides a `clean_prompt` tool that accepts:

- `raw_prompt` (required): The user's raw, unpolished prompt
- `context` (optional): Additional context about the task
- `mode` (optional): Processing mode - "general" or "code" (default: "general")
- `temperature` (optional): AI sampling temperature 0.0-1.0 (default: 0.2)
### Example Tool Call

The tool is called directly with parameters:
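A minimal sketch of a direct call, using the parameters listed above (the import path is illustrative):

```python
from main import clean_prompt  # illustrative import path

result = clean_prompt(
    raw_prompt="write a function that sorts a list",
    context="Python utility library",
    mode="code",
    temperature=0.2,
)
```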
Or via the MCP protocol:
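Over MCP, the same call is a standard `tools/call` JSON-RPC request, with the arguments mirroring the parameters above:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "clean_prompt",
    "arguments": {
      "raw_prompt": "write a function that sorts a list",
      "mode": "code"
    }
  }
}
```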
### Example Response
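Based on the Quality Assessment feature described above, a response could be shaped roughly like this (field names are illustrative, not confirmed against the code):

```json
{
  "cleaned_prompt": "Write a Python function that sorts a list of integers in ascending order and returns a new list.",
  "quality_score": 8,
  "notes": ["Added explicit input/output expectations", "Clarified sorting order"]
}
```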
## MCP Client Configuration

### Claude Desktop

#### For Local LLM (LMStudio) - No API Key Required
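A sketch of the `claude_desktop_config.json` entry (the server name and filesystem path are illustrative):

```json
{
  "mcpServers": {
    "prompt-cleaner": {
      "command": "uv",
      "args": ["run", "python", "/path/to/mcp-prompt-cleaner/main.py"]
    }
  }
}
```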
#### For Cloud LLM (OpenAI, etc.) - API Key Required
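The same entry with credentials passed through the environment (the key name is again an assumption):

```json
{
  "mcpServers": {
    "prompt-cleaner": {
      "command": "uv",
      "args": ["run", "python", "/path/to/mcp-prompt-cleaner/main.py"],
      "env": {
        "LLM_API_KEY": "sk-..."
      }
    }
  }
}
```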
### Other MCP Clients

The server uses stdio transport and can be configured with any MCP-compatible client by pointing it at the `main.py` file.
## Development

### Running Tests
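With uv and pytest (listed under Requirements below), the test run is typically:

```bash
uv run pytest
```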
### Test Coverage

The project includes comprehensive tests for:

- JSON extraction from mixed content
- LLM client with retry logic
- Prompt cleaning functionality
- MCP protocol integration
## Project Structure
## Requirements

- Python 3.11+
- MCP Python SDK
- httpx for HTTP client
- pydantic for data validation
- pytest for testing
## License

MIT