PromptLab: AI Query Enhancement with MLflow Integration
PromptLab is an intelligent system that transforms basic user queries into optimized prompts for AI systems using MLflow Prompt Registry. It dynamically matches user requests to the most appropriate prompt template and applies it with extracted parameters.
🔍 Overview
PromptLab combines MLflow Prompt Registry with dynamic prompt matching to create a powerful, flexible system for prompt engineering:
- Centralized Prompt Management: Store, version, and manage prompts in MLflow
- Dynamic Matching: Intelligently match user queries to the best prompt template
- Version Control: Track prompt history with production and archive aliases
- Extensible: Easily add new prompt types without code changes
🏗️ Architecture
The system consists of three main components:
- Prompt Registry (`register_prompts.py`) - Tool for registering and managing prompts in MLflow
- Server (`promptlab_server.py`) - Server with dynamic prompt matching and LangGraph workflow
- Client (`promptlab_client.py`) - Lightweight client for processing user queries
Workflow Process
1. Prompt Registration: Register prompt templates in MLflow with versioning and aliasing
2. Prompt Loading: Server loads all available prompts from MLflow at startup
3. Query Submission: User submits a natural language query via the client
4. Intelligent Matching: LLM analyzes the query and selects the most appropriate prompt template
5. Parameter Extraction: System extracts required parameters from the query
6. Template Application: Selected template is applied with extracted parameters
7. Validation & Adjustment: Enhanced prompt is validated and adjusted if needed
8. Response Generation: Optimized prompt produces a high-quality response
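The steps above can be sketched as a plain-Python pipeline. The stub implementations below (keyword matching, regex extraction, and the in-memory prompt table) are illustrative stand-ins for the LLM-driven logic and the MLflow registry, not the actual server code:

```python
import re

# Hypothetical in-memory registry standing in for MLflow (illustrative only).
PROMPTS = {
    "essay_prompt": "Write a well-structured essay about {{ topic }}.",
}

def match_prompt(query):
    """Select a template name for the query (keyword stand-in for the LLM matcher)."""
    return "essay_prompt" if "essay" in query.lower() else None

def extract_parameters(query):
    """Pull template variables out of the query (trivial stand-in for LLM extraction)."""
    match = re.search(r"about (.+)", query, re.IGNORECASE)
    return {"topic": match.group(1)} if match else {}

def enhance_query(query):
    """Match, extract, and apply the template; fall back to the raw query."""
    name = match_prompt(query)
    if name is None:
        return query  # no match: use the original query unchanged
    template = PROMPTS[name]
    for var, value in extract_parameters(query).items():
        template = template.replace("{{ " + var + " }}", value)
    return template

print(enhance_query("Write an essay about climate change"))
```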
📂 Code Structure
Core Components:
`register_prompts.py`
- Purpose: Manages prompts in MLflow Registry
- Key Functions:
  - `register_prompt()`: Register a new prompt or version
  - `update_prompt()`: Update an existing prompt (archives previous production)
  - `list_prompts()`: List all registered prompts
  - `register_from_file()`: Register multiple prompts from JSON
  - `register_sample_prompts()`: Initialize with standard prompts
`promptlab_server.py`
- Purpose: Processes queries using LangGraph workflow
- Key Components:
  - `load_all_prompts()`: Loads prompts from MLflow
  - `match_prompt()`: Matches queries to appropriate templates
  - `enhance_query()`: Applies selected template
  - `validate_query()`: Validates enhanced queries
  - LangGraph workflow: Orchestrates the query enhancement process
`promptlab_client.py`
- Purpose: Provides user interface to the service
- Key Features:
- Process queries with enhanced prompts
- List available prompts
- Display detailed prompt matching information
🚀 Getting Started
Prerequisites
- Python 3.12
- Dependencies in `requirements.txt`
- OpenAI API key for LLM capabilities
Installation
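A typical setup looks like the following (the repository URL placeholder must be replaced with the actual location; the API key value is a placeholder):

```shell
git clone <repository-url>
cd promptlab

# Install dependencies listed under Prerequisites
pip install -r requirements.txt

# Provide the OpenAI API key required for LLM capabilities
export OPENAI_API_KEY="sk-..."
```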
Registering Prompts
Before using PromptLab, you need to register prompts in MLflow:
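For example, the standard sample set can be registered with `register_prompts.py` (the subcommand name is an assumption; `register_sample_prompts()` is the documented entry point it would call):

```shell
# Register the built-in sample prompts (flag name is illustrative)
python register_prompts.py --register-samples
```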
Running the Server
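Start the server so it loads all registered prompts from MLflow at startup:

```shell
python promptlab_server.py
```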
Using the Client
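A typical invocation passes the query as an argument (the exact argument style is an assumption):

```shell
# Enhance the query, show the matched template, and print the response
python promptlab_client.py "Write an essay about climate change"
```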
📋 Prompt Management
Available Prompt Types
PromptLab supports a wide range of prompt types:
| Prompt Type | Description | Example Use Case |
|---|---|---|
| essay_prompt | Academic writing | Research papers, analyses |
| email_prompt | Email composition | Professional communications |
| technical_prompt | Technical explanations | Concepts, technologies |
| creative_prompt | Creative writing | Stories, poems, fiction |
| code_prompt | Code generation | Functions, algorithms |
| summary_prompt | Content summarization | Articles, documents |
| analysis_prompt | Critical analysis | Data, texts, concepts |
| qa_prompt | Question answering | Context-based answers |
| social_media_prompt | Social media content | Platform-specific posts |
| blog_prompt | Blog article writing | Online articles |
| report_prompt | Formal reports | Business, technical reports |
| letter_prompt | Formal letters | Cover, recommendation letters |
| presentation_prompt | Presentation outlines | Slides, talks |
| review_prompt | Reviews | Products, media, services |
| comparison_prompt | Comparisons | Products, concepts, options |
| instruction_prompt | How-to guides | Step-by-step instructions |
| custom_prompt | Customizable template | Specialized use cases |
Registering New Prompts
You can register new prompts in several ways:
1. From Command Line
2. From a Template File
3. From a JSON File
Create a JSON file with multiple prompts:
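A minimal example of such a file (the field names here are assumptions; match them to whatever schema `register_from_file()` expects):

```json
[
  {
    "name": "essay_prompt",
    "template": "Write a well-structured essay about {{ topic }}.",
    "commit_message": "Initial version"
  },
  {
    "name": "email_prompt",
    "template": "Write a professional email to {{ recipient }} about {{ subject }}."
  }
]
```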
Then register them:
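For example (the flag name is illustrative; `register_from_file()` is the documented entry point):

```shell
python register_prompts.py --from-file prompts.json
```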
Updating Existing Prompts
When you update an existing prompt, the system automatically:
- Archives the previous production version
- Sets the new version as production
Viewing Prompt Details
🛠️ Advanced Usage
Template Variables
Templates use variables in `{{ variable }}` format.
When matching a query, the system automatically extracts values for these variables.
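A minimal sketch of the substitution step (regex-based; in PromptLab the values themselves are extracted by the LLM):

```python
import re

def fill_template(template, params):
    """Replace each {{ variable }} placeholder with its extracted value."""
    def replace(match):
        # Leave unknown placeholders intact rather than dropping them.
        return params.get(match.group(1), match.group(0))
    # Allow flexible whitespace inside the braces: {{topic}} or {{ topic }}
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", replace, template)

template = "Write a {{ tone }} email to {{ recipient }} about {{ subject }}."
print(fill_template(template, {"tone": "formal", "recipient": "the team", "subject": "the launch"}))
```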
Production and Archive Aliases
Each prompt can have different versions with aliases:
- production: The current active version (used by default)
- archived: Previous production versions
This allows for:
- Rolling back to previous versions if needed
- Tracking the history of prompt changes
Custom Prompt Registration
For specialized use cases, you can create highly customized prompts:
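One way to sketch such a prompt (the dict fields are hypothetical, not the actual schema expected by `register_prompts.py`):

```python
import re

# Hypothetical custom prompt definition; field names are illustrative.
custom_prompt = {
    "name": "custom_prompt",
    "template": (
        "You are a {{ persona }}. "
        "Respond to the following request in {{ style }} style, "
        "keeping the answer under {{ max_words }} words:\n\n{{ request }}"
    ),
    "commit_message": "Customized persona/style prompt",
}

# The template's variables can be recovered by scanning for {{ ... }} placeholders.
variables = re.findall(r"\{\{\s*(\w+)\s*\}\}", custom_prompt["template"])
print(variables)
```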
🔧 Troubleshooting
No Matching Prompt Found
If the system can't match a query to any prompt template, it will:
- Log a message that no match was found
- Use the original query without enhancement
- Still generate a response
You can add more diverse prompt templates to improve matching.
LLM Connection Issues
If the LLM service is unavailable, the system falls back to:
- Keyword-based matching for prompt selection
- Simple parameter extraction
- Basic prompt enhancement
This ensures the system remains functional even without LLM access.
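A keyword-based fallback matcher could look like this (the keyword table is illustrative, not the server's actual lists):

```python
# Hypothetical keyword table mapping prompt types to trigger words.
KEYWORDS = {
    "essay_prompt": ["essay", "paper", "analysis"],
    "email_prompt": ["email", "message", "reply"],
    "code_prompt": ["code", "function", "script"],
}

def keyword_match(query):
    """Pick the prompt type whose keywords appear most often; None if no hit."""
    words = query.lower().split()
    scores = {
        name: sum(kw in words for kw in kws)
        for name, kws in KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(keyword_match("Write an essay on renewable energy"))
```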