# Prompt Processor System
## Overview
The **Prompt Processor** automatically enhances user messages with comprehensive context before they are sent to an AI assistant. It is designed to give you complete control over conversation memory and context injection.
## Features
### **1. Conversation Summary**
- Analyzes current conversation state
- Tracks recent topics and interactions
- Provides context-aware summaries
### **2. Action History**
- Records detailed steps taken
- Tracks user requests and agent responses
- Maintains conversation flow history
### **3. Tech Stack Definition**
- Documents your current technology stack
- Includes Python, SQLite, MCP, SQLAlchemy
- Tracks dependencies and capabilities
### **4. Project Plans & Objectives**
- Maintains list of current goals
- Tracks completion status
- Shows progress on key initiatives
### **5. User Preferences**
- Stores your development preferences
- Remembers tool choices (SQLite over PostgreSQL)
- Tracks coding style preferences
### **6. Agent Metadata**
- Friendly name: **Johny**
- Agent ID and capabilities
- Current status and version
## Usage
### **Basic Usage**
```bash
# Use the prompt processor directly
process_prompt_with_context --user_message "How do I configure the database?"
```
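The tool above is exposed through the MCP server. As a rough sketch of what that registration might look like with FastMCP (the server library named in the tech stack), see below; apart from FastMCP's own API, the names and the trimmed-down body are assumptions based on this README, not the project's actual code:

```python
# Sketch only: registering a prompt-processing tool on a FastMCP server.
# The real implementation lives in local_mcp_server_simple.py and builds
# the full context block described in this README.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mcp-project-local")

@mcp.tool()
def process_prompt_with_context(user_message: str) -> str:
    """Return the user message wrapped with conversation context."""
    # A real version would call the context helpers shown in the
    # Configuration section; this stub only illustrates the tool shape.
    return (
        "=== ENHANCED PROMPT GENERATED BY JOHNY ===\n"
        f"USER MESSAGE: {user_message}\n"
        "=== CONTEXT INJECTION ===\n..."
    )

if __name__ == "__main__":
    mcp.run()
```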
### **Integration with AI Assistants**
The prompt processor generates enhanced prompts like this:
```
=== ENHANCED PROMPT GENERATED BY JOHNY ===
USER MESSAGE: How do I configure the database?
=== CONTEXT INJECTION ===
CONVERSATION SUMMARY:
Current conversation state: 9 total interactions. Recent topics: Testing conversation tracking verification, conversation memory, tool documentation
ACTION HISTORY:
Recent actions: Conversation turn: Testing conversation tracking verification... | User request: How do we get our conversation into memory...
TECH STACK:
Python 3.x, SQLite database, MCP (Model Context Protocol), FastMCP server, SQLAlchemy ORM...
PROJECT PLANS & OBJECTIVES:
1. Build powerful conversation tracking system ✅
2. Implement context-aware prompt processing ✅
3. Create intelligent memory management system ✅
...
USER PREFERENCES:
- Use local SQLite over PostgreSQL for development
- Prefer simple yet powerful solutions
- Focus on conversation context and memory
...
AGENT METADATA:
- Friendly Name: Johny
- Agent ID: mcp-project-local
- Type: Context-Aware Conversation Manager
...
=== INSTRUCTIONS ===
Please respond to the user's message above, taking into account:
1. The current conversation context and recent interactions
2. The specific actions and steps taken so far
3. The technical stack and capabilities available
4. The project goals and objectives
5. The user's stated preferences and requirements
6. The agent's capabilities and current state
Provide a comprehensive, context-aware response that builds upon our conversation history.
=== END ENHANCED PROMPT ===
```
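To make the structure above concrete, here is a minimal, self-contained sketch of how such a prompt could be assembled. The four customization helpers mirror the ones documented in the Configuration section below; `_get_conversation_summary()` and `_get_action_history()` are hypothetical stand-ins for whatever reads the SQLite conversation store, and all stub return values are illustrative only:

```python
# Minimal sketch (not the project's actual implementation) of how the
# enhanced prompt could be assembled from its six context components.

def _get_tech_stack_definition() -> str:
    return "Python 3.x, SQLite database, MCP, FastMCP server, SQLAlchemy ORM"

def _get_project_plans() -> str:
    return "1. Build powerful conversation tracking system (done)"

def _get_user_preferences() -> str:
    return "- Use local SQLite over PostgreSQL for development"

def _get_agent_metadata() -> str:
    return "- Friendly Name: Johny\n- Agent ID: mcp-project-local"

def _get_conversation_summary() -> str:
    # Hypothetical: the real helper would query the conversation database.
    return "Current conversation state: ..."

def _get_action_history() -> str:
    # Hypothetical: the real helper would query the conversation database.
    return "Recent actions: ..."

def process_prompt_with_context(user_message: str) -> str:
    """Wrap the raw user message with all six context components."""
    return "\n\n".join([
        "=== ENHANCED PROMPT GENERATED BY JOHNY ===",
        f"USER MESSAGE: {user_message}",
        "=== CONTEXT INJECTION ===",
        "CONVERSATION SUMMARY:\n" + _get_conversation_summary(),
        "ACTION HISTORY:\n" + _get_action_history(),
        "TECH STACK:\n" + _get_tech_stack_definition(),
        "PROJECT PLANS & OBJECTIVES:\n" + _get_project_plans(),
        "USER PREFERENCES:\n" + _get_user_preferences(),
        "AGENT METADATA:\n" + _get_agent_metadata(),
        "=== INSTRUCTIONS ===",
        "Please respond to the user's message above, taking the context into account.",
        "=== END ENHANCED PROMPT ===",
    ])

if __name__ == "__main__":
    print(process_prompt_with_context("How do I configure the database?"))
```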
## Configuration
### **Customizing Tech Stack**
Edit the `_get_tech_stack_definition()` function in `local_mcp_server_simple.py`:
```python
def _get_tech_stack_definition() -> str:
return """Tech Stack: Your custom tech stack here"""
```
### **Updating Project Plans**
Modify the `_get_project_plans()` function:
```python
def _get_project_plans() -> str:
return """Project Plans & Objectives:
1. Your custom plan here
2. Another objective
..."""
```
### **Setting User Preferences**
Update the `_get_user_preferences()` function:
```python
def _get_user_preferences() -> str:
return """User Preferences:
- Your preference here
- Another preference
..."""
```
### **Modifying Agent Metadata**
Change the `_get_agent_metadata()` function:
```python
def _get_agent_metadata() -> str:
return """Agent Metadata:
- Friendly Name: Your Agent Name
- Agent ID: your-agent-id
- Type: Your Agent Type
..."""
```
## Integration Examples
### **1. Direct MCP Tool Usage**
```python
# In your MCP server
enhanced_prompt = process_prompt_with_context("Your message here")
# Send enhanced_prompt to AI assistant
```
### **2. Chat Interface Integration**
```python
def chat_with_context(user_input: str):
    # Step 1: Process with context
    enhanced_prompt = process_prompt_with_context(user_input)
    # Step 2: Send to AI assistant (send_to_ai_assistant is a placeholder
    # for your own client call)
    ai_response = send_to_ai_assistant(enhanced_prompt)
    # Step 3: Log everything (log_conversation is a placeholder for your
    # own logging hook)
    log_conversation(user_input, enhanced_prompt, ai_response)
    return ai_response
```
### **3. Automated Context Injection**
```python
# Every user message automatically gets enhanced
user_message = "How do I deploy this?"
enhanced_message = process_prompt_with_context(user_message)
# enhanced_message now contains full context
```
## Benefits
- ✅ **Context-Aware Responses** - AI assistants understand your full conversation history
- ✅ **Consistent Memory** - All interactions are tracked and referenced
- ✅ **Personalized Experience** - Your preferences and plans are always considered
- ✅ **Efficient Communication** - No need to repeat context in every message
- ✅ **Project Continuity** - Long-running projects maintain context across sessions
## Monitoring
### **Check Current Context**
```bash
# See what context is being generated
extract_conversation_data --limit 5
```
### **View Conversation Analytics**
```bash
# Get insights about context usage
get_conversation_analytics
```
### **Test the System**
```bash
# Test prompt processing
test_conversation_tracking --message "Test message"
```
## Next Steps
1. **Test the prompt processor** with your current conversations
2. **Customize the context components** to match your needs
3. **Integrate with your preferred AI assistant** interface
4. **Monitor and refine** the context generation
## Troubleshooting
### **Common Issues**
- **Import errors**: Make sure you're running from the project directory
- **Database connection**: Ensure `init_db.py` has been run
- **MCP server**: Start the server with `python3 local_mcp_server_simple.py`
### **Getting Help**
- Check the conversation logs: `extract_conversation_data`
- View system status: `get_conversation_summary`
- Test the system: `test_conversation_tracking`
---
**Built with ❤️ by Johny - Your Context-Aware Conversation Manager**