# Ollama + PostgreSQL MCP Server Integration
This example demonstrates how to use **Ollama** (local LLM) with the **PostgreSQL MCP Server** to perform natural language database queries. Users can ask questions in plain English, and Ollama will automatically generate and execute the appropriate database queries.
## 🚀 Features
- **Natural Language Queries** - Ask questions in plain English
- **Intelligent Tool Selection** - Ollama automatically chooses the right database tools
- **Query Validation** - Automatic security and performance analysis
- **Interactive Chat** - Conversational interface with memory
- **Multiple Models** - Support for different Ollama models
- **Error Handling** - Graceful handling of connection and query errors
- **Real-time Results** - Immediate database responses formatted for humans
## 📋 Prerequisites
### 1. Install Ollama
```bash
# macOS
brew install ollama
# Linux
curl -fsSL https://ollama.ai/install.sh | sh
# Windows - Download from https://ollama.ai/
```
### 2. Pull an LLM Model
```bash
# Recommended models
ollama pull llama3.1 # Default, good balance of speed and capability
ollama pull llama3.1:70b # Larger, more capable but slower
ollama pull mistral # Alternative, faster but smaller
ollama pull codellama # Specialized for code understanding
```
### 3. Install Python Dependencies
```bash
pip install ollama mcp
```
### 4. Configure PostgreSQL MCP Server
Make sure your PostgreSQL MCP Server is configured and working:
```bash
# Test the MCP server
python main.py --test
```
## 🛠️ Setup
1. **Clone and Setup the MCP Server** (if not already done):
```bash
git clone <your-repo-url>
cd postgresql-mcp-server
pip install -r requirements.txt
cp .env.example .env
# Edit .env with your database credentials
```
2. **Verify Ollama is Running**:
```bash
ollama list # Should show your installed models
```
3. **Test the Integration**:
```bash
python examples/ollama-integration.py --mode examples
```
## 🎯 Usage
### Interactive Mode (Recommended)
```bash
python examples/ollama-integration.py
```
This starts an interactive chat session where you can ask natural language questions:
```
🚀 PostgreSQL MCP + Ollama Integration
==================================================
✅ Ready! You can now ask questions about your database.
💡 Try questions like:
- 'Show me the recent orders'
- 'What are the top selling products?'
- 'Validate this query: SELECT * FROM users'
- 'What tables are available?'
Type 'quit' to exit.
👤 You: What tables are in the database?
🤖 I'll check what tables are available in your database.
🔧 Using tool: list_tables
🤖 Assistant: Your database contains the following tables:
- **users** (BASE TABLE) - User account information
- **categories** (BASE TABLE) - Product categories
- **products** (BASE TABLE) - Product catalog
- **addresses** (BASE TABLE) - Customer addresses
- **orders** (BASE TABLE) - Order records
- **order_items** (BASE TABLE) - Individual items in orders
- **cart_items** (BASE TABLE) - Shopping cart contents
- **reviews** (BASE TABLE) - Product reviews
- **coupons** (BASE TABLE) - Discount coupons
- **order_coupons** (BASE TABLE) - Applied coupons
This looks like an e-commerce database! You can ask me about sales data, customer information, product performance, or any other aspects of this data.
```
### Example Queries Mode
```bash
python examples/ollama-integration.py --mode examples
```
This runs predefined examples to demonstrate capabilities.
### Using Different Models
```bash
python examples/ollama-integration.py --model mistral
python examples/ollama-integration.py --model codellama
```
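The `--model` and `--mode` flags used above can be wired together with `argparse`. A minimal sketch of such an entry point (flag names are taken from the commands above; defaults and help text are assumptions):

```python
import argparse

def parse_args(argv=None):
    # CLI flags matching the invocations shown above (defaults are assumptions)
    parser = argparse.ArgumentParser(description="Ollama + PostgreSQL MCP bridge")
    parser.add_argument("--model", default="llama3.1",
                        help="Ollama model to use (e.g. mistral, codellama)")
    parser.add_argument("--mode", choices=["interactive", "examples"],
                        default="interactive", help="Run mode")
    return parser.parse_args(argv)

args = parse_args(["--model", "mistral"])
print(args.model, args.mode)  # mistral interactive
```

Passing `argv=None` makes `argparse` read `sys.argv`, so the same function works both in tests and from the command line.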
## 💬 Example Conversations
### Sales Analysis
```
👤 You: Show me the sales performance for last month
🤖 I'll analyze your sales performance for the last month.
🔧 Using tool: query
🤖 Assistant: Here's your sales performance for the last month:
📊 **Sales Summary (Last 30 Days)**
- **Total Orders:** 152
- **Total Revenue:** $45,280.50
- **Average Order Value:** $297.90
- **Unique Customers:** 89
📈 **Key Insights:**
- Strong performance with nearly $45K in revenue
- Healthy average order value of ~$298
- Good customer diversity with 89 unique buyers
- Steady pace of about 5 orders per day
Would you like me to break this down by day, product category, or customer segments?
```
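Behind a summary like the one above, the model typically issues a single aggregate query. Here is a self-contained sketch of that aggregation using `sqlite3` and toy data (the real server speaks PostgreSQL; the table and column names are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, user_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10, 120.0), (2, 10, 80.0), (3, 11, 200.0)])

# One aggregate query yields the whole summary: orders, revenue, AOV, customers
row = conn.execute("""
    SELECT COUNT(*)                AS total_orders,
           SUM(total)              AS total_revenue,
           AVG(total)              AS avg_order_value,
           COUNT(DISTINCT user_id) AS unique_customers
    FROM orders
""").fetchone()
print(row)  # (3, 400.0, ~133.33, 2)
```

The same shape of query, with a `WHERE created_at >= ...` filter, produces the "last 30 days" variant.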
### Query Validation
````
👤 You: Is this query safe: SELECT * FROM users WHERE id = 1 OR 1=1
🤖 Let me validate that query for security and performance issues.
🔧 Using tool: validate_query
🤖 Assistant: ⚠️ **Security Alert!** This query has serious issues:
🔒 **Security Problems:**
- **SQL Injection Risk:** The `OR 1=1` condition is a classic SQL injection pattern
- **Data Exposure:** This would return ALL users in the database, not just user ID 1
⚡ **Performance Issues:**
- **SELECT *:** Retrieves all columns, which can be inefficient
- **No LIMIT:** Could return huge amounts of data
✅ **Safer Alternative:**
```sql
SELECT id, email, first_name, last_name
FROM users
WHERE id = $1
LIMIT 1
```
**Key Improvements:**
- Use parameterized queries ($1) instead of direct values
- Select only needed columns
- Add LIMIT to prevent large result sets
- Remove the dangerous OR condition
Never use `OR 1=1` in production queries - it's a security vulnerability!
````
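Checks like the ones in this transcript can be approximated with simple pattern matching. A minimal sketch of such a validator (heuristics only, not the server's actual `validate_query` implementation):

```python
import re

def validate_query(sql: str) -> list[str]:
    """Flag common security and performance smells in a SELECT statement."""
    warnings = []
    if re.search(r"\bor\s+1\s*=\s*1\b", sql, re.IGNORECASE):
        warnings.append("SQL injection pattern: OR 1=1")
    if re.search(r"select\s+\*", sql, re.IGNORECASE):
        warnings.append("SELECT * retrieves all columns")
    if not re.search(r"\blimit\b", sql, re.IGNORECASE):
        warnings.append("No LIMIT clause")
    return warnings

print(validate_query("SELECT * FROM users WHERE id = 1 OR 1=1"))
# flags all three issues from the transcript above
```

A real validator would parse the SQL rather than pattern-match it, but the heuristic form shows the shape of the analysis.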
### Product Analysis
```
👤 You: What are the top 5 selling products?
🤖 I'll find your top-selling products by revenue.
🔧 Using tool: query
🤖 Assistant: 🏆 **Top 5 Selling Products (Last 30 Days)**
1. **MacBook Pro 16"**
- Revenue: $19,992.00
- Units Sold: 8
- Average Price: $2,499.00
2. **iPhone 15 Pro**
- Revenue: $14,985.00
- Units Sold: 15
- Average Price: $999.00
3. **The Great Gatsby**
- Revenue: $584.55
- Units Sold: 45
- Average Price: $12.99
💡 **Insights:**
- **High-value electronics dominate** - MacBook and iPhone account for 77% of revenue
- **Strong book sales volume** - 45 units of Gatsby shows good catalog diversity
- **Premium pricing strategy working** - High-ticket items driving revenue
Would you like me to analyze profit margins, inventory levels, or customer segments for these products?
```
## 🛡️ Security Features
The integration maintains all security features of the PostgreSQL MCP Server:
- ✅ **Read-only queries only** - No data modification possible
- ✅ **SQL injection prevention** - Automatic detection and blocking
- ✅ **Query validation** - Performance and security analysis
- ✅ **Row limits** - Prevents large result sets
- ✅ **Schema restrictions** - Configurable access control
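Read-only enforcement of this kind is often implemented as a pre-flight check on the SQL text before it ever reaches the database. A minimal sketch of such a guard (a heuristic illustration, not the server's actual code):

```python
import re

WRITE_VERBS = {"insert", "update", "delete", "create", "alter", "drop",
               "truncate", "grant", "revoke"}

def is_read_only(sql: str) -> bool:
    """Allow only single SELECT/WITH statements containing no write verbs."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement batches outright
        return False
    first = stripped.split(None, 1)[0].lower() if stripped else ""
    if first not in ("select", "with"):
        return False
    tokens = set(re.findall(r"[a-z_]+", stripped.lower()))
    return not (tokens & WRITE_VERBS)
```

Token-level checks like this are deliberately conservative: a query that merely mentions a column named `delete` would be rejected, which is the safe failure mode for a read-only bridge.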
## ⚙️ Configuration Options
### Model Selection
Choose different Ollama models based on your needs:
| Model | Size | Speed | Capability | Best For |
|-------|------|--------|------------|----------|
| `llama3.1` | ~4GB | Fast | Good | General queries, recommended |
| `llama3.1:70b` | ~40GB | Slow | Excellent | Complex analysis, production |
| `mistral` | ~4GB | Very Fast | Good | Quick responses |
| `codellama` | ~4GB | Fast | Code-focused | SQL query generation |
### Advanced Configuration
You can customize the bridge behavior:
```python
# Custom MCP server command
bridge = OllamaMCPBridge(
    ollama_model="llama3.1:70b",
    mcp_server_command=["python", "main.py", "--config", "prod.json"],
)
```
## 🔧 Troubleshooting
### Common Issues
**1. "Ollama not available"**
```bash
# Check if Ollama is running
ollama list
# Start Ollama if needed
ollama serve
```
**2. "Model not found"**
```bash
# Install the model
ollama pull llama3.1
# Check available models
ollama list
```
**3. "MCP server connection failed"**
```bash
# Test MCP server separately
python main.py --test
# Check database configuration
python -c "from src.config import load_config; print(load_config())"
```
**4. "Import errors"**
```bash
# Install missing dependencies
pip install ollama mcp
# Or install all project dependencies
pip install -r requirements.txt
```
### Performance Tips
1. **Use smaller models** for faster responses (`mistral` vs `llama3.1:70b`)
2. **Keep conversations short** - history is limited to last 10 exchanges
3. **Ask specific questions** - more focused queries get better results
4. **Use query validation** before running complex queries
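The ten-exchange limit in tip 2 amounts to trimming the message list before each model call. A minimal sketch of that trimming (assuming a system message first, then alternating user/assistant messages; the exact bookkeeping in the bridge may differ):

```python
MAX_EXCHANGES = 10  # one exchange = one user + one assistant message

def trim_history(messages):
    """Keep the system prompt plus the last 10 exchanges."""
    system, rest = messages[:1], messages[1:]
    return system + rest[-2 * MAX_EXCHANGES:]

history = [{"role": "system", "content": "You query PostgreSQL via MCP."}]
for i in range(15):
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})

trimmed = trim_history(history)
print(len(trimmed))  # 21: system prompt + 10 user/assistant pairs
```

Keeping the system prompt out of the sliding window matters: dropping it would make the model forget its database role as the conversation grows.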
## 📊 Supported Query Types
### ✅ Well-Supported
- **Table exploration** - "What tables exist?", "Describe the users table"
- **Data analysis** - "Show sales trends", "Top customers"
- **Query validation** - "Check this SQL for issues"
- **Simple aggregations** - "Count orders", "Sum revenue"
- **Filtering and sorting** - "Recent orders", "Active users"
### ⚠️ Partially Supported
- **Complex joins** - May need guidance on relationships
- **Advanced analytics** - Window functions, CTEs may need examples
- **Schema modifications** - Read-only mode prevents changes
### ❌ Not Supported
- **Data modifications** - INSERT, UPDATE, DELETE blocked for security
- **Schema changes** - CREATE, ALTER, DROP not allowed
- **Administrative queries** - System tables and functions restricted
## 🔮 Advanced Usage
### Custom System Prompts
You can modify the system prompt for specialized use cases:
```python
class CustomBridge(OllamaMCPBridge):
    def create_system_prompt(self) -> str:
        base_prompt = super().create_system_prompt()
        return base_prompt + """
Additional Context:
- This is an e-commerce database for "TechStore Inc"
- Focus on business metrics and KPIs
- Always suggest actionable insights
- Format currency in USD
"""
```
### Integration with Other Tools
The bridge can be extended to work with:
- **Jupyter notebooks** - For data science workflows
- **Web applications** - Flask/FastAPI integration
- **Slack bots** - Customer support automation
- **Monitoring systems** - Automated alerting
## 🎯 Best Practices
1. **Start with simple questions** to understand your data structure
2. **Use query validation** before running complex queries
3. **Ask for explanations** when results are unclear
4. **Break complex questions** into smaller parts
5. **Validate assumptions** about data relationships
## 🚀 Next Steps
1. **Customize for your domain** - Add business-specific context to prompts
2. **Build automation** - Schedule regular reports or alerts
3. **Extend functionality** - Add visualization or export capabilities
4. **Production deployment** - Scale with proper error handling and monitoring
This integration opens up powerful possibilities for natural language database interaction! 🚀