The MCP Vertica server provides a local NLP-to-SQL interface for Vertica databases, running entirely locally with no authentication required. Key capabilities include:
• Execute SQL queries directly against Vertica with streaming results and batching options
• Natural language to SQL conversion using a local LLM (Ollama) for both data queries and DDL operations like table creation
• Data loading via bulk COPY operations into Vertica tables
• Database introspection to retrieve table structures, columns, types, constraints, indexes, and views
• Similar incident analysis to find incidents based on ID or text description
• Multiple access methods: a REST API at http://127.0.0.1:8001/api/, terminal commands via `uvx mcp-vertica nlp`, and an SSE MCP server on port 8000
• Dry-run capabilities to preview generated SQL without execution
• Local deployment using Docker and Ollama for the complete stack
mcp-vertica — Local NLP + REST for Vertica (no auth)
This runs entirely on your laptop: Vertica CE via Docker, a local REST API, and a terminal NLP→SQL command powered by a local LLM (Ollama). No auth; it binds to 0.0.0.0 for convenience.
⚠️ Security is intentionally disabled for local demos. Do not expose to the public internet.
Prerequisites
- Docker Desktop
- Python 3.12+
- uv (recommended) or pip
- Ollama (for local LLM)
  - Mac: `brew install ollama` → `ollama serve &` → `ollama pull llama3.1:8b`
  - Windows: install the Ollama app → run “Ollama” → in PowerShell: `ollama pull llama3.1:8b`
- (Optional) A Vertica instance; we provide Docker.
1) Start Vertica locally
Defaults:
- Host: localhost
- Port: 5433
- Database: VMart
- User: dbadmin
- Password: (empty)
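A minimal way to bring it up, assuming the `vertica/vertica-ce` Docker Hub image and the container name used later in Troubleshooting; adjust the image tag to whatever is current:

```bash
# Start Vertica CE in the background; first boot can take a few minutes.
docker run -d --name vertica-ce -p 5433:5433 vertica/vertica-ce

# Once the container is healthy, verify connectivity with vsql.
docker exec vertica-ce vsql -U dbadmin -d VMart -c "SELECT version();"
```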
2) Install & configure mcp-vertica
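With uv, no separate install step is needed: `uvx` fetches and runs the package on demand. A pip route should also work, assuming the package is published under the same name (`--help` flag assumed):

```bash
# Option A: run on demand with uv (nothing to install)
uvx mcp-vertica --help

# Option B: install into the current environment with pip
pip install mcp-vertica
```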
Set env (Mac/Linux bash or zsh):
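A sketch mirroring the defaults from step 1; the exact variable names are an assumption, so verify them against the package's help output:

```bash
export VERTICA_HOST=localhost
export VERTICA_PORT=5433
export VERTICA_DATABASE=VMart
export VERTICA_USER=dbadmin
export VERTICA_PASSWORD=""   # empty by default
```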
Windows (PowerShell):
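The same assumed variables in PowerShell:

```powershell
$env:VERTICA_HOST = "localhost"
$env:VERTICA_PORT = "5433"
$env:VERTICA_DATABASE = "VMart"
$env:VERTICA_USER = "dbadmin"
$env:VERTICA_PASSWORD = ""
```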
3) Seed ITSM/CMDB sample data
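The original seeding commands aren't reproduced here; as an illustrative stand-in, a table can be created and bulk-loaded through Vertica's COPY via vsql in the container (the `incidents` schema and `incidents.csv` file are hypothetical):

```bash
# Create a hypothetical incidents table for the ITSM sample data.
docker exec vertica-ce vsql -U dbadmin -d VMart -c \
  "CREATE TABLE IF NOT EXISTS incidents (id INT, summary VARCHAR(1024), opened_at TIMESTAMP);"

# Bulk-load a local CSV using COPY ... FROM LOCAL STDIN.
docker exec -i vertica-ce vsql -U dbadmin -d VMart -c \
  "COPY incidents FROM LOCAL STDIN DELIMITER ',' ABORT ON ERROR;" < incidents.csv
```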
4) REST API (no auth)
Test:
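Only the base URL is documented; a quick liveness check against it:

```bash
curl -i http://127.0.0.1:8001/api/
```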
NLP endpoint:
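A sketch only: both the route (`/api/nlp`) and the payload shape here are assumptions, modeled on the question-in, SQL-plus-results-out behavior described above:

```bash
curl -X POST http://127.0.0.1:8001/api/nlp \
  -H "Content-Type: application/json" \
  -d '{"question": "How many open incidents are there?"}'
```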
5) NLP from terminal
Start Ollama in background (if not already):
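From the prerequisites above:

```bash
ollama serve &
ollama list   # confirm llama3.1:8b (or another pulled model) is present
```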
Examples:
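A couple of sketches using the `uvx mcp-vertica nlp` command mentioned earlier; the `--dry-run` flag name is an assumption standing in for the documented dry-run preview:

```bash
# Ask a question; the local LLM translates it to SQL and runs it.
uvx mcp-vertica nlp "Show the 10 most recent incidents"

# Preview the generated SQL without executing it (assumed flag name).
uvx mcp-vertica nlp --dry-run "Create a table for change requests"
```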
6) SSE MCP server (unchanged)
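No invocation is shown in the original; as a sketch, assuming the bare entry point serves SSE on the documented port:

```bash
uvx mcp-vertica   # assumed default: SSE MCP server on port 8000
```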
Troubleshooting
- If the MCP client can’t connect: run `uv cache clean` and retry.
- If Vertica isn’t ready: check `docker logs vertica-ce` and re-run once the container is healthy.
- If Ollama fails: make sure `ollama serve` is running and that you’ve pulled a model.
License
MIT