# Ollama MCP Database Assistant
An interactive chat interface that combines Ollama's LLM capabilities with PostgreSQL database access through the Model Context Protocol (MCP). Ask questions about your data in natural language and get AI-powered responses backed by real SQL queries.
## Features

- Natural language interface to your PostgreSQL database
- Automatic SQL query generation
- Schema-aware responses
- Interactive chat interface
- Secure, read-only database access
## Prerequisites

- Node.js 16 or higher
- A running PostgreSQL database
- Ollama installed and running locally
- The `qwen2.5-coder:7b-instruct` model pulled in Ollama
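A quick way to sanity-check the prerequisites (assuming the standard CLIs are on your `PATH`):

```bash
node --version    # should report v16 or newer
ollama list       # should include qwen2.5-coder:7b-instruct once pulled
```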
## Setup
1. Clone the repository:
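   ```bash
   # Replace the placeholders with this project's actual repository URL and directory
   git clone <repository-url>
   cd <repository-directory>
   ```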
2. Install dependencies:
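   ```bash
   npm install
   ```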
3. Pull the required Ollama model:
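   ```bash
   ollama pull qwen2.5-coder:7b-instruct
   ```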
4. Create a `.env` file in the project root:
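   With placeholder values (adjust the connection string to your own database):

   ```env
   DATABASE_URL=postgresql://user:password@localhost:5432/your_database
   # Optional; defaults to qwen2.5-coder:7b-instruct
   OLLAMA_MODEL=qwen2.5-coder:7b-instruct
   ```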
## Usage
Start the chat interface:
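The exact start command depends on the scripts defined in this project's `package.json`; a typical invocation would be:

```bash
npm start
```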
Ask questions about your data in natural language:
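For example (any question about your own tables works):

```
Which products had the most orders last month?
What is the average order value per customer?
```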
Type 'exit' to quit the application.
## How It Works

1. The application connects to your PostgreSQL database through the PostgreSQL MCP server.
2. It loads and caches your database schema.
3. When you ask a question:
   - The schema and question are sent to Ollama
   - Ollama generates an appropriate SQL query
   - The query is executed through MCP
   - Results are sent back to Ollama for interpretation
   - You receive a natural language response (see the sketch below)
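The TypeScript sketch below illustrates that loop. It is not the project's actual code: the prompts, the `answerQuestion` helper, and the `runQuery` callback (standing in for the MCP server's query tool) are illustrative assumptions; only the Ollama endpoint and model name come from this README.

```typescript
// A minimal sketch of the question → SQL → answer loop described above.
// Assumptions: Node 18+ (global fetch), Ollama on its default port, and a
// `runQuery` callback standing in for the MCP server's read-only query tool.

const OLLAMA_URL = "http://localhost:11434/api/generate";
const MODEL = process.env.OLLAMA_MODEL ?? "qwen2.5-coder:7b-instruct";

async function askOllama(prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, prompt, stream: false }),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}

export async function answerQuestion(
  question: string,
  schema: string,
  runQuery: (sql: string) => Promise<unknown>,
): Promise<string> {
  // 1. Ask the model to write a read-only SQL query for the cached schema.
  const sql = await askOllama(
    `Schema:\n${schema}\n\nWrite one SELECT statement that answers: ${question}\nReturn only the SQL.`,
  );

  // 2. Execute the query through the MCP server (read-only).
  const rows = await runQuery(sql);

  // 3. Ask the model to interpret the results for the user.
  return askOllama(
    `Question: ${question}\nSQL used: ${sql}\nResults: ${JSON.stringify(rows)}\n` +
      `Answer the question in plain language using only these results.`,
  );
}
```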
## Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `DATABASE_URL` | PostgreSQL connection string | Required |
| `OLLAMA_MODEL` | Ollama model to use | `qwen2.5-coder:7b-instruct` |
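A minimal sketch of how these variables might be loaded at startup (assuming the `dotenv` package; the project's actual configuration code may differ):

```typescript
// Load .env into process.env before anything else runs.
import "dotenv/config";

const databaseUrl = process.env.DATABASE_URL;
if (!databaseUrl) {
  throw new Error("DATABASE_URL is required (see .env)");
}

// Fall back to the documented default model when OLLAMA_MODEL is unset.
const model = process.env.OLLAMA_MODEL ?? "qwen2.5-coder:7b-instruct";

export const config = { databaseUrl, model };
```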
## Security

- All database access is read-only
- SQL queries are restricted to SELECT statements
- Database credentials are kept secure in your `.env` file
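One common way to enforce a SELECT-only rule is a guard like the sketch below; this is an assumed illustration, not necessarily how this project implements it:

```typescript
// Illustrative SELECT-only guard (assumed approach, not the project's exact code).
export function assertReadOnly(sql: string): void {
  const normalized = sql.trim().toLowerCase();
  const writeKeywords = /\b(insert|update|delete|drop|alter|truncate|create|grant|revoke)\b/;

  // Accept plain SELECTs and WITH ... SELECT queries only.
  if (!normalized.startsWith("select") && !normalized.startsWith("with")) {
    throw new Error("Only SELECT queries are allowed");
  }
  // Reject anything that smuggles in a write or DDL statement.
  if (writeKeywords.test(normalized)) {
    throw new Error("Query contains a write or DDL keyword");
  }
}
```

Defence in depth is also possible at the database level, for example by connecting with a PostgreSQL role that only has SELECT privileges.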
## Development

Built with:

- TypeScript
- Model Context Protocol (MCP)
- Ollama
- PostgreSQL
## Troubleshooting

### Common Issues
"Failed to connect to database"
Check your DATABASE_URL in .env
Verify PostgreSQL is running
Check network connectivity
"Failed to connect to Ollama"
Ensure Ollama is running (
ollama serve
)Verify the model is installed (
ollama list
)
"Error executing query"
Check database permissions
Verify table/column names in the schema
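Two quick connectivity checks that can help narrow these down (assuming `psql` and `curl` are installed):

```bash
# Database: should print a single row if DATABASE_URL is valid and reachable
psql "$DATABASE_URL" -c "SELECT 1"

# Ollama: should return a JSON list of installed models
curl http://localhost:11434/api/tags
```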
## License
MIT
## Contributing

1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Open a Pull Request