Database MCP Server

by kunwarmahen

Query and manage databases through the Model Context Protocol.

Overview

The Database MCP Server provides AI-accessible database operations for PostgreSQL and MongoDB. It enables:

  • PostgreSQL queries and data management

  • MongoDB document operations

  • Automatic schema discovery

  • Network-wide database access through MCP Discovery Hub

  • Zero-configuration deployment with automatic broadcasting

Perfect for building AI applications that need to interact with databases safely and efficiently.

Features

PostgreSQL Support

  • Get server version and database info

  • List tables in any schema

  • Query data with configurable limits

  • Insert new records

  • SQL validation and safety checks

MongoDB Support

  • List collections

  • Find documents with filters

  • Insert documents

  • ObjectId handling and JSON serialization

Network Integration

  • Automatic multicast broadcasting for discovery

  • Multi-transport support (HTTP and streamable-http)

  • Compatible with MCP Discovery Hub

  • Zero-configuration networking

Installation

Prerequisites

  • Python 3.10+

  • PostgreSQL server (or MongoDB, or both)

  • uv package manager (or pip)

Setup

# Clone or navigate to the project
cd database-mcp-server

# Install dependencies
uv sync
# Or with pip:
pip install -r requirements.txt

Configuration

Environment Variables

# Transport mode
MCP_TRANSPORT=http                      # http, streamable-http, or stdio (default)

# Server settings
MCP_HOST=0.0.0.0                        # Binding host
MCP_PORT=3002                           # Server port
MCP_SERVER_NAME=Database MCP Server     # Display name

# PostgreSQL
DATABASE_URL=postgresql://user:pass@localhost:5432/dbname
# Or individual settings:
POSTGRES_USER=postgres
POSTGRES_PASSWORD=
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=postgres

# MongoDB
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB=test

# Broadcasting (for MCP Discovery Hub)
MCP_ENABLE_BROADCAST=true               # Enable/disable broadcasting
MCP_BROADCAST_INTERVAL=30               # Seconds between announcements

.env File

Create a .env file in the project root:

# Database connections
DATABASE_URL=postgresql://postgres:password@localhost:5432/mydb
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB=mydb

# MCP server
MCP_TRANSPORT=http
MCP_PORT=3002
MCP_SERVER_NAME=Database MCP Server
MCP_ENABLE_BROADCAST=true
MCP_BROADCAST_INTERVAL=30
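For reference, this is a minimal sketch of how such settings are typically read with python-dotenv. The variable names come from the listing above; the fallback values are illustrative, and the actual loading logic in main.py may differ.

import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the project root

# Variable names from the listing above; defaults are illustrative
DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://postgres:postgres@localhost:5432/postgres")
MONGODB_URL = os.getenv("MONGODB_URL", "mongodb://localhost:27017")
MONGODB_DB = os.getenv("MONGODB_DB", "test")
MCP_TRANSPORT = os.getenv("MCP_TRANSPORT", "stdio")  # stdio is the documented default
MCP_PORT = int(os.getenv("MCP_PORT", "3002"))
ENABLE_BROADCAST = os.getenv("MCP_ENABLE_BROADCAST", "true").lower() == "true"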

Docker Example

# With PostgreSQL in Docker
docker run -d \
  -e POSTGRES_PASSWORD=mypassword \
  -p 5432:5432 \
  postgres:15

# With MongoDB in Docker
docker run -d \
  -p 27017:27017 \
  mongo:latest

# Start the MCP server
MCP_TRANSPORT=http MCP_PORT=3002 uv run main.py

Usage

Start in HTTP Mode (with broadcasting)

# Using environment variables
MCP_TRANSPORT=http MCP_PORT=3002 uv run main.py

# Or with a .env file
uv run main.py

Start in Streamable-HTTP Mode

MCP_TRANSPORT=streamable-http MCP_PORT=3002 uv run main.py

Start in Stdio Mode (for Claude)

# Default mode, works with Claude Desktop
uv run main.py

Available Tools

PostgreSQL Tools

Get DB Version

pg_version()

Retrieve PostgreSQL server version information

List Tables

pg_list_tables(schema: str = "public")

List all tables in a schema

Example:

{ "method": "tools/call", "params": { "name": "pg_list_tables", "arguments": { "schema": "public" } } }

List Rows

pg_list_rows(table: str, limit: int = 100)

Query data from a table with a configurable row limit

Example:

{ "method": "tools/call", "params": { "name": "pg_list_rows", "arguments": { "table": "users", "limit": 50 } } }

Insert Row

pg_insert_row(table: str, data: dict)

Insert a new record and return the inserted ID

Example:

{ "method": "tools/call", "params": { "name": "pg_insert_row", "arguments": { "table": "users", "data": { "name": "John", "email": "john@example.com" } } } }

MongoDB Tools

List Collections

mongo_list_collections()

Get all collection names in the database

Find Documents

mongo_find(collection: str, query: dict = {}, limit: int = 10)

Query documents from a collection

Example:

{ "method": "tools/call", "params": { "name": "mongo_find", "arguments": { "collection": "users", "query": { "status": "active" }, "limit": 20 } } }

Insert Document

mongo_insert(collection: str, doc: dict)

Insert a document into a collection

Example:

{ "method": "tools/call", "params": { "name": "mongo_insert", "arguments": { "collection": "logs", "doc": { "timestamp": "2024-10-17T10:00:00Z", "level": "info", "message": "Server started" } } } }

Integration with MCP Discovery Hub

Automatic Discovery

When broadcasting is enabled, the database server registers itself with the hub automatically (a minimal sketch of the announcement loop follows the steps below):

  1. Server broadcasts: Every 30 seconds on 239.255.255.250:5353

  2. Hub discovers: Discovery hub receives and probes the server

  3. Tools registered: All 7 database tools become available network-wide
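A minimal sketch of what such an announcement loop can look like, using the multicast address and interval listed above. The payload fields and the example URL are placeholders; the real wire format is defined by the server and the MCP Discovery Hub, not by this sketch.

import json
import socket
import time

MULTICAST_GROUP = "239.255.255.250"  # address from step 1 above
MULTICAST_PORT = 5353
INTERVAL = 30                        # seconds, matching MCP_BROADCAST_INTERVAL

# Hypothetical announcement payload with an example address
announcement = json.dumps({
    "name": "Database MCP Server",
    "url": "http://192.168.1.10:3002/mcp",
}).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)

while True:
    # Announce the server so the MCP Discovery Hub can probe it
    sock.sendto(announcement, (MULTICAST_GROUP, MULTICAST_PORT))
    time.sleep(INTERVAL)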

Multi-Server Setup

Deploy multiple database servers for different purposes:

Database Server 1 (PostgreSQL, port 3002)
        ↓
Database Server 2 (MongoDB, port 3003)
        ↓
Database Server 3 (Mixed, port 3004)
        ↓
MCP Discovery Hub (port 8000)
        ↓
AI Tool (Claude, etc.)

All servers are discovered and made available to the AI automatically.

API Endpoints (When in HTTP Mode)

GET /

Server information

curl http://localhost:3002/

POST /mcp

MCP protocol endpoint

All MCP communication (initialize, tools/list, tools/call)
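As a rough illustration, a tools/call request can be posted to the endpoint with nothing more than the Python standard library. The exact JSON-RPC envelope the server expects (id/jsonrpc fields, headers) may differ from this sketch.

import json
import urllib.request

# Hypothetical client-side call against the /mcp endpoint
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "pg_list_tables", "arguments": {"schema": "public"}},
}

req = urllib.request.Request(
    "http://localhost:3002/mcp",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))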

Use Cases

1. Data Analysis

AI-powered analysis of your database:

"User: Summarize user activity from the last month" AI: I'll query the activity logs for you... → calls pg_list_rows(table="activity_logs", limit=1000) → analyzes and summarizes results

2. Automated Reporting

Generate reports from database data:

"User: Create a report of orders by region" AI: Let me fetch the order data... → calls pg_list_rows(table="orders", limit=10000) → groups and aggregates by region → generates report

3. Data Entry and Updates

AI-assisted data entry:

"User: Add a new customer with this information" AI: I'll add them to the database... → calls pg_insert_row(table="customers", data={...})

4. Document Search and Retrieval

MongoDB document management:

"User: Find all documents with status pending" AI: Searching for pending documents... → calls mongo_find(collection="tasks", query={"status": "pending"})

5. System Monitoring

Database health and activity monitoring:

"User: Check if there are any slow queries" AI: Let me check the query logs... → calls pg_list_rows(table="query_logs") → identifies slow queries

Safety Features

Input Validation

  • Table and column names validated against regex

  • SQL injection prevention through parameterized queries (see the sketch after this list)

  • Data type validation for inserts
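A sketch of the pattern these checks describe: identifiers cannot be bound as query parameters, so they are validated against a whitelist regex before being placed in the SQL string, while values are always passed as bound parameters. The regex, helper, and connection string below are illustrative, not the server's actual code.

import re

from sqlalchemy import create_engine, text

engine = create_engine("postgresql://postgres:password@localhost:5432/mydb")

# Illustrative whitelist for table and column names
IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def insert_row(table: str, data: dict) -> None:
    # Identifiers are checked up front, since they cannot be parameterized
    if not IDENTIFIER_RE.match(table) or not all(IDENTIFIER_RE.match(c) for c in data):
        raise ValueError("invalid table or column name")
    columns = ", ".join(data)
    placeholders = ", ".join(f":{c}" for c in data)
    stmt = text(f"INSERT INTO {table} ({columns}) VALUES ({placeholders})")
    # Values go in as bound parameters, never interpolated into the SQL string
    with engine.begin() as conn:
        conn.execute(stmt, data)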

Error Handling

  • Database connection errors caught and reported

  • Timeout protection (30 seconds default)

  • Clear error messages for debugging

Best Practices

  1. Read-only operations first: Start with queries before modifying data

  2. Use limits: Always set reasonable limits on queries

  3. Monitor logs: Check database_mcp.log for issues

  4. Backup data: Ensure backups exist before giving AI access to production data

  5. Audit trail: Log all database modifications from MCP

Performance Considerations

  • Query performance: Depends on query complexity and data size

  • Connection pooling: the PostgreSQL engine uses pool_size=5 for concurrent requests (see the sketch after this list)

  • Broadcasting overhead: Minimal (30-byte UDP packets)

  • Timeout protection: 30-second limit on operations
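For illustration, this is roughly what a pool_size=5 engine configuration looks like in SQLAlchemy; the exact settings and connection string used by the server may differ.

from sqlalchemy import create_engine

# Illustrative engine setup; the server's actual pool configuration lives in main.py
engine = create_engine(
    "postgresql://postgres:password@localhost:5432/mydb",
    pool_size=5,      # five pooled connections, as noted above
    pool_timeout=30,  # wait at most 30 seconds for a free connection
)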

Optimization Tips

  • Use limit parameter to reduce data transfer

  • Filter documents with query parameter in MongoDB

  • Create appropriate database indexes for common queries

  • Use schema parameter to narrow PostgreSQL searches

Logs

Server logs are written to database_mcp.log:

# View logs
tail -f database_mcp.log

# Check for errors
grep ERROR database_mcp.log

# Monitor database operations
grep "Listing tables\|Inserting\|Finding" database_mcp.log

Troubleshooting

PostgreSQL Connection Error

# Check PostgreSQL is running
psql postgresql://user:pass@localhost:5432/db

# Verify credentials in .env
echo $DATABASE_URL

MongoDB Connection Error

# Check MongoDB is running
mongo --eval "db.version()"

# Verify the connection string
echo $MONGODB_URL

Broadcasting Not Working

# Verify multicast is enabled
ip route show | grep 239.255.255.250

# Check firewall settings
sudo firewall-cmd --list-all

Port Already in Use

# Use a different port
MCP_PORT=3003 uv run main.py

Performance Metrics

Typical response times:

  • Simple SELECT: 10-50ms

  • Database info queries: 5-20ms

  • MongoDB find operations: 20-100ms

  • Insert operations: 30-200ms (depending on triggers)

Network overhead (with broadcasting):

  • Broadcasting: 0.01% overhead

  • Discovery: One-time cost per server

Requirements

  • Python 3.10+

  • FastAPI

  • SQLAlchemy

  • PyMongo

  • FastMCP

  • python-dotenv

Contributing

Improvements welcome! Potential enhancements:

  • Additional database support (MySQL, SQLite)

  • Stored procedure execution

  • Transaction support

  • Advanced query builder

  • Connection pooling configuration

  • Database replication support

License

MIT License - See LICENSE file for details

Support

  • Issues: Report on GitHub

  • Documentation: See MCP Discovery Hub wiki

  • Examples: Check examples/ directory

  • Database docs: PostgreSQL and MongoDB official documentation
