Custom PostgreSQL MCP Server for RAGmonsters
Overview
This repository demonstrates a more advanced approach to integrating Large Language Models (LLMs) with databases using the Model Context Protocol (MCP). While generic MCP PostgreSQL servers allow LLMs to explore databases through raw SQL queries, this project takes a different approach by creating a custom MCP server that provides a domain-specific API tailored to the application's needs.
This implementation uses FastMCP, a high-performance implementation of the Model Context Protocol, which provides improved efficiency and reliability for tool-based interactions with LLMs.
This project uses the RAGmonsters dataset as its foundation. RAGmonsters is an open-source project that provides a rich, fictional dataset of monsters with various attributes, abilities, and relationships - specifically designed for demonstrating and testing Retrieval-Augmented Generation (RAG) systems.
The Problem with Generic MCP Database Access
Generic MCP PostgreSQL servers provide LLMs with a `query` tool that allows them to:
- Explore database schemas
- Formulate SQL queries based on natural language questions
- Execute those queries against the database
While this approach works, it has several limitations for real-world applications:
- Cognitive Load: The LLM must understand the entire database schema
- Inefficiency: Multiple SQL queries are often needed to answer a single question
- Security Concerns: Raw SQL access requires careful prompt engineering to prevent injection attacks
- Performance: Complex queries may be inefficient if the LLM doesn't understand the database's indexing strategy
- Domain Knowledge Gap: The LLM lacks understanding of business rules and domain-specific constraints
About RAGmonsters Dataset
RAGmonsters is an open dataset specifically designed for testing and demonstrating Retrieval-Augmented Generation (RAG) systems. It contains information about fictional monsters with rich attributes, abilities, and relationships - making it perfect for natural language querying demonstrations.
The PostgreSQL version of RAGmonsters provides a well-structured relational database with multiple tables and relationships, including:
- Monsters with various attributes (attack power, defense, health, etc.)
- Abilities that monsters can possess
- Elements (fire, water, earth, etc.) with complex relationships
- Habitats where monsters can be found
- Evolution chains and relationships between monsters
This rich, interconnected dataset is ideal for demonstrating the power of domain-specific APIs versus generic SQL access.
Our Solution: Domain-Specific MCP API
This project demonstrates how to build a custom MCP server that provides a higher-level, domain-specific API for the RAGmonsters dataset. Instead of exposing raw SQL capabilities, our MCP server offers purpose-built functions that:
- Abstract Database Complexity: Hide the underlying schema and SQL details
- Provide Domain-Specific Operations: Offer functions that align with business concepts
- Optimize for Common Queries: Implement efficient query patterns for frequently asked questions
- Enforce Business Rules: Embed domain-specific logic and constraints
- Improve Security: Limit the attack surface by removing direct SQL access
Web Interface
The project includes two main interfaces for interacting with the RAGmonsters dataset:
Explorer Interface
A data-focused interface for exploring and filtering the RAGmonsters dataset through the MCP API:
- Browse all monsters with filtering by category, habitat, and rarity
- View detailed information about each monster
- Interactive UI built with Bootstrap
Chat Interface
A natural language interface for interacting with the RAGmonsters dataset:
- Ask questions about monsters in natural language
- Get Markdown-formatted responses with rich formatting
- Powered by LangGraph's ReAct agent pattern
- Seamless integration with the MCP tools
Together, these interfaces allow users to:
- Browse all monsters in the dataset
- Filter monsters by habitat, category, and rarity
- View detailed information about each monster, including powers, abilities, strengths, and weaknesses
Example: Domain-Specific API vs. Generic SQL
Generic MCP PostgreSQL Approach:
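The sketch below is a hedged illustration of the generic approach, assuming a server that exposes a raw `query` tool. The table and column names are hypothetical, and `client` stands for an MCP client already connected to the server (as in the testing section further down):

```javascript
// The LLM must first explore the schema, then compose raw SQL itself.
const result = await client.callTool({
  name: "query",
  arguments: {
    sql: `SELECT m.name, h.name AS habitat
            FROM monsters m
            JOIN monster_habitats mh ON mh.monster_id = m.id
            JOIN habitats h ON h.id = mh.habitat_id
           WHERE h.name = 'Volcanic'`,
  },
});
```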
Our Custom MCP Server Approach:
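With the domain-specific server, the same question collapses into a single purpose-built call. The filter value below is illustrative; the parameter names follow the tool descriptions in the Available Tools section:

```javascript
// One purpose-built call: no schema exploration, no hand-written SQL.
const result = await client.callTool({
  name: "getMonsters",
  arguments: { filters: { habitat: "Volcanic" } },
});
```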
Project Structure
Features
- Custom MCP Server with FastMCP: High-performance domain-specific API for RAGmonsters data
- Optimized Queries: Pre-built efficient database operations
- Business Logic Layer: Domain rules and constraints embedded in the API
- Structured Response Format: Consistent JSON responses for LLM consumption
- Comprehensive Logging: Detailed logging for debugging and monitoring
- Test Suite: Scripts to verify server functionality and LLM integration
- LLM Integration:
  - LangChain.js integration with OpenAI and other compatible LLM providers
  - LangGraph ReAct agent pattern for efficient tool use
  - Automatic handling of tool calls and responses
- Web Interfaces:
  - Explorer interface for browsing and filtering monsters
  - Chat interface with Markdown rendering for natural language interaction
- Deployment Ready: Configured for easy deployment on platforms like Clever Cloud
Benefits of This Approach
- Improved Performance: Optimized queries and caching strategies
- Better User Experience: More accurate and faster responses
- Reduced Token Usage: LLM doesn't need to process complex SQL or schema information
- Enhanced Security: No direct SQL access means reduced risk of injection attacks
- Maintainability: Changes to the database schema don't require retraining the LLM
- Scalability: Can handle larger and more complex databases
Getting Started
Installation
- Clone this repository
- Install dependencies: `npm install`
- Copy `.env.example` to `.env` and configure your PostgreSQL connection string and LLM API keys
- Run the MCP server test script: `npm run test`
- Run the LLM integration test script: `npm run test:llm`
- Start the server: `npm start`
Available Tools
The MCP server provides the following tools:
- `getMonsters` - Get a list of monsters with optional filtering, sorting, and pagination
  - Parameters: filters (category, habitat, rarity), sort (field, direction), limit, offset
  - Returns: Array of monster objects with basic information
- `getMonsterById` - Get detailed information about a specific monster by ID
  - Parameters: monsterId
  - Returns: Detailed monster object with all attributes, powers, abilities, strengths, and weaknesses
- `add` - Simple utility to add two numbers (for testing)
  - Parameters: a, b
  - Returns: Sum of the two numbers
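As an illustration of how a tool like `getMonsters` might be registered, here is a minimal FastMCP sketch. The `queryMonsters` data-access helper and the exact parameter shapes are hypothetical; consult the server source for the real implementation:

```javascript
import { FastMCP } from "fastmcp";
import { z } from "zod";

const server = new FastMCP({ name: "ragmonsters-mcp", version: "1.0.0" });

server.addTool({
  name: "getMonsters",
  description: "List monsters with optional filtering, sorting, and pagination",
  parameters: z.object({
    filters: z
      .object({
        category: z.string().optional(),
        habitat: z.string().optional(),
        rarity: z.string().optional(),
      })
      .optional(),
    limit: z.number().int().positive().default(20),
    offset: z.number().int().nonnegative().default(0),
  }),
  execute: async (args) => {
    // Run a pre-built, parameterized query instead of exposing raw SQL.
    const monsters = await queryMonsters(args); // hypothetical data-access helper
    return JSON.stringify(monsters);
  },
});

server.start({ transportType: "stdio" });
```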
LLM Integration Architecture
This project uses a modern approach to LLM integration with domain-specific tools:
LangGraph ReAct Agent Pattern
The application uses LangGraph's ReAct (Reasoning and Acting) agent pattern, which:
- Processes user queries to understand intent
- Determines which tools to use based on the query
- Automatically executes the appropriate tools
- Synthesizes results into a coherent response
- Handles multi-step reasoning when needed
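A condensed sketch of how this pattern can be wired up with LangGraph's prebuilt ReAct agent and the LangChain MCP adapters; the server entry-point path, model fallback, and sample question are illustrative rather than taken from the project source:

```javascript
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";
import { MultiServerMCPClient } from "@langchain/mcp-adapters";

// Load the domain-specific tools exposed by the MCP server over stdio.
const mcpClient = new MultiServerMCPClient({
  mcpServers: {
    ragmonsters: { transport: "stdio", command: "node", args: ["src/mcp-server.js"] },
  },
});
const tools = await mcpClient.getTools();

// The ReAct agent decides which tools to call and synthesizes the final answer.
const llm = new ChatOpenAI({
  model: process.env.LLM_API_MODEL ?? "gpt-4o-mini",
  apiKey: process.env.LLM_API_KEY,
});
const agent = createReactAgent({ llm, tools });

const result = await agent.invoke({
  messages: [new HumanMessage("Which rare monsters live in volcanic habitats?")],
});
console.log(result.messages.at(-1).content); // Markdown-formatted answer

await mcpClient.close();
```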
Testing LLM Integration
The project includes a test script that demonstrates how to use LangChain.js to integrate an LLM with the MCP server.
This script:
- Connects to the MCP server using the StdioClientTransport
- Loads all available MCP tools using LangChain's MCP adapters
- Creates a LangChain agent with the OpenAI API
- Processes a natural language query about monsters
- Shows how the LLM makes tool calls to retrieve information
- Logs detailed information about the interaction
You can modify the test queries in the script to explore different capabilities of the system. The script is located at `scripts/testLlmWithMcpServer.js`.
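The connection and tool-loading portion of such a script might look roughly like the following hedged sketch, assuming the MCP TypeScript SDK client and the `loadMcpTools` helper from `@langchain/mcp-adapters`; the server entry-point path and client name are illustrative:

```javascript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { loadMcpTools } from "@langchain/mcp-adapters";

// Spawn the MCP server as a child process and connect to it over stdio.
const transport = new StdioClientTransport({ command: "node", args: ["src/mcp-server.js"] });
const client = new Client({ name: "ragmonsters-test", version: "1.0.0" });
await client.connect(transport);

// Wrap each MCP tool as a LangChain tool so an agent can call it.
const tools = await loadMcpTools("ragmonsters", client);
console.log(tools.map((t) => t.name)); // e.g. getMonsters, getMonsterById, add

await client.close();
```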
Prerequisites
- Node.js 23 or later
- PostgreSQL database with RAGmonsters data
- Access to an LLM API (e.g., OpenAI)
- FastMCP package (included in dependencies)
Environment Variables
Create a `.env` file with the following variables:
LLM Configuration
- LLM_API_KEY: Your OpenAI API key or compatible provider key
- LLM_API_MODEL: The model to use (default: gpt-4o-mini)
- LLM_API_URL: The API endpoint (default: OpenAI's endpoint)
The application supports any OpenAI-compatible API, including self-hosted models and alternative providers.
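For reference, a minimal `.env` might look like the following. All values are placeholders, and `POSTGRESQL_ADDON_URI` is assumed to be the connection-string variable the application reads (it matches the variable Clever Cloud sets in production):

```
# PostgreSQL connection string (placeholder value)
POSTGRESQL_ADDON_URI=postgresql://user:password@localhost:5432/ragmonsters

# LLM configuration
LLM_API_KEY=sk-your-api-key
LLM_API_MODEL=gpt-4o-mini
LLM_API_URL=https://api.openai.com/v1
```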
Deploying to Clever Cloud
Using the Clever Cloud CLI
- Install the Clever Cloud CLI
- Login to your Clever Cloud account
- Create a new application
- Add your domain (optional but recommended)
- Create a PostgreSQL add-on and link it to your application; this will automatically set the `POSTGRESQL_ADDON_URI` environment variable in your application
- Set the required environment variables
- Deploy your application
- Open your application (an illustrative command sequence for these steps is sketched below)
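A plausible end-to-end sequence with the clever-tools CLI is sketched below; the application, add-on, domain, and key values are placeholders, and the exact commands used by the project may differ:

```bash
npm install -g clever-tools            # install the Clever Cloud CLI
clever login                           # authenticate with your Clever Cloud account
clever create --type node ragmonsters-mcp
clever domain add ragmonsters.example.com
clever addon create postgresql-addon ragmonsters-db   # create the PostgreSQL add-on
clever service link-addon ragmonsters-db              # link it, setting POSTGRESQL_ADDON_URI
clever env set LLM_API_KEY "sk-your-api-key"
clever deploy
clever open
```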
Using the Clever Cloud Console
You can also deploy directly from the Clever Cloud Console:
- Create a new application in the console
- Select Node.js as the runtime
- Create a PostgreSQL add-on and link it to your application
- Set the required environment variables in the console:
  - `LLM_API_KEY`: Your OpenAI API key
  - `LLM_API_MODEL`: (Optional) The model to use, defaults to gpt-4o-mini
- Deploy your application using Git or GitHub integration
Important Notes
- The `POSTGRESQL_ADDON_URI` environment variable is automatically set by Clever Cloud when you link a PostgreSQL add-on to your application
- The application requires Node.js 20 or later, which is available on Clever Cloud
- The application will automatically run on port 8080, which is the default port for Node.js applications on Clever Cloud
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- RAGmonsters for the sample dataset
- Model Context Protocol for the MCP specification
- FastMCP for the high-performance MCP implementation
- Clever Cloud for hosting capabilities