Provides text generation capabilities using self-hosted LLM models (Llama2, CodeLlama, etc.) through the Ollama service, allowing AI agents to generate text with configurable parameters like temperature and token limits.
Enables complete database operations including SQL queries, data storage, and retrieval through Supabase's database service, providing CRUD functionality for managing structured data.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Self-hosted LLM MCP Server summarize the latest customer feedback from the database"
That's it! The server will respond to your query, and you can continue using it as needed.
MCP Server with Self-hosted LLM and Supabase Integration
A comprehensive Model Context Protocol (MCP) server that integrates with self-hosted LLM models via Ollama and Supabase database for data persistence and retrieval.
Features
MCP Protocol Support: Full implementation of the Model Context Protocol specification
Self-hosted LLM Integration: Support for Ollama-based LLM models (Llama2, CodeLlama, etc.)
Supabase Database Integration: Complete CRUD operations with Supabase
Docker Support: Containerized deployment with Docker Compose
Comprehensive Testing: Unit tests with ≥90% coverage, integration tests, and E2E tests
TypeScript: Fully typed implementation for better development experience
Logging: Structured logging with configurable levels and formats
Architecture
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Client    │◄──►│   MCP Server    │◄──►│  Supabase DB    │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌─────────────────┐
                       │   Ollama LLM    │
                       │  (Self-hosted)  │
                       └─────────────────┘
Quick Start
Prerequisites
Docker and Docker Compose
Node.js 18+ (for local development)
Supabase account and project
1. Clone and Setup
git clone <repository-url>
cd mcp-server-selfhosted
cp env.example .env
2. Configure Environment
Edit the .env file with your configuration:
# Supabase Configuration
SUPABASE_URL=your_supabase_url_here
SUPABASE_ANON_KEY=your_supabase_anon_key_here
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key_here
# Self-hosted LLM Configuration
LLM_BASE_URL=http://localhost:11434
LLM_MODEL=llama2
LLM_TIMEOUT=30000
# MCP Server Configuration
MCP_SERVER_PORT=3000
MCP_SERVER_HOST=localhost
# Logging
LOG_LEVEL=info
LOG_FORMAT=json
3. Start with Docker Compose
docker-compose up -d
This will start:
Ollama service (self-hosted LLM)
MCP Server
Health checks and monitoring
4. Verify Installation
# Check if services are running
docker-compose ps
# Test MCP server health
curl http://localhost:3000/health
# Test Ollama connection
curl http://localhost:11434/api/tags
5. Test Build Locally (Optional)
# Test TypeScript compilation
npm run build
# Test HTTP server
npm run start:http
# Test health endpoint
curl http://localhost:3000/health
Available Tools
The MCP server provides the following tools. A minimal TypeScript sketch of invoking them is shown first, followed by each tool's parameters and a JSON example.
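The sketch assumes the official @modelcontextprotocol/sdk TypeScript client over a stdio transport; the dist/index.js entry point is a guess, and exact method names can vary between SDK versions.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the MCP server as a child process and speak MCP over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed build output path
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Invoke a tool with the same JSON shape as the examples below.
  const result = await client.callTool({
    name: "query_database",
    arguments: { query: "SELECT * FROM users WHERE active = true", table: "users" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);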
1. query_database
Execute SQL queries on the Supabase database.
Parameters:
query (string, required): SQL query to execute
table (string, optional): Table name for context
Example:
{
"name": "query_database",
"arguments": {
"query": "SELECT * FROM users WHERE active = true",
"table": "users"
}
}
2. generate_text
Generate text using the self-hosted LLM.
Parameters:
prompt (string, required): Text prompt for the LLM
maxTokens (number, optional): Maximum tokens to generate
temperature (number, optional): Temperature for generation (0.0-1.0)
Example:
{
"name": "generate_text",
"arguments": {
"prompt": "Explain quantum computing in simple terms",
"maxTokens": 500,
"temperature": 0.7
}
}
3. store_data
Store data in the Supabase database.
Parameters:
table (string, required): Table name to store data in
data (object, required): Data to store
Example:
{
"name": "store_data",
"arguments": {
"table": "documents",
"data": {
"title": "My Document",
"content": "Document content here",
"author": "John Doe"
}
}
}
4. retrieve_data
Retrieve data from the Supabase database.
Parameters:
table (string, required): Table name to retrieve data from
filters (object, optional): Filters to apply
limit (number, optional): Maximum number of records to retrieve
Example:
{
"name": "retrieve_data",
"arguments": {
"table": "documents",
"filters": {
"author": "John Doe"
},
"limit": 10
}
}
Development
Local Development Setup
Install Dependencies:
npm install
Start Ollama (if not using Docker):
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Pull a model
ollama pull llama2
# Start Ollama
ollama serve
Start Supabase (if using local instance):
# Install Supabase CLI
npm install -g supabase
# Start local Supabase
supabase start
Run Development Server:
npm run dev
Testing
The project includes comprehensive testing:
# Run unit tests
npm test
# Run tests with coverage
npm run test:coverage
# Run E2E tests
npm run test:e2e
# Run all tests
npm run test && npm run test:e2e
Code Quality
# Lint code
npm run lint
# Fix linting issues
npm run lint:fix
Docker Configuration
Dockerfile
The Dockerfile creates an optimized production image:
Node.js 18 Alpine base
Non-root user for security
Health checks
Multi-stage build for smaller image size
Docker Compose
The docker-compose.yml orchestrates:
Ollama service for LLM
MCP Server
Health checks and dependencies
Volume persistence for Ollama models
Security Considerations
SQL Injection Protection: Basic sanitization of SQL queries
Environment Variables: Sensitive data stored in environment variables
Non-root Container: Docker containers run as non-root user
Input Validation: Zod schemas for input validation (see the sketch after this list)
Error Handling: Comprehensive error handling without information leakage
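To illustrate the Zod-based input validation listed above, a schema for the generate_text arguments could look like the following; it is a sketch mirroring the documented parameters, not the project's actual schema.

import { z } from "zod";

// Hypothetical schema mirroring the documented generate_text parameters.
const GenerateTextArgs = z.object({
  prompt: z.string().min(1),
  maxTokens: z.number().int().positive().optional(),
  temperature: z.number().min(0).max(1).optional(),
});

// safeParse rejects malformed input without throwing, so the server can
// return a clean validation error instead of leaking internal details.
const parsed = GenerateTextArgs.safeParse({ prompt: "Hello", temperature: 0.7 });
if (!parsed.success) {
  console.error(parsed.error.flatten());
}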
Monitoring and Logging
Log Levels
DEBUG: Detailed debugging information
INFO: General information messages
WARN: Warning messages
ERROR: Error messages
Log Formats
text: Human-readable format
json: Structured JSON format for log aggregation
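For illustration, a minimal logger honoring the LOG_LEVEL and LOG_FORMAT settings above might look like the following; this is a hypothetical sketch, not the project's actual logger.

// Minimal structured logger driven by LOG_LEVEL and LOG_FORMAT.
const LEVELS = ["debug", "info", "warn", "error"] as const;
type Level = (typeof LEVELS)[number];

const minLevel = (process.env.LOG_LEVEL ?? "info") as Level;
const asJson = (process.env.LOG_FORMAT ?? "json") === "json";

function log(level: Level, message: string, meta: Record<string, unknown> = {}) {
  if (LEVELS.indexOf(level) < LEVELS.indexOf(minLevel)) return; // below threshold
  const entry = { level, message, timestamp: new Date().toISOString(), ...meta };
  console.log(asJson ? JSON.stringify(entry) : `[${entry.timestamp}] ${level.toUpperCase()} ${message}`);
}

log("info", "MCP server started", { port: 3000 });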
Health Checks
HTTP endpoint:
GET /health
Docker health checks
Service dependency checks
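A plausible shape for the /health handler, assuming an Express-style HTTP layer (an assumption; the actual server code may be organized differently):

import express from "express";

const app = express();

// Report liveness plus whether the Ollama backend is reachable.
app.get("/health", async (_req, res) => {
  let llmReachable = false;
  try {
    const r = await fetch(`${process.env.LLM_BASE_URL ?? "http://localhost:11434"}/api/tags`);
    llmReachable = r.ok;
  } catch {
    // Ollama is down or unreachable.
  }
  res.status(llmReachable ? 200 : 503).json({
    status: llmReachable ? "ok" : "degraded",
    llm: llmReachable,
    timestamp: new Date().toISOString(),
  });
});

app.listen(Number(process.env.MCP_SERVER_PORT ?? 3000));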
Troubleshooting
Common Issues
Ollama Connection Failed
# Check if Ollama is running
curl http://localhost:11434/api/tags
# Restart Ollama service
docker-compose restart ollama
Supabase Connection Failed
# Verify environment variables
echo $SUPABASE_URL
echo $SUPABASE_ANON_KEY
# Test connection
curl -H "Authorization: Bearer $SUPABASE_ANON_KEY" $SUPABASE_URL/rest/v1/
MCP Server Not Starting
# Check logs
docker-compose logs mcp-server
# Check health
curl http://localhost:3000/health
Docker Build Fails with "tsc: not found"
# This is fixed in the current Dockerfile
# The issue was NODE_ENV=production preventing dev dependencies installation
# Solution: Set NODE_ENV=development during build phase
# If you still encounter issues, try:
docker-compose build --no-cache
TypeScript Compilation Errors
# Test build locally first
npm run build
# Check for missing dependencies
npm install
# Clear node_modules and reinstall
rm -rf node_modules package-lock.json
npm install
Performance Optimization
LLM Performance
Use GPU-enabled Ollama for better performance
Adjust model parameters (temperature, max_tokens); see the sketch after this list
Consider model size vs. quality trade-offs
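For reference, these knobs map onto Ollama's documented /api/generate REST API roughly as follows; note that num_predict is Ollama's name for the max-token limit, and this sketch calls Ollama directly rather than going through the MCP server.

// Call Ollama directly with tuned generation options (Node 18+ global fetch).
async function generate() {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama2",
      prompt: "Explain quantum computing in simple terms",
      stream: false,            // return one JSON object instead of a stream
      options: {
        temperature: 0.7,       // higher = more varied, lower = more deterministic
        num_predict: 500,       // cap on generated tokens
      },
    }),
  });
  const { response } = await res.json();
  console.log(response);
}

generate().catch(console.error);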
Database Performance
Use connection pooling
Optimize SQL queries (see the example after this list)
Consider indexing strategies
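As a concrete example of keeping queries cheap, here is a filtered, limited read that selects only the needed columns via the standard @supabase/supabase-js client (illustrative; not necessarily how this server implements retrieve_data internally):

import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);

// Select only the columns you need, filter server-side, and cap the row
// count so the database does the work instead of the application.
const { data, error } = await supabase
  .from("documents")
  .select("id, title, author")
  .match({ author: "John Doe" })
  .limit(10);

if (error) throw error;
console.log(data);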
Contributing
Fork the repository
Create a feature branch
Make your changes
Add tests for new functionality
Ensure all tests pass
Submit a pull request
License
MIT License - see LICENSE file for details.
Support
For issues and questions:
Create an issue in the repository
Check the troubleshooting section
Review the test cases for usage examples