Self-hosted LLM MCP Server

MCP Server with Self-hosted LLM and Supabase Integration

A comprehensive Model Context Protocol (MCP) server that integrates with self-hosted LLM models via Ollama and uses a Supabase database for data persistence and retrieval.

Features

  • MCP Protocol Support: Full implementation of the Model Context Protocol specification

  • Self-hosted LLM Integration: Support for Ollama-based LLM models (Llama2, CodeLlama, etc.)

  • Supabase Database Integration: Complete CRUD operations with Supabase

  • Docker Support: Containerized deployment with Docker Compose

  • Comprehensive Testing: Unit tests with ≥90% coverage, integration tests, and E2E tests

  • TypeScript: Fully typed implementation for better development experience

  • Logging: Structured logging with configurable levels and formats

Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Client    │    │   MCP Server    │    │   Supabase DB   │
│                 │◄──►│                 │◄──►│                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌─────────────────┐
                       │   Ollama LLM    │
                       │  (Self-hosted)  │
                       └─────────────────┘

Quick Start

Prerequisites

  • Docker and Docker Compose

  • Node.js 18+ (for local development)

  • Supabase account and project

1. Clone and Setup

git clone <repository-url>
cd mcp-server-selfhosted
cp env.example .env

2. Configure Environment

Edit the .env file with your configuration:

# Supabase Configuration
SUPABASE_URL=your_supabase_url_here
SUPABASE_ANON_KEY=your_supabase_anon_key_here
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key_here

# Self-hosted LLM Configuration
LLM_BASE_URL=http://localhost:11434
LLM_MODEL=llama2
LLM_TIMEOUT=30000

# MCP Server Configuration
MCP_SERVER_PORT=3000
MCP_SERVER_HOST=localhost

# Logging
LOG_LEVEL=info
LOG_FORMAT=json
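
The Security Considerations section below notes that inputs are validated with Zod. A minimal sketch of how the variables above might be validated at startup; the names mirror env.example, but the exact schema and defaults are illustrative, not the repository's code:

import { z } from "zod";

// Illustrative config schema; the repository's actual schema may differ.
const ConfigSchema = z.object({
  SUPABASE_URL: z.string().url(),
  SUPABASE_ANON_KEY: z.string().min(1),
  SUPABASE_SERVICE_ROLE_KEY: z.string().min(1),
  LLM_BASE_URL: z.string().url().default("http://localhost:11434"),
  LLM_MODEL: z.string().default("llama2"),
  LLM_TIMEOUT: z.coerce.number().int().positive().default(30000),
  MCP_SERVER_PORT: z.coerce.number().int().default(3000),
  MCP_SERVER_HOST: z.string().default("localhost"),
  LOG_LEVEL: z.enum(["debug", "info", "warn", "error"]).default("info"),
  LOG_FORMAT: z.enum(["text", "json"]).default("json"),
});

// Fails fast with a readable error if a required variable is missing.
export const config = ConfigSchema.parse(process.env);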

3. Start with Docker Compose

docker-compose up -d

This will start:

  • Ollama service (self-hosted LLM)

  • MCP Server

  • Health checks and monitoring

4. Verify Installation

# Check if services are running
docker-compose ps

# Test MCP server health
curl http://localhost:3000/health

# Test Ollama connection
curl http://localhost:11434/api/tags

5. Test Build Locally (Optional)

# Test TypeScript compilation
npm run build

# Test HTTP server
npm run start:http

# Test health endpoint
curl http://localhost:3000/health

Available Tools

The MCP server provides the following tools:

1. query_database

Execute SQL queries on the Supabase database.

Parameters:

  • query (string, required): SQL query to execute

  • table (string, optional): Table name for context

Example:

{ "name": "query_database", "arguments": { "query": "SELECT * FROM users WHERE active = true", "table": "users" } }

2. generate_text

Generate text using the self-hosted LLM.

Parameters:

  • prompt (string, required): Text prompt for the LLM

  • maxTokens (number, optional): Maximum tokens to generate

  • temperature (number, optional): Temperature for generation (0.0-1.0)

Example:

{ "name": "generate_text", "arguments": { "prompt": "Explain quantum computing in simple terms", "maxTokens": 500, "temperature": 0.7 } }

3. store_data

Store data in the Supabase database.

Parameters:

  • table (string, required): Table name to store data

  • data (object, required): Data to store

Example:

{ "name": "store_data", "arguments": { "table": "documents", "data": { "title": "My Document", "content": "Document content here", "author": "John Doe" } } }

4. retrieve_data

Retrieve data from the Supabase database.

Parameters:

  • table (string, required): Table name to retrieve data from

  • filters (object, optional): Filters to apply

  • limit (number, optional): Maximum number of records to retrieve

Example:

{ "name": "retrieve_data", "arguments": { "table": "documents", "filters": { "author": "John Doe" }, "limit": 10 } }

Development

Local Development Setup

  1. Install Dependencies:

npm install

  2. Start Ollama (if not using Docker):

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama2

# Start Ollama
ollama serve

  3. Start Supabase (if using local instance):

# Install Supabase CLI
npm install -g supabase

# Start local Supabase
supabase start

  4. Run Development Server:

npm run dev

Testing

The project includes comprehensive testing:

# Run unit tests
npm test

# Run tests with coverage
npm run test:coverage

# Run E2E tests
npm run test:e2e

# Run all tests
npm run test && npm run test:e2e

Code Quality

# Lint code
npm run lint

# Fix linting issues
npm run lint:fix

Docker Configuration

Dockerfile

The Dockerfile creates an optimized production image:

  • Node.js 18 Alpine base

  • Non-root user for security

  • Health checks

  • Multi-stage build for smaller image size

Docker Compose

The docker-compose.yml orchestrates:

  • Ollama service for LLM

  • MCP Server

  • Health checks and dependencies

  • Volume persistence for Ollama models

Security Considerations

  1. SQL Injection Protection: Basic sanitization of SQL queries (one plausible shape of this check is sketched after this list)

  2. Environment Variables: Sensitive data stored in environment variables

  3. Non-root Container: Docker containers run as non-root user

  4. Input Validation: Zod schemas for input validation

  5. Error Handling: Comprehensive error handling without information leakage
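
The README does not show the sanitization logic from point 1; the sketch below is one plausible shape, not the repository's actual code. Checks like this only reduce obvious abuse and are no substitute for parameterized queries and database-level permissions:

// Illustrative guard; not the repository's actual code.
const FORBIDDEN_KEYWORDS = /\b(drop|truncate|alter|grant|revoke)\b/i;

function assertQueryAllowed(query: string): void {
  if (FORBIDDEN_KEYWORDS.test(query)) {
    throw new Error("Query contains a forbidden keyword");
  }
  // Reject stacked statements such as "SELECT 1; DROP TABLE users".
  if (query.trim().replace(/;\s*$/, "").includes(";")) {
    throw new Error("Multiple statements are not allowed");
  }
}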

Monitoring and Logging

Log Levels

  • DEBUG: Detailed debugging information

  • INFO: General information messages

  • WARN: Warning messages

  • ERROR: Error messages

Log Formats

  • text: Human-readable format

  • json: Structured JSON format for log aggregation

Health Checks

  • HTTP endpoint: GET /health (sketched after this list)

  • Docker health checks

  • Service dependency checks
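
A minimal sketch of what the GET /health handler might look like, using Node's built-in http module; the response shape is an assumption, and the real handler may also probe Ollama and Supabase before reporting healthy:

import { createServer } from "node:http";

// Illustrative /health endpoint.
const server = createServer((req, res) => {
  if (req.method === "GET" && req.url === "/health") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ status: "ok", uptime: process.uptime() }));
    return;
  }
  res.writeHead(404).end();
});

server.listen(Number(process.env.MCP_SERVER_PORT ?? 3000));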

Troubleshooting

Common Issues

  1. Ollama Connection Failed

    # Check if Ollama is running
    curl http://localhost:11434/api/tags

    # Restart Ollama service
    docker-compose restart ollama
  2. Supabase Connection Failed

    # Verify environment variables
    echo $SUPABASE_URL
    echo $SUPABASE_ANON_KEY

    # Test connection
    curl -H "Authorization: Bearer $SUPABASE_ANON_KEY" $SUPABASE_URL/rest/v1/
  3. MCP Server Not Starting

    # Check logs
    docker-compose logs mcp-server

    # Check health
    curl http://localhost:3000/health
  4. Docker Build Fails with "tsc: not found"

    # This is fixed in the current Dockerfile. The issue was that
    # NODE_ENV=production prevented dev dependencies (including tsc)
    # from being installed; the fix is to set NODE_ENV=development
    # during the build stage.

    # If you still encounter issues, rebuild without the cache:
    docker-compose build --no-cache
  5. TypeScript Compilation Errors

    # Test the build locally first
    npm run build

    # Check for missing dependencies
    npm install

    # Clear node_modules and reinstall
    rm -rf node_modules package-lock.json
    npm install

Performance Optimization

  1. LLM Performance

    • Use GPU-enabled Ollama for better performance

    • Adjust model parameters (temperature, max_tokens)

    • Consider model size vs. quality trade-offs

  2. Database Performance

    • Use connection pooling

    • Optimize SQL queries

    • Consider indexing strategies

Contributing

  1. Fork the repository

  2. Create a feature branch

  3. Make your changes

  4. Add tests for new functionality

  5. Ensure all tests pass

  6. Submit a pull request

License

MIT License - see LICENSE file for details.

Support

For issues and questions:

  • Create an issue in the repository

  • Check the troubleshooting section

  • Review the test cases for usage examples
