
MCP Base - A Generic Model Context Protocol Framework

This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.

📋 Features

  • Standardized MCP Server: A base server implementation with support for HTTP and stdio transports
  • Generic MCP Client: A client for connecting to any MCP server
  • Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama
  • Supabase Integration: Built-in support for Supabase vector database
  • Modular Design: Clearly organized structure for resources, tools, and prompts
  • Sample Templates: Example implementations to help you get started quickly

🛠️ Directory Structure

```
_mcp-base/
├── server.ts                      # Main MCP server implementation
├── client.ts                      # Generic MCP client
├── utils/                         # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/                         # Tool implementations
│   └── sample-tool.ts             # Example tool template
├── resources/                     # Resource implementations
│   └── sample-resource.ts         # Example resource template
├── prompts/                       # Prompt implementations
│   └── sample-prompt.ts           # Example prompt template
└── README.md                      # This documentation
```

🚀 Getting Started

Prerequisites

  • Node.js and npm/pnpm
  • Ollama for local embedding and text generation
  • Supabase account for vector storage

Environment Setup

Create a .env file with the following variables:

```
PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http # 'http' or 'stdio'
```
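
At startup the server reads these values from `process.env`. A minimal loader sketch — the variable names and defaults come from the example above, but the actual loading code in server.ts may differ:

```typescript
// Illustrative config loader for the variables above. Assumes the .env
// file has already been loaded into process.env (e.g. by a dotenv-style
// loader run before this module); defaults mirror the example values.
const config = {
  port: Number(process.env.PORT ?? "3000"),
  supabaseUrl: process.env.SUPABASE_URL ?? "",
  supabaseServiceKey: process.env.SUPABASE_SERVICE_KEY ?? "",
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434",
  ollamaEmbedModel: process.env.OLLAMA_EMBED_MODEL ?? "nomic-embed-text",
  ollamaLlmModel: process.env.OLLAMA_LLM_MODEL ?? "llama3",
  serverMode: (process.env.SERVER_MODE ?? "http") as "http" | "stdio",
};
```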

Server Initialization

  1. Import the required modules
  2. Register your resources, tools, and prompts
  3. Start the server

```typescript
// Import base server and utilities
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// Initialize database if needed
async function initializeDatabase() {
  // Your database initialization logic
}

// Register your components (supabase, textGenerator, and embeddings are
// assumed to be created in server.ts / utils and imported alongside server)
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();
```

Client Usage

```typescript
import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();
```

📚 Extending the Framework

Creating a New Tool

  1. Create a new file in the tools/ directory
  2. Define your tool function and schema using Zod
  3. Implement your tool logic
  4. Register the tool in your server
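
The four steps above can be sketched as follows. This is a hypothetical, dependency-free illustration of the pattern only: the real framework defines the schema with Zod and registers the tool through server.ts, so `ToolDefinition`, `echoTool`, and `runTool` are placeholder names, not the framework's API.

```typescript
// Hypothetical sketch of the tool pattern from the steps above.
// The real framework uses Zod for the schema; a plain validator is
// hand-rolled here so the sketch stays dependency-free.

interface ToolDefinition<I, O> {
  name: string;
  validate: (input: unknown) => I;    // schema check (Zod in the real code)
  handler: (input: I) => Promise<O>;  // the tool's logic
}

// Example tool: echoes a query back, capped at maxResults characters
const echoTool: ToolDefinition<{ query: string; maxResults: number }, string> = {
  name: "echo-tool",
  validate: (input) => {
    const i = input as { query?: unknown; maxResults?: unknown };
    if (typeof i.query !== "string" || typeof i.maxResults !== "number") {
      throw new Error("invalid input for echo-tool");
    }
    return { query: i.query, maxResults: i.maxResults };
  },
  handler: async ({ query, maxResults }) => query.slice(0, maxResults),
};

// Step 4 would hand the definition to the server; here we just invoke it,
// validating the raw input first, as the server would on each call
async function runTool<I, O>(tool: ToolDefinition<I, O>, rawInput: unknown) {
  return tool.handler(tool.validate(rawInput));
}
```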

Creating a New Resource

  1. Create a new file in the resources/ directory
  2. Define your resource endpoints and schemas
  3. Implement your resource logic
  4. Register the resource in your server
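
A hypothetical sketch of the resource pattern, again independent of the framework's actual registration API: the `items://all` URI mirrors the client example earlier in this README, while the `Map`-based dispatch stands in for whatever server.ts does internally.

```typescript
// Hypothetical sketch of the resource pattern from the steps above.
// The items:// URI scheme follows the client example in this README;
// the registration API in server.ts may differ.

type ResourceReader = (uri: string) => Promise<unknown>;

const resources = new Map<string, ResourceReader>();

// Steps 1-3: define the endpoint and implement its logic
function registerItemsResource() {
  resources.set("items://all", async () => [{ id: 1, name: "sample item" }]);
}

// Reading dispatches on the URI, as client.readResource() would
async function readResource(uri: string): Promise<unknown> {
  const reader = resources.get(uri);
  if (!reader) throw new Error(`unknown resource: ${uri}`);
  return reader(uri);
}

// Step 4: registration happens once at server startup
registerItemsResource();
```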

Creating a New Prompt

  1. Create a new file in the prompts/ directory
  2. Define your prompt schema and parameters
  3. Implement your prompt template
  4. Register the prompt in your server
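
The prompt pattern can be sketched the same way. The `simple-prompt` name and `task` parameter mirror the client example above; the template text and the `Map`-based lookup are illustrative placeholders for the real implementation in server.ts.

```typescript
// Hypothetical sketch of the prompt pattern from the steps above.
// "simple-prompt" and the `task` parameter follow the client example
// in this README; the registration API in server.ts may differ.

type PromptTemplate = (params: Record<string, string>) => string;

const prompts = new Map<string, PromptTemplate>();

// Steps 2-3: the schema here is just "a required task string";
// the template interpolates it into the final prompt text
prompts.set("simple-prompt", ({ task }) => {
  if (!task) throw new Error("simple-prompt requires a `task` parameter");
  return `You are a helpful assistant. Complete the following task:\n${task}`;
});

// Step 4: the server would expose this via client.getPrompt()
function getPrompt(name: string, params: Record<string, string>): string {
  const template = prompts.get(name);
  if (!template) throw new Error(`unknown prompt: ${name}`);
  return template(params);
}
```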

📄 License

MIT

