MCP Base

by jsmiff
Integrations
  • Environment configuration support, allowing users to set up connection parameters for integrated services and server settings.

  • Provides the runtime environment for the MCP server, allowing it to execute JavaScript/TypeScript code and to handle both HTTP and stdio transports.

  • Provides services for generating embeddings and text with Ollama, allowing AI-powered applications to perform embedding and text generation locally.

MCP Base - A Generic Model Context Protocol Framework

This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.

📋 Features

  • Standardized MCP Server: A base server implementation with support for HTTP and stdio transports
  • Generic MCP Client: A client for connecting to any MCP server
  • Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama
  • Supabase Integration: Built-in support for Supabase vector database
  • Modular Design: Clearly organized structure for resources, tools, and prompts
  • Sample Templates: Example implementations to help you get started quickly

🛠️ Directory Structure

```
_mcp-base/
├── server.ts                      # Main MCP server implementation
├── client.ts                      # Generic MCP client
├── utils/                         # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/                         # Tool implementations
│   └── sample-tool.ts             # Example tool template
├── resources/                     # Resource implementations
│   └── sample-resource.ts         # Example resource template
├── prompts/                       # Prompt implementations
│   └── sample-prompt.ts           # Example prompt template
└── README.md                      # This documentation
```

🚀 Getting Started

Prerequisites

  • Node.js and npm/pnpm
  • Ollama for local embedding and text generation
  • Supabase account for vector storage

Environment Setup

Create a .env file with the following variables:

```
PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http  # 'http' or 'stdio'
```
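A minimal sketch of how the server code might read these variables via Node's `process.env`; the variable names match the sample `.env` above, while the `config` object shape and the fallback defaults are illustrative, not part of the framework:

```typescript
// Read configuration from the environment, falling back to the
// sample defaults above when a variable is unset.
const config = {
  port: Number(process.env.PORT ?? "3000"),
  supabaseUrl: process.env.SUPABASE_URL ?? "https://your-project.supabase.co",
  supabaseServiceKey: process.env.SUPABASE_SERVICE_KEY ?? "",
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434",
  ollamaEmbedModel: process.env.OLLAMA_EMBED_MODEL ?? "nomic-embed-text",
  ollamaLlmModel: process.env.OLLAMA_LLM_MODEL ?? "llama3",
  serverMode: (process.env.SERVER_MODE ?? "http") as "http" | "stdio",
};
```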

Server Initialization

  1. Import the required modules
  2. Register your resources, tools, and prompts
  3. Start the server
```typescript
// Import base server and utilities
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// Initialize database if needed
async function initializeDatabase() {
  // Your database initialization logic
}

// Register your components
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();
```

Client Usage

```typescript
import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();
```

📚 Extending the Framework

Creating a New Tool

  1. Create a new file in the tools/ directory
  2. Define your tool function and schema using Zod
  3. Implement your tool logic
  4. Register the tool in your server
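The steps above can be sketched as a self-contained module. The tool name `echo-tool` and the `ToolServer` interface are illustrative: the real MCP SDK's `server.tool(...)` accepts a Zod schema where this sketch takes a plain parameter description.

```typescript
// Sketch of a tool module (e.g. tools/echo-tool.ts, hypothetical name).
// The server type is reduced to just the registration surface used here.
type ToolResult = { content: { type: "text"; text: string }[] };

interface ToolServer {
  tool(
    name: string,
    params: Record<string, string>,
    handler: (args: Record<string, unknown>) => Promise<ToolResult>
  ): void;
}

export function registerEchoTool(server: ToolServer): void {
  // Register a tool that echoes its "message" argument back as text.
  server.tool("echo-tool", { message: "string" }, async (args) => ({
    content: [{ type: "text", text: `Echo: ${String(args.message)}` }],
  }));
}
```

In the real framework the register function would also receive the text generator, embedding service, or Supabase client, as shown in the server initialization example above.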

Creating a New Resource

  1. Create a new file in the resources/ directory
  2. Define your resource endpoints and schemas
  3. Implement your resource logic
  4. Register the resource in your server
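A sketch of the same pattern for a resource, using the `items://all` URI from the client example above; the `ResourceServer` interface and the returned item list are simplified placeholders, not the SDK's actual types.

```typescript
// Sketch of a resource module (e.g. resources/items-resource.ts,
// hypothetical name). A resource is registered under a URI and served
// by a read callback.
type ResourceContents = { contents: { uri: string; text: string }[] };

interface ResourceServer {
  resource(
    name: string,
    uri: string,
    read: (uri: string) => Promise<ResourceContents>
  ): void;
}

export function registerItemsResource(server: ResourceServer): void {
  // Serve a static item list; a real implementation would query Supabase.
  server.resource("items", "items://all", async (uri) => ({
    contents: [{ uri, text: JSON.stringify(["item-1", "item-2"]) }],
  }));
}
```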

Creating a New Prompt

  1. Create a new file in the prompts/ directory
  2. Define your prompt schema and parameters
  3. Implement your prompt template
  4. Register the prompt in your server
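And a sketch for a prompt, reusing the `simple-prompt` name and `task` argument from the client example; the `PromptServer` interface is simplified, and the real SDK would validate the arguments with a Zod schema.

```typescript
// Sketch of a prompt module (e.g. prompts/simple-prompt.ts). A prompt
// maps named arguments to a list of chat messages.
type PromptMessage = {
  role: "user" | "assistant";
  content: { type: "text"; text: string };
};

interface PromptServer {
  prompt(
    name: string,
    args: Record<string, string>,
    build: (args: Record<string, string>) => { messages: PromptMessage[] }
  ): void;
}

export function registerSimplePrompt(server: PromptServer): void {
  // Build a single user message from the "task" argument.
  server.prompt("simple-prompt", { task: "string" }, (args) => ({
    messages: [
      {
        role: "user",
        content: { type: "text", text: `Please complete this task: ${args.task}` },
      },
    ],
  }));
}
```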

📄 License

MIT
