MCP Base - A Generic Model Context Protocol Framework

This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to build MCP servers and clients that integrate LLMs into your applications.

📋 Features

  • Standardized MCP Server: A base server implementation with support for HTTP and stdio transports

  • Generic MCP Client: A client for connecting to any MCP server

  • Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama

  • Supabase Integration: Built-in support for Supabase vector database

  • Modular Design: Clearly organized structure for resources, tools, and prompts

  • Sample Templates: Example implementations to help you get started quickly

🛠️ Directory Structure

_mcp-base/
├── server.ts                      # Main MCP server implementation
├── client.ts                      # Generic MCP client
├── utils/                         # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/                         # Tool implementations
│   └── sample-tool.ts             # Example tool template
├── resources/                     # Resource implementations
│   └── sample-resource.ts         # Example resource template
├── prompts/                       # Prompt implementations
│   └── sample-prompt.ts           # Example prompt template
└── README.md                      # This documentation

🚀 Getting Started

Prerequisites

  • Node.js and npm/pnpm

  • Ollama for local embedding and text generation

  • Supabase account for vector storage

Environment Setup

Create a .env file with the following variables:

PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http # 'http' or 'stdio'
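
The actual loading logic lives in server.ts; the sketch below is just one way these variables might be read into a typed config object, assuming the dotenv package is available (the config.ts module and its field names are illustrative, not part of the framework):

// config.ts — illustrative only; the real loading happens in server.ts
import "dotenv/config"; // loads .env into process.env (assumes dotenv is installed)

export const config = {
  port: Number(process.env.PORT ?? 3000),
  supabaseUrl: process.env.SUPABASE_URL ?? "",
  supabaseServiceKey: process.env.SUPABASE_SERVICE_KEY ?? "",
  ollamaUrl: process.env.OLLAMA_URL ?? "http://localhost:11434",
  ollamaEmbedModel: process.env.OLLAMA_EMBED_MODEL ?? "nomic-embed-text",
  ollamaLlmModel: process.env.OLLAMA_LLM_MODEL ?? "llama3",
  serverMode: (process.env.SERVER_MODE ?? "http") as "http" | "stdio",
};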

Server Initialization

  1. Import the required modules

  2. Register your resources, tools, and prompts

  3. Start the server

// Import base server and utilities
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// Note: `supabase`, `textGenerator`, `embeddings`, and `startServer` are
// assumed to be created/exported by server.ts and the utils — adjust the
// imports above to match your setup.

// Initialize the database if needed
async function initializeDatabase() {
  // Your database initialization logic
}

// Register your components
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();

Client Usage

import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();

📚 Extending the Framework

Creating a New Tool

  1. Create a new file in the tools/ directory

  2. Define your tool function and schema using Zod

  3. Implement your tool logic

  4. Register the tool in your server (see the sketch below)
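
As a rough sketch of what these steps can look like — the file name, tool name, and registration call are illustrative assumptions modeled on sample-tool.ts and the MCP TypeScript SDK, not the framework's exact API:

// tools/echo-tool.ts — hypothetical example; check sample-tool.ts for the real pattern
import { z } from "zod";

// Step 2: define the tool's input schema with Zod
const echoInput = z.object({
  text: z.string().describe("Text to echo back"),
});

// Step 3 + 4: implement the logic and expose a register function.
// `server.tool(...)` mirrors the MCP TypeScript SDK convention.
export function registerEchoTool(server: any) {
  server.tool("echo", echoInput.shape, async (args: z.infer<typeof echoInput>) => {
    return {
      content: [{ type: "text", text: `Echo: ${args.text}` }],
    };
  });
}

The tool would then be registered from the server entry point alongside the samples, e.g. registerEchoTool(server).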

Creating a New Resource

  1. Create a new file in the resources/ directory

  2. Define your resource endpoints and schemas

  3. Implement your resource logic

  4. Register the resource in your server (see the sketch below)
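
A comparable sketch for a resource, again with hypothetical names modeled on sample-resource.ts (the notes table and notes:// URI are assumptions):

// resources/notes-resource.ts — hypothetical example; see sample-resource.ts for the real pattern
export function registerNotesResource(server: any, supabase: any) {
  // `server.resource(...)` follows the MCP TypeScript SDK convention.
  server.resource("notes", "notes://all", async (uri: URL) => {
    const { data, error } = await supabase.from("notes").select("*");
    if (error) throw error;
    return {
      contents: [
        { uri: uri.href, mimeType: "application/json", text: JSON.stringify(data) },
      ],
    };
  });
}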

Creating a New Prompt

  1. Create a new file in the prompts/ directory

  2. Define your prompt schema and parameters

  3. Implement your prompt template

  4. Register the prompt in your server (see the sketch below)
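
And a sketch for a prompt, with a hypothetical name and a registration call modeled on sample-prompt.ts:

// prompts/summarize-prompt.ts — hypothetical example; see sample-prompt.ts for the real pattern
import { z } from "zod";

export function registerSummarizePrompt(server: any) {
  // `server.prompt(...)` mirrors the MCP TypeScript SDK convention.
  server.prompt("summarize", { topic: z.string() }, ({ topic }: { topic: string }) => ({
    messages: [
      {
        role: "user",
        content: { type: "text", text: `Summarize the following topic briefly: ${topic}` },
      },
    ],
  }));
}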

📄 License

MIT
