

by jsmiff
# MCP Base - A Generic Model Context Protocol Framework

This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that integrate LLMs into your applications.

## 📋 Features

- **Standardized MCP Server**: A base server implementation with support for HTTP and stdio transports
- **Generic MCP Client**: A client for connecting to any MCP server
- **Ollama Integration**: Ready-to-use services for generating embeddings and text with Ollama
- **Supabase Integration**: Built-in support for the Supabase vector database
- **Modular Design**: A clearly organized structure for resources, tools, and prompts
- **Sample Templates**: Example implementations to help you get started quickly

## 🛠️ Directory Structure

```
_mcp-base/
├── server.ts                      # Main MCP server implementation
├── client.ts                      # Generic MCP client
├── utils/                         # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/                         # Tool implementations
│   └── sample-tool.ts             # Example tool template
├── resources/                     # Resource implementations
│   └── sample-resource.ts         # Example resource template
├── prompts/                       # Prompt implementations
│   └── sample-prompt.ts           # Example prompt template
└── README.md                      # This documentation
```

## 🚀 Getting Started

### Prerequisites

- Node.js and npm/pnpm
- Ollama for local embedding and text generation
- A Supabase account for vector storage

### Environment Setup

Create a `.env` file with the following variables:

```env
PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
# 'http' or 'stdio'
SERVER_MODE=http
```

### Server Initialization

1. Import the required modules
2. Register your resources, tools, and prompts
3. Start the server

```typescript
// Import the base server and register the sample components
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";
// Note: `supabase`, `textGenerator`, `embeddings`, and `startServer` are
// assumed to be provided by the base modules (e.g. `./server` and `./utils`).

// Initialize the database if needed
async function initializeDatabase() {
  // Your database initialization logic
}
await initializeDatabase();

// Register your components
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();
```

### Client Usage

```typescript
import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();
```

## 📚 Extending the Framework

### Creating a New Tool

1. Create a new file in the `tools/` directory
2. Define your tool function and schema using Zod
3. Implement your tool logic
4. Register the tool in your server

### Creating a New Resource

1. Create a new file in the `resources/` directory
2. Define your resource endpoints and schemas
3. Implement your resource logic
4. Register the resource in your server

### Creating a New Prompt

1. Create a new file in the `prompts/` directory
2. Define your prompt schema and parameters
3. Implement your prompt template
4. Register the prompt in your server

## 📄 License

MIT
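To make the tool-creation steps concrete, here is a minimal sketch of what a file in `tools/` might look like. The `ToolServer` interface, the `registerSearchTool` name, and the plain-object schema are illustrative stand-ins, not the actual `server.ts` API; the real template would pass a Zod shape (e.g. `{ query: z.string() }`) as the schema.

```typescript
// Hypothetical sketch of tools/my-search-tool.ts. `ToolServer` stands in
// for the base MCP server; a plain descriptor object stands in for a Zod schema.
interface ToolServer {
  tool(
    name: string,
    schema: Record<string, string>,
    handler: (args: { query: string; maxResults?: number }) => Promise<string[]>
  ): void;
}

// Registration function, mirroring the registerSampleTool(server, ...) pattern
export function registerSearchTool(server: ToolServer): void {
  server.tool(
    "search-items", // tool name exposed to clients
    { query: "string", maxResults: "number (optional)" }, // schema placeholder
    async ({ query, maxResults = 5 }) => {
      // Tool logic: here, a trivial in-memory substring search
      const corpus = ["alpha", "beta", "gamma"];
      return corpus.filter((item) => item.includes(query)).slice(0, maxResults);
    }
  );
}
```

In `server.ts` you would then call `registerSearchTool(server)` alongside the other registrations.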
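The resource-creation steps can be sketched the same way. The `ResourceServer` interface and `registerItemsResource` name below are illustrative stand-ins; the `items://all` URI matches the one used in the client example above, and in the real template the handler would query Supabase rather than an in-memory array.

```typescript
// Hypothetical sketch of resources/items-resource.ts. `ResourceServer`
// stands in for the base MCP server's resource-registration API.
interface ResourceServer {
  resource(
    name: string,
    uri: string,
    handler: () => Promise<{ uri: string; text: string }>
  ): void;
}

export function registerItemsResource(server: ResourceServer): void {
  server.resource("items", "items://all", async () => {
    // Resource logic: the real template would fetch rows from Supabase here
    const items = [{ id: 1, name: "example item" }];
    return { uri: "items://all", text: JSON.stringify(items) };
  });
}
```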
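Finally, a sketch of the prompt-creation steps. The `PromptServer` interface and `registerSimplePrompt` name are illustrative stand-ins; the `"simple-prompt"` name and `task` parameter match the client example above.

```typescript
// Hypothetical sketch of prompts/simple-prompt.ts. `PromptServer` stands in
// for the base MCP server's prompt-registration API.
interface PromptServer {
  prompt(
    name: string,
    handler: (args: { task: string }) => {
      messages: { role: string; content: string }[];
    }
  ): void;
}

export function registerSimplePrompt(server: PromptServer): void {
  server.prompt("simple-prompt", ({ task }) => ({
    // Prompt template: interpolate the task into a single user message
    messages: [{ role: "user", content: `Please complete this task: ${task}` }],
  }));
}
```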
