# MCP Base - A Generic Model Context Protocol Framework
This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.
## 📋 Features
- Standardized MCP Server: A base server implementation with support for HTTP and stdio transports
- Generic MCP Client: A client for connecting to any MCP server
- Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama
- Supabase Integration: Built-in support for Supabase vector database
- Modular Design: Clearly organized structure for resources, tools, and prompts
- Sample Templates: Example implementations to help you get started quickly
## 🛠️ Directory Structure
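The exact layout is not shown here, so the tree below is only an illustrative sketch based on the components this README describes (all folder names are assumptions):

```
mcp-base/
├── server/      # Base MCP server (HTTP and stdio transports)
├── client/      # Generic MCP client
├── services/    # Ollama embedding/generation and Supabase vector helpers
├── resources/   # Resource definitions
├── tools/       # Tool definitions
├── prompts/     # Prompt templates
└── .env         # Environment configuration
```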
## 🚀 Getting Started

### Prerequisites
- Node.js and npm/pnpm
- Ollama for local embedding and text generation
- Supabase account for vector storage
### Environment Setup

Create a `.env` file with the following variables:
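The exact variable names depend on how this repo loads its configuration, so treat the following as a hedged example of the Ollama, Supabase, and server settings referenced above (all names and values are placeholders):

```
# Server settings (placeholder values)
PORT=3000

# Ollama (local embeddings and text generation)
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
OLLAMA_CHAT_MODEL=llama3

# Supabase vector storage
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
```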
### Server Initialization
- Import the required modules
- Register your resources, tools, and prompts
- Start the server
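The base server in this repo wraps the official MCP TypeScript SDK, so its exact helpers may differ; the sketch below shows the same three steps using the SDK directly (server name, tool, and transport choice are illustrative):

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// 1. Import the modules and create the server
const server = new McpServer({ name: "mcp-base-example", version: "1.0.0" });

// 2. Register resources, tools, and prompts (a single inline tool here)
server.tool("ping", { message: z.string() }, async ({ message }) => ({
  content: [{ type: "text", text: `pong: ${message}` }],
}));

// 3. Start the server over stdio (an HTTP transport works the same way)
await server.connect(new StdioServerTransport());
```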
### Client Usage
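The generic client is likewise a wrapper around the MCP TypeScript SDK; a minimal usage sketch (command, path, tool name, and arguments are illustrative):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and connect over stdio
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/server.js"], // path to your built server (illustrative)
});

const client = new Client({ name: "mcp-base-client", version: "1.0.0" });
await client.connect(transport);

// Discover the server's tools and call one of them
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "ping",
  arguments: { message: "hello" },
});
console.log(result.content);
```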
## 📚 Extending the Framework
### Creating a New Tool
- Create a new file in the `tools/` directory
- Define your tool function and schema using Zod
- Implement your tool logic
- Register the tool in your server (see the sketch below)
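A hedged sketch of what such a tool file might look like, written against the underlying SDK types (the file name, tool name, and registration helper are assumptions, not this repo's exact API):

```typescript
// tools/summarize.ts (hypothetical file)
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Zod schema describing the tool's input
export const summarizeSchema = { text: z.string().describe("Text to summarize") };

// Registration helper called from your server setup (name is an assumption)
export function registerSummarizeTool(server: McpServer) {
  server.tool("summarize", summarizeSchema, async ({ text }) => {
    // Tool logic: this placeholder just truncates long input
    const summary = text.length > 200 ? `${text.slice(0, 200)}…` : text;
    return { content: [{ type: "text", text: summary }] };
  });
}
```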
### Creating a New Resource
- Create a new file in the `resources/` directory
- Define your resource endpoints and schemas
- Implement your resource logic
- Register the resource in your server (see the sketch below)
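Again a sketch based on the SDK's resource API rather than this repo's exact helpers (the file name, resource name, and URI are illustrative):

```typescript
// resources/docs.ts (hypothetical file)
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Registration helper called from your server setup (name is an assumption)
export function registerDocsResource(server: McpServer) {
  server.resource("readme", "docs://readme", async (uri) => ({
    contents: [
      {
        uri: uri.href,
        mimeType: "text/markdown",
        // Resource logic: load real content here instead of a static string
        text: "# Project docs\n\nReplace this with content loaded from disk or a database.",
      },
    ],
  }));
}
```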
### Creating a New Prompt
- Create a new file in the `prompts/` directory
- Define your prompt schema and parameters
- Implement your prompt template
- Register the prompt in your server (see the sketch below)
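A sketch of a prompt file in the same spirit (the file name, prompt name, and parameters are illustrative):

```typescript
// prompts/review.ts (hypothetical file)
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Registration helper called from your server setup (name is an assumption)
export function registerReviewPrompt(server: McpServer) {
  server.prompt("review-code", { code: z.string() }, ({ code }) => ({
    messages: [
      {
        role: "user",
        content: { type: "text", text: `Please review the following code:\n\n${code}` },
      },
    ],
  }));
}
```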
## 📄 License
MIT