MCP Base - A Generic Model Context Protocol Framework
This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.
Features
- Standardized MCP Server: A base server implementation with support for HTTP and stdio transports
- Generic MCP Client: A client for connecting to any MCP server
- Ollama Integration: Ready-to-use services for generating embeddings and text with Ollama
- Supabase Integration: Built-in support for Supabase vector database
- Modular Design: Clearly organized structure for resources, tools, and prompts
- Sample Templates: Example implementations to help you get started quickly
Directory Structure
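The tree listing from the original README is not reproduced in this copy. Based on the directories referenced in the sections below, the layout is roughly as follows; everything other than tools/, resources/, and prompts/ is a guess:

```
mcp-base/
├── tools/        # tool definitions (see "Creating a New Tool")
├── resources/    # resource endpoints (see "Creating a New Resource")
├── prompts/      # prompt templates (see "Creating a New Prompt")
└── ...           # server/client entry points and the Ollama/Supabase services
```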
Getting Started
Prerequisites
- Node.js and npm/pnpm
- Ollama for local embedding and text generation
- Supabase account for vector storage
Environment Setup
Create a `.env` file with the following variables:
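The variable list itself is missing from this copy. A hypothetical example covering the Ollama and Supabase integrations might look like the block below; the key names are illustrative, so check the framework's configuration code for the actual ones:

```
# Illustrative keys only - the framework's actual variable names may differ
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
PORT=3000
```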
Server Initialization
- Import the required modules
- Register your resources, tools, and prompts
- Start the server
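The original snippet is not included in this copy, but the three steps map directly onto the official @modelcontextprotocol/sdk API. A minimal sketch, assuming you build on the SDK's McpServer directly (this framework's base-server helpers may wrap these calls under different names, and the "echo" tool is a placeholder):

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// 1. Import the required modules and create the server
const server = new McpServer({ name: "mcp-base", version: "1.0.0" });

// 2. Register your resources, tools, and prompts
//    ("echo" is a placeholder, not a tool shipped by this framework)
server.tool("echo", { text: z.string() }, async ({ text }) => ({
  content: [{ type: "text", text }],
}));

// 3. Start the server over stdio (an HTTP-based transport is wired up the same way)
await server.connect(new StdioServerTransport());
```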
Client Usage
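No client example survives in this copy either. With the SDK's generic client, connecting to a server over stdio typically looks like the following; the command, script path, and tool name are placeholders:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server as a child process and talk to it over stdio
// ("node dist/server.js" stands in for however you run your server)
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "node", args: ["dist/server.js"] })
);

// Discover the tools the server exposes, then call one
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "echo",
  arguments: { text: "hello" },
});
console.log(result);
```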
Extending the Framework
Creating a New Tool
- Create a new file in the `tools/` directory
- Define your tool function and schema using Zod
- Implement your tool logic
- Register the tool in your server
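As a sketch of those steps, a hypothetical `tools/add.ts` could define a Zod parameter schema and handler, and export a function the server calls to register it. The file name, function names, and registration style are illustrative, not the framework's actual conventions:

```typescript
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Parameter schema defined with Zod
export const addSchema = { a: z.number(), b: z.number() };

// Tool logic plus registration against the server
export function registerAddTool(server: McpServer) {
  server.tool("add", addSchema, async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  }));
}
```

In your server entry point you would then call `registerAddTool(server)` before starting the transport.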
Creating a New Resource
- Create a new file in the `resources/` directory
- Define your resource endpoints and schemas
- Implement your resource logic
- Register the resource in your server
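A hypothetical `resources/config.ts`, again using the underlying SDK's registration call directly (the resource name, URI, and payload are placeholders):

```typescript
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// Expose a static resource at a fixed URI ("config://app" is a placeholder)
export function registerConfigResource(server: McpServer) {
  server.resource("app-config", "config://app", async (uri) => ({
    contents: [
      {
        uri: uri.href,
        mimeType: "application/json",
        text: JSON.stringify({ ok: true }),
      },
    ],
  }));
}
```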
Creating a New Prompt
- Create a new file in the `prompts/` directory
- Define your prompt schema and parameters
- Implement your prompt template
- Register the prompt in your server
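And a hypothetical `prompts/summarize.ts`, defining a string argument with Zod and returning a templated user message (all names are illustrative):

```typescript
import { z } from "zod";
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

// A prompt that asks the model to summarize a piece of text
export function registerSummarizePrompt(server: McpServer) {
  server.prompt("summarize", { text: z.string() }, ({ text }) => ({
    messages: [
      {
        role: "user",
        content: { type: "text", text: `Summarize the following:\n\n${text}` },
      },
    ],
  }));
}
```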
License
MIT