MCP Base
by jsmiff
# MCP Base - A Generic Model Context Protocol Framework
This folder contains a general-purpose base implementation of the Model Context Protocol (MCP) for building AI-powered applications. It provides a standardized way to create MCP servers and clients that can be used to integrate LLMs into your applications.
## Features
- **Standardized MCP Server**: A base server implementation with support for HTTP and stdio transports
- **Generic MCP Client**: A client for connecting to any MCP server
- **Ollama Integration**: Ready-to-use services for generating embeddings and text with Ollama
- **Supabase Integration**: Built-in support for Supabase vector database
- **Modular Design**: Clearly organized structure for resources, tools, and prompts
- **Sample Templates**: Example implementations to help you get started quickly
## Directory Structure
```
_mcp-base/
├── server.ts                      # Main MCP server implementation
├── client.ts                      # Generic MCP client
├── utils/                         # Utility services
│   ├── ollama_embedding.ts        # Embedding generation with Ollama
│   └── ollama_text_generation.ts  # Text generation with Ollama
├── tools/                         # Tool implementations
│   └── sample-tool.ts             # Example tool template
├── resources/                     # Resource implementations
│   └── sample-resource.ts         # Example resource template
├── prompts/                       # Prompt implementations
│   └── sample-prompt.ts           # Example prompt template
└── README.md                      # This documentation
```
## Getting Started
### Prerequisites
- Node.js and npm/pnpm
- Ollama for local embedding and text generation
- Supabase account for vector storage
### Environment Setup
Create a `.env` file with the following variables:
```env
PORT=3000
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your-service-key
OLLAMA_URL=http://localhost:11434
OLLAMA_EMBED_MODEL=nomic-embed-text
OLLAMA_LLM_MODEL=llama3
SERVER_MODE=http # 'http' or 'stdio'
```
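A small startup check catches missing configuration before the server boots. The sketch below is illustrative (the `checkEnv` helper is not part of the framework); the variable names come from the `.env` example above:

```typescript
// Variables the server cannot run without (names from the .env example above).
const REQUIRED = ["SUPABASE_URL", "SUPABASE_SERVICE_KEY", "OLLAMA_URL"] as const;

// Returns the names of missing variables; an empty array means the config is complete.
function checkEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}

// At startup:
// const missing = checkEnv(process.env);
// if (missing.length > 0) {
//   throw new Error(`Missing required env vars: ${missing.join(", ")}`);
// }
```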
### Server Initialization
1. Import the required modules
2. Register your resources, tools, and prompts
3. Start the server
```typescript
// Import the base server and the sample component registrars
import server from "./server";
import { registerSampleResources } from "./resources/sample-resource";
import { registerSampleTool } from "./tools/sample-tool";
import { registerSamplePrompts } from "./prompts/sample-prompt";

// `supabase`, `textGenerator`, `embeddings`, and `startServer` must also be in
// scope; import them from wherever your project defines them.

// Initialize the database if needed
async function initializeDatabase() {
  // Your database initialization logic
}
await initializeDatabase();

// Register your components
registerSampleResources(server, supabase);
registerSampleTool(server, textGenerator, embeddings, supabase);
registerSamplePrompts(server, supabase);

// Start the server
startServer();
```
### Client Usage
```typescript
import MCPClient from "./client";

// Create a client instance
const client = new MCPClient({
  serverUrl: "http://localhost:3000",
});

// Example: Call a tool
async function callSampleTool() {
  const result = await client.callTool("sample-tool", {
    query: "example query",
    maxResults: 5,
  });
  console.log(result);
}

// Example: Read a resource
async function readResource() {
  const items = await client.readResource("items://all");
  console.log(items);
}

// Example: Get a prompt
async function getPrompt() {
  const prompt = await client.getPrompt("simple-prompt", {
    task: "Explain quantum computing",
  });
  console.log(prompt);
}

// Don't forget to disconnect when done
await client.disconnect();
```
## Extending the Framework
### Creating a New Tool
1. Create a new file in the `tools/` directory
2. Define your tool function and schema using Zod
3. Implement your tool logic
4. Register the tool in your server
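The steps above can be sketched as follows. The handler is plain TypeScript so it can be tested in isolation; the commented registration line assumes the MCP SDK's `server.tool(name, schema, handler)` shape, and the names (`sample-tool`, `sampleToolHandler`) are illustrative:

```typescript
// Shape of an MCP tool result: a list of content parts.
type ToolResult = { content: { type: "text"; text: string }[] };

// Hypothetical handler for a search-style tool.
async function sampleToolHandler(args: {
  query: string;
  maxResults?: number;
}): Promise<ToolResult> {
  const max = args.maxResults ?? 10;
  // Real logic would embed the query and search the vector store here.
  return {
    content: [{ type: "text", text: `Results for "${args.query}" (max ${max})` }],
  };
}

// In server.ts, registration might look like:
// server.tool(
//   "sample-tool",
//   { query: z.string(), maxResults: z.number().optional() },
//   async (args) => sampleToolHandler(args)
// );
```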
### Creating a New Resource
1. Create a new file in the `resources/` directory
2. Define your resource endpoints and schemas
3. Implement your resource logic
4. Register the resource in your server
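A resource boils down to a handler that maps a URI to contents. This sketch mirrors the `items://all` URI used in the client example; the handler name and the commented registration line (assuming the MCP SDK's resource API) are illustrative:

```typescript
// Shape of an MCP resource read result.
type ResourceContents = { contents: { uri: string; text: string }[] };

// Hypothetical handler for an "items://{id}" URI scheme.
async function readItemResource(uri: URL): Promise<ResourceContents> {
  // For "items://all", the id lands in the hostname of the parsed URL.
  const id = uri.pathname.replace(/^\/+/, "") || uri.hostname;
  // Fetch from your store here; a static response keeps the sketch self-contained.
  const body =
    id === "all" ? JSON.stringify([{ id: 1 }, { id: 2 }]) : JSON.stringify({ id });
  return { contents: [{ uri: uri.href, text: body }] };
}

// In server.ts, registration might look like:
// server.resource("items", new ResourceTemplate("items://{id}", { list: undefined }), readItemResource);
```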
### Creating a New Prompt
1. Create a new file in the `prompts/` directory
2. Define your prompt schema and parameters
3. Implement your prompt template
4. Register the prompt in your server
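As with tools, a prompt reduces to a handler that returns MCP-style messages. The sketch below matches the `simple-prompt` name used in the client example; the template text and the commented registration line (assuming the MCP SDK's prompt API) are illustrative:

```typescript
// Shape of an MCP prompt result: a list of chat messages.
type PromptResult = {
  messages: { role: "user"; content: { type: "text"; text: string } }[];
};

// Hypothetical handler for the "simple-prompt" used in the client example.
function simplePrompt({ task }: { task: string }): PromptResult {
  return {
    messages: [
      {
        role: "user",
        content: { type: "text", text: `Complete this task step by step: ${task}` },
      },
    ],
  };
}

// In server.ts, registration might look like:
// server.prompt("simple-prompt", { task: z.string() }, simplePrompt);
```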
## License
MIT