
Universal MCP Server

by akshayarav


A model-agnostic Model Context Protocol (MCP) server implementation that works with any compatible AI model or client, not just Claude Desktop.

🎯 Project Goals

  • Universal Compatibility: Works with any model that supports MCP (Claude, local models via Hugging Face, OpenAI, etc.)
  • Simple Architecture: Clean, from-scratch implementation following official MCP specification
  • Extensible Tools: Easy to add new tools and capabilities
  • Learning-Focused: Well-documented code to understand MCP internals

📋 Project Scope

Phase 1: Core MCP Server

  • JSON-RPC 2.0 over stdio communication
  • Basic MCP protocol methods (initialize, tools/list, tools/call)
  • File reading tool for specified directories
  • Error handling and validation
  • Configuration via command line/config file
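The Phase 1 items above can be sketched as a minimal stdio loop: read one JSON-RPC 2.0 message per line, dispatch on `method`, write one response per line. The handler body and tool metadata below are illustrative placeholders, not the project's actual code.

```python
import json
import sys

def handle_tools_list(params):
    """Illustrative handler; a real server would enumerate registered tools."""
    return {"tools": [{"name": "read_file", "description": "Read a file"}]}

HANDLERS = {"tools/list": handle_tools_list}

def dispatch(line):
    """Parse one JSON-RPC 2.0 request line and build the response object."""
    req = json.loads(line)
    handler = HANDLERS.get(req.get("method"))
    if handler is None:
        # -32601 is the JSON-RPC 2.0 "Method not found" error code
        return {"jsonrpc": "2.0", "id": req.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": req.get("id"),
            "result": handler(req.get("params", {}))}

if __name__ == "__main__":
    # stdio transport: one JSON-RPC message per line on stdin/stdout
    for line in sys.stdin:
        sys.stdout.write(json.dumps(dispatch(line)) + "\n")
        sys.stdout.flush()
```

Keeping the transport to newline-delimited JSON is what makes the `echo … | python mcp_server.py` tests later in this README possible.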

Phase 2: Tool Expansion

  • File writing capabilities
  • Directory listing and navigation
  • Text processing tools (search, replace, etc.)
  • System information tools
  • Custom tool plugin system

Phase 3: Multi-Model Client

  • Generic MCP client library
  • Hugging Face model integration
  • OpenAI API integration
  • Local model support (Ollama, etc.)
  • Web interface for testing

🏗️ Architecture

```
┌─────────────────┐    JSON-RPC     ┌─────────────────┐
│    AI Model     │ ◄─────────────► │   MCP Server    │
│ (Any Provider)  │     (stdio)     │    (Python)     │
└─────────────────┘                 └─────────────────┘
                                            │
                                            ▼
                                    ┌─────────────────┐
                                    │      Tools      │
                                    │ • File Reader   │
                                    │ • File Writer   │
                                    │ • Directory Ops │
                                    └─────────────────┘
```

🚀 Quick Start

Running the MCP Server

```bash
# Install dependencies
pip install -r requirements.txt

# Run the server (communicates via stdio)
python mcp_server.py --allowed-paths ./data ./documents

# Test with a simple echo
echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | python mcp_server.py
```

Integrating with Models

Claude Desktop
```json
{
  "mcpServers": {
    "file-tools": {
      "command": "python",
      "args": ["path/to/mcp_server.py", "--allowed-paths", "./data"]
    }
  }
}
```
Hugging Face Models
```python
from mcp_client import MCPClient
from transformers import pipeline

# Initialize your model
model = pipeline("text-generation", model="microsoft/DialoGPT-medium")

# Connect to the MCP server (spawned as a subprocess over stdio)
mcp_client = MCPClient("python mcp_server.py")

# Ask the model, then satisfy its file request through an MCP tool
response = model("Can you read the file data/example.txt?")
tool_result = mcp_client.call_tool("read_file", {"path": "data/example.txt"})
```

🛠️ Available Tools

File Operations

  • read_file: Read contents of a file within allowed paths
  • list_directory: List files and folders in a directory
  • file_info: Get file metadata (size, modified date, etc.)
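Because `read_file` is restricted to the configured allowed paths, the server needs a containment check before touching the filesystem. A sketch of that check (the function name `resolve_within` is hypothetical; the real implementation lives in the server's file tools):

```python
from pathlib import Path

def resolve_within(allowed_paths, requested):
    """Resolve `requested` and confirm it stays inside one of allowed_paths.

    Resolving first defeats `..` traversal; raises PermissionError otherwise,
    which the server can map to a JSON-RPC error response.
    """
    target = Path(requested).resolve()
    for root in allowed_paths:
        root = Path(root).resolve()
        if target == root or root in target.parents:
            return target
    raise PermissionError(f"{requested} is outside the allowed paths")
```

Comparing resolved paths (rather than raw strings) is the important part: a request for `data/../../etc/passwd` resolves to a location outside every allowed root and is rejected.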

Planned Tools

  • write_file: Write content to files
  • search_files: Search for text within files
  • execute_command: Run system commands (with safety restrictions)

📁 Project Structure

```
universal-mcp-server/
├── mcp_server.py          # Main MCP server implementation
├── mcp_client.py          # Generic client for any model
├── tools/
│   ├── __init__.py
│   ├── file_tools.py      # File operation tools
│   └── system_tools.py    # System information tools
├── examples/
│   ├── huggingface_client.py
│   ├── openai_client.py
│   └── test_tools.py
├── config/
│   └── server_config.yaml
├── requirements.txt
└── README.md
```

🔧 Configuration

Server Configuration (config/server_config.yaml)

```yaml
server:
  name: "Universal File Tools"
  version: "1.0.0"

security:
  allowed_paths:
    - "./data"
    - "./documents"
  max_file_size: "10MB"

tools:
  file_reader:
    enabled: true
  file_writer:
    enabled: false  # Disabled by default for security
```
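`max_file_size` is a human-readable string, so the server has to normalize it to bytes before enforcing the limit. One way that might look (units beyond the `MB` shown in this README are an assumption):

```python
import re

# Binary units; which units the real server accepts is an assumption.
_UNITS = {"B": 1, "KB": 1024, "MB": 1024 ** 2, "GB": 1024 ** 3}

def parse_size(text):
    """Convert a size string such as '10MB' into a byte count."""
    m = re.fullmatch(r"(\d+)\s*([KMG]?B)", text.strip().upper())
    if not m:
        raise ValueError(f"unrecognized size: {text!r}")
    return int(m.group(1)) * _UNITS[m.group(2)]
```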

Command Line Options

```bash
python mcp_server.py \
  --config config/server_config.yaml \
  --allowed-paths ./data ./docs \
  --max-file-size 5MB \
  --log-level INFO
```
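These flags map directly onto `argparse`. A sketch assuming exactly the flag names shown above (defaults and choices here are illustrative):

```python
import argparse

def build_parser():
    """CLI mirroring the flags in this README; defaults are illustrative."""
    p = argparse.ArgumentParser(prog="mcp_server.py")
    p.add_argument("--config", help="Path to a YAML config file")
    p.add_argument("--allowed-paths", nargs="+", default=["./data"],
                   help="Directories the file tools may touch")
    p.add_argument("--max-file-size", default="10MB",
                   help="Per-file read limit, e.g. 5MB")
    p.add_argument("--log-level", default="INFO",
                   choices=["DEBUG", "INFO", "WARNING", "ERROR"])
    return p
```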

🧪 Testing

Unit Tests

```bash
python -m pytest tests/
```

Manual Testing

```bash
# Test tool listing
echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | python mcp_server.py

# Test file reading
echo '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"read_file","arguments":{"path":"data/test.txt"}},"id":2}' | python mcp_server.py
```

Integration Tests

```bash
# Test with different models
python examples/test_huggingface.py
python examples/test_openai.py
```
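A scripted test can drive the server the same way the `echo` pipelines above do: spawn it, pipe in one JSON-RPC line, and parse the reply. The helper name `call_server` is hypothetical; it assumes the server answers each request with exactly one JSON line on stdout.

```python
import json
import subprocess

def call_server(request, cmd=("python", "mcp_server.py")):
    """Pipe one JSON-RPC request into a stdio server and parse the reply."""
    proc = subprocess.run(cmd, input=json.dumps(request) + "\n",
                          capture_output=True, text=True, timeout=10)
    # The server emits one JSON response per request line
    return json.loads(proc.stdout.splitlines()[0])
```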
