MCP Server - From Scratch
A Model Context Protocol (MCP) server implementation developed from scratch. By "from scratch" I literally mean that the JSON-RPC handling, the STDIO transport, and the server-to-client connection are all written by hand! This is not a simple implementation that relies on the @mcp.tool functionality you see online.
This project was created to help me understand the MCP protocol and was carefully modeled after the official MCP server specification.
Example clients are included for connecting different models to the server:
Hugging Face: allows Hugging Face models to access file operations and other tools through the MCP client library
Ollama: supports local model integration, providing access to file operations and other tools
OpenAI: enables OpenAI models to use the MCP server's tools, including file operations, via the MCP client
Project Goals
Simple Architecture: Clean, from-scratch implementation following official MCP specification
Extensible Tools: Easy to add new tools and capabilities
Learning-Focused: Well-documented code to understand MCP internals
Related MCP server: MCP Local File Reader
Architecture
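At its core, the server is a loop that reads newline-delimited JSON-RPC messages from stdin, dispatches on the method, and writes responses to stdout. The sketch below is illustrative only: the method names follow the MCP stdio transport, but the handler names and exact structure are assumptions rather than this repository's actual code.

```python
import json
import sys

def handle_request(request: dict) -> dict | None:
    """Dispatch one JSON-RPC message. Notifications (no "id") get no response."""
    method = request.get("method")
    if method == "initialize":
        result = {
            "protocolVersion": "2024-11-05",
            "capabilities": {"tools": {}},
            "serverInfo": {"name": "mcp-from-scratch", "version": "0.1.0"},
        }
    elif method == "tools/list":
        result = {"tools": []}  # tool descriptors would be listed here
    elif method == "tools/call":
        result = {"content": [{"type": "text", "text": "..."}]}
    else:
        result = {}  # a real server would return a JSON-RPC error here
    if "id" not in request:  # e.g. notifications/initialized
        return None
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

def main() -> None:
    # The stdio transport is newline-delimited: one JSON-RPC message per line.
    for line in sys.stdin:
        line = line.strip()
        if not line:
            continue
        response = handle_request(json.loads(line))
        if response is not None:
            sys.stdout.write(json.dumps(response) + "\n")
            sys.stdout.flush()

if __name__ == "__main__":
    main()
```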
Quick Start
Running the MCP Server
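Assuming the server entry point is a script such as server.py (a placeholder name, adjust to your clone), you can smoke-test it without any client library by piping newline-delimited JSON-RPC over its stdin/stdout:

```python
import json
import subprocess

# Hypothetical entry point; point this at the actual server script.
proc = subprocess.Popen(
    ["python", "server.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.1.0"},
    },
}
proc.stdin.write(json.dumps(initialize) + "\n")
proc.stdin.flush()
print(proc.stdout.readline())  # the server's initialize response
proc.terminate()
```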
Integrating with Models
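The bundled Hugging Face, Ollama, and OpenAI clients connect a model to the server programmatically. As a rough sketch of that pattern, assuming the official mcp Python SDK is installed and the server script is server.py (both assumptions, not necessarily what this repository uses):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the from-scratch server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # discover greeting, read_file, ...
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("greeting", {})  # a model's tool call goes here
            print(result)

asyncio.run(main())
```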
Claude Desktop Config
For Claude Desktop, create or edit the config file:
macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
Windows:
%APPDATA%\Claude\claude_desktop_config.json
Add the following configuration to claude_desktop_config.json:
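For example (the server name key and the path to the server script below are placeholders; point them at your local clone):

```json
{
  "mcpServers": {
    "mcp-from-scratch": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```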
Demo:
https://github.com/user-attachments/assets/d7e21bb3-bc6d-4b9b-8b7d-ed90f7f004fd
Available Tools
Current Tools
greeting: Returns a greeting message
read_file: Read contents of a file within allowed paths
write_file: Write content to files
list_directory: List files and folders in a directory
create_directory: Create a new directory
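As a hypothetical illustration of the general pattern (the names and the allowed-path check are assumptions, not this repository's exact code), a hand-rolled server can keep a registry mapping tool names to handlers and route tools/call requests through it:

```python
import json
from pathlib import Path

ALLOWED_ROOT = Path.home() / "mcp-workspace"  # hypothetical allowed path

def greeting() -> str:
    return "Hello from the MCP server!"

def read_file(path: str) -> str:
    # Resolve the path and refuse anything outside the allowed root.
    target = (ALLOWED_ROOT / path).resolve()
    if not target.is_relative_to(ALLOWED_ROOT.resolve()):
        raise PermissionError(f"{path} is outside the allowed paths")
    return target.read_text()

def list_directory(path: str = ".") -> str:
    target = (ALLOWED_ROOT / path).resolve()
    return json.dumps(sorted(p.name for p in target.iterdir()))

# Registry consulted by the tools/list and tools/call handlers.
TOOLS = {
    "greeting": greeting,
    "read_file": read_file,
    "list_directory": list_directory,
}

def call_tool(name: str, arguments: dict) -> dict:
    """Build a tools/call result in the MCP content format."""
    text = TOOLS[name](**arguments)
    return {"content": [{"type": "text", "text": text}]}
```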