# MCP File Reader
Integrates with Ollama's local LLM runtime to provide AI-powered summarization of file contents, enabling offline processing of text files through the Mistral model without requiring cloud APIs or internet connectivity.
1. Click **Install Server**.
2. Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
3. In the chat, type `@` followed by the MCP server name and your instructions, e.g. `@MCP File Reader summarize the project_ideas.txt file in my_files`.

That's it! The server will respond to your query, and you can continue using it as needed.
A fully local, free, AI-powered file reader that uses MCP (Model Context Protocol) to connect a Python tool server with a local LLM (Ollama + Mistral). No API keys, no cloud, no cost: it runs entirely on your machine.
## Features

- MCP server with custom tools (`list_files`, `read_file`)
- Local LLM integration via Ollama (Mistral / Llama3.2)
- Automatic file discovery and content reading
- AI-powered summarization of file contents
- Path traversal protection (security built in)
- Zero cost: no API keys or subscriptions required
## Tech Stack

| Technology | Description |
|------------|-------------|
| Python | Core language for server and client |
| MCP | Model Context Protocol (tool server) |
| Ollama | Local LLM runtime (free, offline) |
| Mistral | Local AI model for summarization |
| asyncio | Async communication between client and server |
| requests | HTTP calls to the Ollama API |
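Talking to Mistral through Ollama is plain HTTP via `requests`. A minimal sketch, assuming Ollama's standard non-streaming `/api/generate` endpoint on its default port; the prompt wording and timeout here are made up for illustration:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(model: str, prompt: str) -> dict:
    # "stream": False makes Ollama return the whole completion as one JSON object
    return {"model": model, "prompt": prompt, "stream": False}


def summarize(text: str, model: str = "mistral") -> str:
    """Ask the local model for a short summary of the given file contents."""
    prompt = f"Summarize the following file contents in 2-3 sentences:\n\n{text}"
    resp = requests.post(OLLAMA_URL, json=build_payload(model, prompt), timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]  # Ollama returns the generated text under "response"
```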
## Installation

```bash
git clone https://github.com/JaneKarunyaJ/MCP-File-Reader.git
cd MCP-File-Reader
pip install mcp requests
```

Install Ollama from https://ollama.com, then pull the model:

```bash
ollama pull mistral
```

## Usage
Make sure Ollama is running (it starts automatically after installation), then:
```bash
python client.py
```

The client will:

1. Launch the MCP server as a subprocess
2. Call `list_files` to discover files in `my_files/`
3. Call `read_file` for each file found
4. Send the real content to Mistral for summarization
## Project Structure

```
MCP-File-Reader/
│
├── server.py          # MCP server: exposes list_files and read_file tools
├── client.py          # MCP client: calls tools and queries Ollama
├── requirements.txt   # Python dependencies
└── my_files/          # Folder the AI is allowed to read
    ├── project_ideas.txt
    └── wishlist.txt
```

## Security
- The MCP server only allows reads from the `my_files/` directory
- Path traversal attacks (e.g. `../../etc/passwd`) are automatically blocked
- No data leaves your machine: fully offline after setup
## How It Works

```
client.py
│
├── Step 1: Calls MCP tool → list_files()
│        ↓
│   Returns filenames from my_files/
│
├── Step 2: Calls MCP tool → read_file(filename)
│        ↓
│   Returns actual file contents
│
└── Step 3: Sends real content to Ollama (Mistral)
         ↓
    Returns AI summary
```

## Extending the Project
- **Add your own files:** Drop any `.txt` file into `my_files/` and run again
- **Add new tools:** Add a new tool handler in `server.py` (e.g. `search_in_file`, `write_file`)
- **Change the question:** Edit `user_question` in `client.py` to ask anything about your files
- **Swap the model:** Change `MODEL = "mistral"` in `client.py` to any model you have pulled in Ollama
## Requirements

- Python 3.9+
- Ollama installed (https://ollama.com)
- Mistral model pulled (`ollama pull mistral`)
- `mcp` and `requests` Python packages